Joshua Moon, the owner of Kiwi Farms
Josh wrote this long message on Telegram:
The restore has been a nightmare, complicated by the fact I am mid-packing for my trip in December before I return to the US. The TL;DR is that the import is running and I think I've sorted out any issues preventing it from restoring this time.

The Kiwi Farms' raw backup file is over 60GiB of pure text. Trying to restore this under any conditions is a slow and painful process. Our situation is complicated manifold by having to run our own hardware, which I am solely responsible for, from approximately 6,000 miles away.

At the beginning of the year, our original server died while being handled by remote hands. Somehow, the motherboard died. I've known people who've spent 20 years handling servers who have never seen a motherboard die. People refused to believe me when I said that was the actual problem.

We bought a new server off eBay and moved all the regular SATA SSDs over to it. These are very high capacity SSDs that are raided for extra redundancy, because if I need one video of Destiny gobbling cock, I need it four times. This highly resilient setup has a trade-off: much higher read speeds, much slower write speeds.

This server (which identifies its mainboard as the H12DSi-N6) came off eBay with two different sized M.2 SSDs (they are fast). We did not raid these because raiding NVMe actually slows it down. Instead, the smaller was made a boot drive, and the larger was made a MySQL drive. This worked great.

The MySQL drive melted within the first month, marking the latest victim of the Kiwi Farms hardware curse. There are years of history of drives failing spontaneously and I swear to fucking god it's not my fault.

In the emergency, I then reimported the database to the storage RAID. I made the decision that putting an intensive process on the boot drive was a bad idea. This worked OK for a while, but eventually I had to start putting in fixes to reduce I/O. Having a very read-optimized RAID sharing work with the write-heavy database caused problems. I also had to adjust the database to write less - the trade-off being that if there's a crash, there's more data loss if you write less.

You can probably detect where this is going. Add in several thousand people trying to download a video of a goblin sucking dick along with a database already struggling to handle normal traffic, and you've got the potential energy for a disaster. A table locked up (the alerts table, which is millions of rows long) and, in trying to get it to unlock, the InnoDB structure completely collapsed.

After trying to fix it, I decided I would re-import it. This is not an easy decision to come to. The import is very, very slow because you're writing hundreds of millions of rows into a slow RAID. At a certain point, the import crashed. I became impatient and tried again to import, this time into the boot NVMe, thinking it would be temporary until I could replace the NVMe I should have replaced months ago. Attempt #2 didn't work. The database is now so big it does not fit on that drive. Attempt #3 was attempting again to import on the slow RAID. That didn't work. I fixed something and now we're on Attempt #4. Each of these attempts is at least two hours of time.

As it turns out, the motherboard has four M.2 slots, when I thought it had only two (one spare). I misremembered because it only came with two. I could have fixed this with zero downtime months ago but I never did, in part because I've never actually seen this server, and not knowing what I have is a huge detriment to being able to proactively solve problems.

I very much look forward to having a server in driving distance at some point in my near future, but even that has its own slew of problems because I have no idea how I'm going to move it to the east coast without 2 continuous days of downtime, and I really don't want to spend 4 to 5 figures on a new one just to avoid that.

Computers suck. Become an electrician or plumber. Do not become a computer person.
TLDR - He fucked up. Expect another "upgrading the forum!" merch run.
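
The "write less, risk losing more in a crash" adjustment and the painfully slow re-import Josh describes both come down to a handful of standard MySQL/InnoDB settings. A minimal sketch of those knobs, assuming a stock MySQL or MariaDB server and the mysql-connector-python client; the values and connection details are illustrative guesses, not Kiwi Farms' actual configuration:

    # Illustrative only: the durability/throughput trade-offs described in the
    # post above, applied via mysql-connector-python. Not Josh's real settings.
    import mysql.connector

    conn = mysql.connector.connect(host="127.0.0.1", user="root", password="...")
    cur = conn.cursor()

    # Durability vs. write load: the default (1) flushes and syncs the redo log
    # on every commit. A value of 2 syncs roughly once per second instead, which
    # cuts write I/O sharply but can lose up to ~1 second of commits in a crash.
    cur.execute("SET GLOBAL innodb_flush_log_at_trx_commit = 2")

    # Same trade-off for the binary log: 0 leaves syncing to the OS.
    cur.execute("SET GLOBAL sync_binlog = 0")

    # During a huge dump import, constraint checking is commonly disabled for
    # the importing session so rows can be written in bulk and verified later.
    cur.execute("SET SESSION foreign_key_checks = 0")
    cur.execute("SET SESSION unique_checks = 0")
    cur.execute("SET SESSION autocommit = 0")

    cur.close()
    conn.close()

None of this fixes failing hardware; it just trades crash safety and integrity checks for throughput, which is exactly the corner the post says the site got painted into.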
 
Somehow, the motherboard died. I've known people who've spent 20 years handling servers who have never seen a motherboard die.
>When you're such a failure at life that you manage to break something no one else has broken
We bought a new server off eBay

the larger was made a MySQL drive. This worked great.

The MySQL drive melted within the first month
:story:
Only fucking Josh would have the kind of Midas touch where everything fucks up in a short amount of time.
As it turns out, the motherboard has four M.2 slots, when I thought it had only two (one spare). I misremembered because it only came with two. I could have fixed this with zero downtime months ago but I never did, in part because I've never actually seen this server, and not knowing what I have is a huge detriment to being able to proactively solve problems.
>not knowing what I have is a huge detriment to being able to proactively solve problems.
OH GEE WHIZ, YOU THINK!? Maybe instead of rushing shit you should take your fucking time with these sorts of issues
Computers suck
no josh, YOU suck. You fucking suck so much with technology. You can't code for shit, you're fucking TERRIBLE at maintaining a server, yet you act like you're the smartest guy in the room. You make other techies look bad, and you set a bad example for other people trying to learn by talking down to them and making them feel like shit because they don't know any better.

Either get your shit together and actually LEARN computers properly, or hand the torch to someone else because it's clear you are completely checked out and all you can do is go "LOL OOPSIE DOOPSIE, I MADE A FUCKIE-WUCKIE!"

And I can guaran-fucking-tee you that this will lead to the inevitable "PLEASE GIVE ME MONEYZ FOR MY BEEP BOOPS SO I CAN MAKE SITE WORK BETTER!" scam like all the other ones.
 
At this point, every time KF goes down I know it's because Jocelyn is having ersatz menstrual cramps, which will always result in more e-begging.
 
These are very high capacity SSDs that are raided for extra redundancy, because if I need one video of Destiny gobbling cock, I need it four times.
Normal straight male Josh. Not an obese homosexual at all.
I liked the part when he shat on his site like an old pizza box so badly he broke it (again)!
 
He also forgot to add the part where Baked Alaska, Nick Fuentes, and various other alt-right figureheads encouraged their followers to attend the Jan 6 riots. This is significant because in all likelihood Nick cooperated with the feds and turned in some of his followers, as well as any personal information he had on the rest of them.
I'm aware of the theory that Nick Fuentes is likely a fed plant.

Seeing how Nick Fuentes is unable to plead the 5th whenever he talks on camera and posts on social media, I'm pretty sure he is a fed plant whose only purpose is to make the Right look bad and give the Left plenty of material to work with.

Nick Fuentes glows brighter than the Sun.
 
It's been my suspicion that, leading up to the 2016 presidential election, alt-right media accounts across various platforms were permitted to spread false information and right-wing conspiracies with relative impunity. I believe that the federal government, in cooperation with social media conglomerates, may have allowed this to occur intentionally, effectively allowing the more outlandish and fringe elements to rise to prominence - individuals like Nick Fuentes or those promoting QAnon. By doing so, the more extreme and absurd narratives were amplified.
 
How did I miss this :story: On your knees, boy!

Is he purposely not showing the post that got him banned on his KiwiNull account? His account with his cringe nazi larp profile picture?
Nigger had it coming. Turnabout is fair play.

I had nothing to do with this. It's just another example of karma playing out by happenstance every time he bans me. This time he gets to be waterboarded by actual spam.
 
I get the feeling Destiny is likely aware of what Jersh is doing. I'm no fan of Destiny (that guy is a trainwreck for a number of reasons, especially as he likes to see his wife getting plowed by different dudes) but Jersh is practically putting a 'Kick Me' sign on his back, given that most U.S. states and territories have since criminalized revenge porn.

Jersh is just asking for it. Once Destiny becomes aware that his private sex tapes are out there and that they originated from Kiwi Farms, he's likely going to contact a lawyer and notify authorities that the biggest stalker site on the Internet is posting revenge porn featuring him. If this results in legal action, it will be the biggest fail by a far-right website owner since Andrew Anglin (the owner of the Daily Stormer) told his readers to harass a Jewish woman in Montana, which resulted in him getting sued for damages over the call to action (Anglin lost the case by default and now owes that woman a lot of money).
Guess Destiny doesn't care but others in the videos still might.
 
What would be the point of that? I think it's more likely that speech wasn't being suppressed until after the 2016 election, and that's when the feds started collaborating in earnest with social media companies to ban speech they didn't like.
 