queersatanic:

We know you want to burn down capitalism.

But for today, just don’t answer your boss’s call off the clock.

We know you believe in ACAB and think they all should get the wall.

But for today, just don’t call 9-1-1 on the guy screaming outside of your apartment.

The memes are fun. The memes are aspirational and keep us reaching for the horizon.

But look down, too, at what actually is.

Endure the pains now, suffer the inconveniences now, knowing that real change will likely involve unpleasantness too.

The Revolution™ is fun to imagine and involves no pain. But the real world does involve pain, and you need to exercise now the muscles you’ll need for future work and opportunities.

(via anarcho-smarmyism)

9:07 pm  •  28 June 2023  •  16,964 notes

traycakes:

catchymemes:

image

I want to make this absolutely clear to kids: children didn’t use to be stuck inside the house like you are today. There used to be public places where you could hang out. It used to be fairly safe to walk around because trucks weren’t designed to kill children. You didn’t need a car to go anywhere, so kids without a license weren’t trapped. There weren’t 24/7 cable news networks constantly scaring parents with anecdotes even as crime sat at all-time lows and the biggest danger came from adults kids know, not strangers.

It’s easy to ignore old people talking about “the good ol’ days” because a lot of the people saying that shit are racist assholes, but the way society treats kids today really is objectively worse than how kids used to be treated. You deserve better, and you should know that better things are possible. We just need to kill the suburbs and for-profit news.

(via justaddfiction)

9:07 pm  •  28 June 2023  •  26,994 notes

bobolobocus:

liberalsarecool:

minmaneth:

bellybuttonblue2:

image

just sayin’

This should be taught in school.

image

(via justaddfiction)

9:06 pm  •  28 June 2023  •  44,828 notes

archipelagoofliterarynonsense:

image

I wanted to put a more positive spin on the popular skeleton leaving meme

(via justaddfiction)

9:06 pm  •  28 June 2023  •  79,518 notes

nudityandnerdery:

nerdygaymormon:

image

Well, you know what….

image

(via justaddfiction)

9:06 pm  •  28 June 2023  •  7,171 notes

sillyphantom:

bruh

image
image
image

(via anarcho-smarmyism)

9:05 pm  •  28 June 2023  •  15,268 notes

wigdevil:

image

If you can’t wash it off, paint over it, replace the item, or buff it out, turn a message of hate into one of love!

I would never condone doing this discreetly and in mere seconds with a quickly concealed permanent marker, for example on a public bench or at a bus stop. Certainly not anything like whipping out a tat machine and adding to an unconscious white supremacist’s existing tattoo. That would be illegal! :) And, dear followers, I would never encourage you to do something that’s illegal.

So please only use this when someone has defaced your personal property, to avoid breaking the law! Because that would be illegal, and following the law is always in everyone’s best interest. :)

…. :) reblogs and even reposts definitely welcome

(via justaddfiction)

9:05 pm  •  28 June 2023  •  35,693 notes

incel-rights-advocate-deactivat:

image

(via skygenders)

9:04 pm  •  28 June 2023  •  83,908 notes

transformationsproject:

Lots of Good News This Week!!

We are kicking off this week by celebrating good news from New Mexico and Washington State, where pro-trans legislation is advancing. Learn more below!

Good news! While the attack on transgender rights in the United States continues, we also saw progress on pro-trans legislation last week. These bills are examples of states protecting trans rights. Swipe through to learn more!
New Mexico: Governor Michelle Lujan Grisham signed SB13, putting into law her 2022 Executive Order 123. This ensures protections for gender-affirming care and reproductive health by prohibiting entities within New Mexico from sharing patient information with any out-of-state organization.
Washington: The Washington State House of Representatives has passed HB1469, which now heads to the Governor to be signed. This law would establish Washington as another trans refuge state, protecting trans people fleeing their home states due to legal concerns.
What you can do: To stay up to date on legislative victories, sign up for our weekly newsletter, delivered to your inbox every Friday. To learn more and contact your representatives, go to: transformationsproject.org

(via bisexualpositivity)

9:02 pm  •  28 June 2023  •  1,987 notes

inneskeeper:

astralikacastle:

inneskeeper:

el-shab-hussein:

vague-humanoid:

cyberglittter:

image

being a woman is fucking exhausting. everything is created to disgrace our lives. this is horrifying.

https://www.washingtonpost.com/technology/2020/10/20/deep-fake-nudes/


The website promises to make “men’s dreams come true.” Users upload a photo of a fully clothed woman of their choice, and in seconds, the site undresses them for free. With that one feature, it has exploded into one of the most popular “deepfake” tools ever created.


Far more advanced than the now-defunct “DeepNude” app that went viral in 2019, this new site has amassed more than 38 million hits since the start of this year, and has become an open secret in misogynist corners of the web. (HuffPost is not naming the site in order to avoid directing further traffic to it.) It went offline briefly Monday after HuffPost reached out to its original web host provider, IP Volume Inc., which quickly terminated its hosting services. But the site was back up less than a day later with a new host — as is often the case with abusive websites.

Hany Farid, a computer scientist at UC-Berkeley who specializes in digital-image forensics and was not involved in the original pix2pix research, said the fake-nude system also highlights how the male homogeneity of AI research has often left women to deal with its darker side.

AI researchers, he said, have long embraced a naive techno-utopian worldview that is hard to justify anymore, by openly publishing unregulated tools without considering how they could be misused in the real world.

“It’s just another way people have found to weaponize technology against women. Once this stuff gets online, that’s it. Every potential boyfriend or girlfriend, your employer, your family, may end up seeing it,” Farid said. “It’s awful, and women are getting the brunt of it.”

“Would a lab not dominated by men have been so cavalier and so careless about the risks?” he added. “Would [AI researchers] be so cavalier if that bad [stuff] was happening to them, as opposed to some woman down the street?”

I don’t wanna tell you this is Not deeply wrong. It’s deeply wrong.

I just, people have been manually photoshopping celebrities onto nudes and writing porn about random real-life people for a good long while now. This isn’t new; it just makes it near-effortless for any bad actor.

That’s an important part of the anti-AI movement, actually, so genuinely thanks for bringing it up!

Because you’re right! I’ve seen a lot of people defend AI against criticism of stuff like the use of deepfakes because “we’ve been creating misinformation since the days of film!” They say that yes, misinfo is bad, but it’s important to remember that we’ve been doing the stuff AI is currently able to do for decades and longer. It’s a new tool in the toolbox, as it were, but the base situation isn’t changed: you need to fact-check, cite sources, and remember that people and governments lie on the internet.

The problem with treating this argument as something that negates anti-AI criticism is that it doesn’t take into account a very important factor:

Ease of use.

Sure, we’ve been doctoring photos since film and darkrooms were still a thing. We’ve been photoshopping celebrity faces onto naked bodies since we’ve had Photoshop. Revenge porn exists, and fake revenge porn does too! There are altered photos of women hugging cops that never needed DALL-E’s existence to happen. All of this is true.

But before AI as it is now, you had to actually be really good at doctoring images to spread convincing misinformation. Bad and mediocre Photoshop jobs are extremely easy to notice, and the common ways of hiding a bad Photoshop job are also well known, so when they’re used, the image is treated with suspicion. Artifacting, artificial blur, and low-pixel-count, low-quality images have been God’s gift to cryptids since we first saw Bigfoot walking that way. In recent years we’ve even gotten to be creative and combine that with 3D animation to create videos of that sort of thing! Neat!

Now all I need to do to create and spread extremely convincing misinformation is type “Vladimir Putin and Bernie Sanders shaking hands” into Stable Diffusion. Now all I need to do to make porn of literally any woman to ever exist on the internet is use the new model the report above talks about. I don’t need to spend hours working on a single image (nor the months and years it takes to learn to do so well and efficiently). I can generate dozens and dozens in a couple of hours, all just by clicking a button or two, writing alt text for an image that never existed, or uploading a photo or two.

The problem right now with AI ethics isn’t that people are able to plagiarize and steal from artists, nor that people are able to design misinformation via doctored images.

The problem right now is: “Now everyone can do it extremely quickly, with zero effort and nearly no oversight, legal or otherwise, to stop them if they want.”

(via anarcho-smarmyism)

9:01 pm  •  28 June 2023  •  35,795 notes