- cross-posted to:
- opensource@lemmy.ml
- technology@beehaw.org
The fascist regime and power/police abuse have started.
P.S.: It seems like the US is becoming similar to Russia: a kleptocratic country with organised crime in government.
to be fair for black Americans that is a centuries old tune
Oh, you’re right
Most minorities — it’s the middle- to upper-class, straight, able-bodied white people who are oblivious to it all.
Don’t worry, their already bad situation will get worse too.
Every step unchallenged is an invitation to do more.
I wasn’t thinking of downloading an AI onto my low tier computer until now.
I’ve got a laptop kicking around from 2010 that’s about to get deepseek just because they’re proposing this dumb ass shit. I don’t even use Gen AI.
Finally affordable housing!
same lmao
Print the code in a book and mail it.
Surely, they cannot ban books… right? Right?
Edit: Wait wait wait… the Comstock Act says mail cannot be used for anything that can be used for abortion. And an AI can theoretically be used to get instructions for abortion. BOOM, it’s banned! 👀
Wow, bold choice to ban the import of technology and knowledge. Usually governments are worried about export, so it doesn’t fall into the wrong hands.
Btw, how is the Nvidia stock price doing?
Right? Like, seriously, we all know somebody is just butthurt because their stock options tanked.
Oh, wait, I’m sorry! That was very unpatriotic of me, wasn’t it? I mean, we all know that winning an election guarantees being heavily rewarded with insider trading, right? It’s not like they’re there to represent constituents or anything; I mean, doesn’t everyone know we’re a republic, not a democracy?!
Sigh…
To be fair, this is common practice. Countries do this all the time to protect their economies. The best-known example in the West is China, which has banned many US services.
Of course, the security of citizens’ data is also a factor. You don’t want foreign countries using this data to interfere in any way.
Honestly, I don’t think this is common practice in non-oppressive countries. Sure, it happens in North Korea, Iran, China… But I’m relatively free to consume what I want, with a few minor exceptions. For example, we don’t import food that isn’t food-safe by our standards, regardless of whether it’s common practice to eat it elsewhere. Food may also be barred from entering the country due to laws on animal cruelty. Similar things apply to electronic devices that aren’t up to code. And a select few things are banned altogether: you can’t have them, and nobody can import them.

Other than that, regulations aren’t super strict. I can use all the American social media platforms despite them stealing my personal data and regularly violating European privacy laws, and I can use Russian or Chinese websites… I think I live in a free country.
Helping the domestic economy is done with tariffs / import taxes, not by banning things and putting people in jail.
And mind that this isn’t about the service that collects your data and gives it to the Chinese government. This is about downloading the model file and using it all by yourself. So no data gets transferred to a foreign country. And it’s not because people could get harmed or anything. This is just because the vice president doesn’t want it personally. Like in some dictatorship. Otherwise they would have banned transferring data into foreign countries, if that’s what it’s about. But they didn’t do that, because it’s not about protecting the people.
Or did I miss something and there are other examples for limitations on import?
No, I think you did not miss anything 😇
Good summary
now i gotta download something i don’t even wanna download.
Yup. Downloaded 7b, 32b, and 70b varieties this afternoon. Entirely out of spite.
Since those smaller models are technically fine-tunes of Meta/Facebook’s LLAMA, using Deepseek’s outputs, I wonder if they would be covered by the bill at all.
7b and 32b are Qwen2 🙃
I literally just did the same
For the base model:

```
git lfs install
git clone https://huggingface.co/deepseek-ai/DeepSeek-V3-Base
```

For the chat model:

```
git lfs install
git clone https://huggingface.co/deepseek-ai/DeepSeek-V3
```
this is deepseek-v3. deepseek-r1 is the model that got all the media hype: https://huggingface.co/deepseek-ai/DeepSeek-R1
Yeah, comment OP needs to edit the links, given how many upvotes that got.
Can you elaborate on the differences?
Base models are general purpose language models, mainly useful for AI researchers and people who want to build on top of them.
Instruct or chat models are chatbots. They are made by fine-tuning base models.
The V3 models linked by OP are Deepseek’s non-reasoning models, similar to Claude or GPT-4o. These are the “normal” chatbots that reply with whatever first comes to mind. Deepseek also has a reasoning model, R1. Such models take time to “think” before giving their final answer; they tend to perform better on things like math problems, at the cost of being slower to answer.
It should be mentioned that you probably won’t be able to run these models yourself unless you have a data center style rig with 4-5 GPUs. The Deepseek V3 and R1 models are chonky beasts. There are smaller “distilled” forms of R1 that are possible to run locally, though.
I heard people saying they could run the R1 32B model on moderate gaming hardware, albeit slowly.
32b is still distilled. The full one is 671b.
I know, but the fall off in performance isn’t supposed to be severe
You are correct. And yes that is kinda the whole point of the distilled models.
I know. Lmao
My legion slim 5 14" can run it not too bad.
https://www.deepseekv3.com/en/download
I was assuming one was pre-trained and one wasn’t but don’t think that’s correct and don’t care enough to investigate further.
Is that website legit? I’ve only ever seen https://www.deepseek.com/
And I would personally recommend downloading from HuggingFace or Ollama
r1 is lightweight and optimized for local environments on a home PC. It’s supposed to be pretty good at programming and logic and kinda awkward at conversation.
v3 is powerful and meant to run on cloud servers. It’s supposed to make for some pretty convincing conversations.
R1 isn’t really runnable with a home rig. You might be able to run a distilled version of the model though!
Tell that to my home rig currently running the 671b model…
That likely is one of the distilled versions I’m talking about. R1 is 720 GB, and wouldn’t even fit into memory on a normal computer. Heck, even the 1.58-bit quant is 131GB, which is outside the range of a normal desktop PC.
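Those figures line up with a rough back-of-the-envelope estimate (weights only; this hypothetical helper ignores activation/KV-cache overhead and any layers kept at higher precision):

```python
def est_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Rough weight-only memory estimate in GB: params * bits / 8 bytes."""
    return params_billion * bits_per_weight / 8

# Full R1 has ~671B parameters.
print(est_weight_gb(671, 8))     # ~671 GB at 8-bit
print(est_weight_gb(671, 1.58))  # ~132 GB at the 1.58-bit quant
```

So even the most aggressive quant stays well outside normal desktop RAM.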
But I’m sure you know what version you’re running better than I do, so I’m not going to bother guessing.
It’s not. I can run the 2.51bit quant
You must have a lot of memory, sounds like a lot of fun!
You’re absolutely right, I wasn’t trying to get that in-depth, which is why I said “lightweight and optimized,” instead of “when using a distilled version” because that raises more questions than it answers. But I probably overgeneralized by making it a blanket statement like that.
Hawley’s statement called DeepSeek “a data-harvesting, low-cost AI model that sparked international concern and sent American technology stocks plummeting.”
data-harvesting
???
It runs offline… using open-source software that provably does not collect or transmit any data…
It is low-cost and out-competes American technology, though, true
sent American technology stocks plummeting
Oh yeah, thats what did it, totally
You don’t fuck with the big man money tbh… That’s like rule 1 of the game.
I’m gonna download it even harder.
See you in hell, evildoer!
This is astounding.
I mean, not the Deepseek or jailing stuff. I mean a Senator actually proposing a law. I thought the way our government worked was, the annoying orange declares a vague uncited threat to be bad, and signs an executive order on it!
No, we also allow mega corporations to submit bills that get rubber stamped by a rep somewhere. I don’t think a corporation would be so audacious as to submit this, so it’s a rare case of original content.
That’s awesome! I didn’t know you could download an LLM and run it locally! That’s what I’m really interested in: something that’s on my side and not a conduit to Google, MS, or others.
I’m so glad Hawley proposed this bill or I wouldn’t have known that deepseek was open source and downloadable! I’ll have to go look for a download.
Ollama makes it pretty easy, and there are other runners as well. Good luck!
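For reference, a minimal sketch of the Ollama route (the exact model tags are an assumption; check Ollama’s model library for what’s currently published):

```shell
# Install Ollama (Linux one-liner; see ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a distilled R1 variant sized for consumer hardware
ollama pull deepseek-r1:7b

# Chat with it locally; nothing leaves your machine
ollama run deepseek-r1:7b
```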
AFAICT it’s not open source, just open weights.
How to download/run an uncensored LLM locally:
Downloading the model and running it locally is the most secure and privacy-friendly way to use it.
It’s absurd how little they know about what they are doing.
I doubt they understand local vs server distinction.
“Server is when we ask Amazon to build a backdoor, local is when we ask Microsoft”
And that’s exactly why they want to stop it
Nah, Congress (esp the Senate) is a bunch of old people yelling at clouds, and sometimes they yell the same thing. Don’t give them too much credit.
deleted by creator
It’s easy to run a distilled version of the R1 model locally. It’s very difficult to run the full version. Min $6k to get 7 tokens per second.
Here’s one for $2k if you don’t mind jank (edit: and 3-4 tokens :) )
https://digitalspaceport.com/how-to-run-deepseek-r1-671b-fully-locally-on-2000-epyc-rig/
I hear it’s easy, but I’ve had no luck at all with even the most distilled models (for preliminary testing), and am wondering how things have broken so badly.
Land of the free
Land of the free only refers to owning guns.
Yes, the ban on TikTok is working! We’re getting more and more freer!!! The kids will be saved!!! /s /s /s
“Victory for free speech (as long as it means only we get to talk)!” /s
I wasn’t gonna, but now I gotta…
You laugh, but stay safe