You can run a 7B model on CPU really fast, even on a phone.
Only as mouse or shooter games
The elite is deleting all evidence, but he saw a YouTube video from a random guy. That should be proof enough.
Fewer and fewer phones are supported until it vanishes. That's what I meant. More and more phones are unrootable, really hard to root, or have no proper CFW.
Correct. My phone keyboard mangled my writing lol
I like LineageOS, but I thought it's about to disappear?
Doesn't work anyway lol
You need to learn the difference between opinion and fact then.
Still, it's not implemented in A1111, Fooocus, or ComfyUI. I acknowledge it exists and certainly works, but it wasn't adopted by any tool I know of.
That's almost a year old, and their GitHub is two years old. Also, I did train multiple LoRAs using only AI-generated images on SD 1.5 and SDXL without any issue.
Screenshot, and it's gone.
That's not how it works at all.
Not really. Check Midjourney v6 generated images. I found many images which look indistinguishable from real photos. So I don't see why image generation should get worse. What matters is the dataset, and only the dataset. It doesn't matter if the model is trained on AI images, as long as the dataset is good.
The RTX 3060 is loved so much for a reason.
Bosch
Sooo how long until there's a plugin for A1111?
That's what I said lol
OK, I have to correct myself: they walked this back two weeks ago due to backlash, but I doubt they won't do it in at least a similar or hidden way, like they did with reducing power on older devices to "save battery". https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
Yeah, that's why they look through your images for "cp".
Even if it plateaus: the same was said about Moore's law, which held up way longer than expected. There are so many ways to improve this. The open-source community is getting to the point where you can actually run decent models on normal consumer hardware (talking about 70–120B models).