• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: November 12th, 2024




  • For Adobe Acrobat, well, it’s an Adobe product, and those are famously incompatible with Linux. You’ll have to find a replacement for it or spin up a Windows virtual machine to run it.

    For Cyberpunk, where did you get it and what have you tried? Also, consider volunteering any other information that might be helpful (distro, hardware, etc.); I don’t think either of us wants to play 20 Questions to troubleshoot this.





  • Use dd! It’s a tool that copies the contents of anything bit-for-bit to anywhere else. First, you’ll need to boot into a live USB of any distro. Then, with both drives plugged in, run something like dd if=/path/to/source/drive of=/path/to/output/drive bs=4M status=progress. You can get the paths of each drive by running lsblk, and they’ll look something like /dev/sda or /dev/nvme0n1. (For a full-drive clone, use the whole device, not a single partition like /dev/sda1. And be very careful with dd: whatever you put as the output drive will be irreversibly overwritten with the contents of the input drive.)
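    If you want to try the workflow without touching real drives first, here’s a minimal sketch that uses ordinary files as stand-ins for the source and output devices (the filenames source.img and clone.img are made up for the example; the dd invocation is the same one you’d use on real block devices):

    ```shell
    # Make a 1 MiB dummy "source drive" filled with random data -- a safe stand-in
    dd if=/dev/urandom of=source.img bs=1M count=1

    # Clone it bit-for-bit, exactly as you would with /dev/sdX devices
    dd if=source.img of=clone.img bs=4M status=progress

    # Verify the copy is byte-identical to the original
    cmp source.img clone.img && echo "clone verified"
    ```

    Once that makes sense, swap the filenames for the device paths lsblk gives you, double-checking which one is which before you press enter.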






  • I think that, while LLMs are technically an option for data storage, they’re not worth the effort. Sure, they might have a very wide breadth of information that would be hard to gather manually, but how can you be sure that the information you’re getting is a faithful replica of the source, or that the source it was trained on was good in the first place? A piece of information could come from either 4chan or Wikipedia, and unless you had the sources yourself to confirm (in which case, why use the LLM at all?), you’d have no way of telling which it came from.

    Aside from that, just getting the information out of it would be a challenge, at least for the hardware of today and the near future. Running a model large enough to have a useful amount of world knowledge requires some pretty substantial hardware if you want any useful amount of speed, and with rising hardware costs, that might not be possible for most people even years from now. Even on the software side, getting inference engines working on newer, unsupported hardware and drivers can be difficult if something goes wrong.

    So sure, maybe as an afterthought if you happen to have some extra space on your drives and oodles of spare RAM, but I doubt that it’d be worth thinking that much about.