Hacker News

I understand no one wants to deal with unexpected issues, especially when you're trying to get something done. Why not use virtual environments? You keep packages out of your system Python, and if you run into issues like the ones above you can just recreate the environment:

  b5n:~/venv_dir$ python3 -m venv im_a_venv
  b5n:~/venv_dir$ . im_a_venv/bin/activate
  (im_a_venv)b5n:~/venv_dir$ pip install requests
  (im_a_venv)b5n:~/venv_dir$ pip freeze > requirements.txt
  (im_a_venv)b5n:~/venv_dir$ deactivate

  b5n:~/venv_dir$ python3 -m venv another_venv
  b5n:~/venv_dir$ . another_venv/bin/activate
  (another_venv)b5n:~/venv_dir$ pip install -r requirements.txt
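The same create-and-recreate workflow is also scriptable from Python itself. A minimal sketch using the stdlib `venv` module (the directory name `im_a_venv` is just the example name from above; `with_pip=False` is used here only to keep creation fast and offline):

```python
import tempfile
from pathlib import Path
import venv

# Create a throwaway virtual environment programmatically,
# roughly equivalent to `python3 -m venv im_a_venv`.
tmp = Path(tempfile.mkdtemp())
env_dir = tmp / "im_a_venv"
venv.create(env_dir, with_pip=False)  # with_pip=False: skip bootstrapping pip

# Every venv carries a pyvenv.cfg marker file at its root.
print((env_dir / "pyvenv.cfg").exists())  # True
```

Because the whole environment lives in one directory, "recreating it" really is just deleting that directory and running the create step again.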


The reason I use Linux and Perl (or sometimes bash) is that it just works; I get the fewest surprises. Dependencies in Perl are handled by my OS the same as everything else, so I don't need to maintain multiple package managers. I just bash out the script and move on.

I don't write software as an end goal, I write software to accomplish my end goal.


While experienced Python devs will say "yeah, that's how you should do it," the problem is that Python doesn't use virtualenvs by default. Compare Rust or JS, where installs are project-local by default and it's the global install you have to ask for explicitly.
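That default is also why it's easy to not notice which interpreter you're in. One way to check, using only the stdlib (the helper name `in_virtualenv` is mine, not a standard API): inside a venv, `sys.prefix` points at the environment while `sys.base_prefix` still points at the base interpreter; outside one, the two match.

```python
import sys

def in_virtualenv() -> bool:
    # In a virtual environment, sys.prefix is redirected to the venv
    # directory while sys.base_prefix keeps the base installation path.
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```

A guard like this at the top of a script can fail fast with a hint to activate an environment, rather than silently installing into the system Python.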



