Run DeepSeek Janus-Pro 7B on Windows: A Step-by-Step Installation Guide

1. Check System Requirements

  • Ensure your Windows PC has:
    • At least 16 GB of RAM.
    • A CUDA-capable NVIDIA GPU (strongly recommended; CPU-only inference will be very slow).
    • Sufficient free disk space for the model files (a 7B model in 16-bit precision is roughly 14 GB; quantized variants are smaller).

2. Install Python

  • Download and install the latest version of Python from the official Python website.
  • During installation, ensure you check the box that says "Add Python to PATH."
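  • To confirm that Python is installed and on your PATH, open a new Command Prompt and check the versions (any recent 3.x release should work):
python --version
pip --version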

3. Install Git

  • Download and install Git from the official Git website.
  • This will allow you to clone repositories from GitHub.
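  • You can verify the installation from Command Prompt or PowerShell:
git --version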

4. Install CUDA Toolkit (if using NVIDIA GPU)

  • Download and install the CUDA Toolkit from the NVIDIA website.
  • Follow the installation instructions specific to your system configuration.
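  • After installation, you can check that the GPU driver and toolkit are visible from the command line (nvidia-smi ships with the NVIDIA driver, nvcc with the CUDA Toolkit):
nvidia-smi
nvcc --version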

5. Create a Hugging Face Account

  • Go to Hugging Face and sign up for a free account if you don't already have one.
  • Log in to your account.
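  • Logging in on the website is enough to browse and download this public model; if you later need command-line access, you can authenticate once the huggingface_hub package is available (it is installed alongside transformers in step 8):
huggingface-cli login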

6. Clone the DeepSeek Janus-Pro Repository

  • Open Command Prompt or PowerShell.
  • Run the following command to clone the repository:
git clone https://huggingface.co/deepseek-ai/Janus-Pro-7B
  • Navigate into the cloned directory:
cd Janus-Pro-7B
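  • Note that Hugging Face stores the large weight files with Git LFS, so a plain clone may only fetch small pointer files. If you have Git LFS installed, you can pull the full files from inside the repository directory (or skip this and download the weights manually as described in step 9):
git lfs install
git lfs pull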

7. Set Up a Virtual Environment
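  • Create and activate a virtual environment inside the project folder so the model's dependencies stay isolated from your system Python. A minimal sketch using Python's built-in venv module (the activation line assumes PowerShell; in Command Prompt use venv\Scripts\activate.bat instead):
python -m venv venv
.\venv\Scripts\Activate.ps1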

8. Install Required Packages

  • Install the necessary dependencies using pip:
pip install transformers torch torchvision torchaudio
  • Janus-Pro is a PyTorch model, so the torch packages above cover the framework requirement; TensorFlow is not needed.
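  • If you want GPU acceleration, make sure you have a CUDA-enabled PyTorch build; on Windows this is typically installed from the PyTorch package index rather than PyPI. For example, assuming CUDA 12.1 (adjust the cu121 suffix to match your installed CUDA version):
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121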

9. Download Model Weights

  • Open the model page on Hugging Face: https://huggingface.co/deepseek-ai/Janus-Pro-7B.
  • Locate the Files and versions tab.
  • Click on the file(s) you need, such as the model weight files (for example .safetensors or .bin) and configuration files (config.json), and download them manually.
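  • As an alternative to clicking through the web interface, the huggingface_hub command-line tool (installed alongside transformers in step 8) can fetch the whole repository in one step; for example:
huggingface-cli download deepseek-ai/Janus-Pro-7B --local-dir Janus-Pro-7B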

10. Integrate the Model in Your Code

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights (downloaded from the Hub on first use).
# Depending on your transformers version, this model may also need trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained("deepseek-ai/Janus-Pro-7B")
tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/Janus-Pro-7B")
# If your GPU has enough memory, move the model to it, e.g. model = model.to("cuda")

11. Run the Model

  • To start using the model, execute your script with a prompt:

# Tokenize the prompt and move the input tensors to the same device as the model.
input_text = "Describe a futuristic city."
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
# Generate up to 100 new tokens and decode the output back into text.
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
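  • Save the code from steps 10 and 11 in a single script (for example run_janus.py, a name used here purely for illustration) and run it from the activated virtual environment:
python run_janus.py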

12. Monitor Performance

  • Use Task Manager (Performance tab) to watch CPU, RAM, and GPU usage while the model runs and confirm your system is not running out of memory.
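  • For a more detailed view of GPU memory and utilization, nvidia-smi (installed with the NVIDIA driver) can refresh in a loop; for example, every five seconds:
nvidia-smi -l 5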

13. Troubleshooting

  • If you encounter errors, read the messages printed in your terminal; they usually point to a missing package or an out-of-memory condition.
  • Make sure all dependencies installed correctly and that your hardware meets the requirements listed in step 1.
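  • A quick way to confirm the key pieces are in place is to check the installed versions and whether PyTorch can see your GPU:
pip show transformers torch
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"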

By following these steps, you should be able to successfully install and run the DeepSeek Janus-Pro 7B model on your Windows machine.

Additional Resources

  • To run it on a Mac, check this link.