wsl --install
wsl.exe -l -o
wsl.exe --install Ubuntu-22.04
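To confirm the distribution registered correctly, list your installed distributions and their state:

wsl -l -v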
curl -fsSL https://ollama.ai/install.sh | sh
ollama run mistral
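Ollama also serves a local REST API on port 11434 by default, so the same model can be queried outside the interactive prompt. A minimal sketch of a non-streaming request to the /api/generate endpoint (the prompt here is just a placeholder):

curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'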
Now you can type questions at the prompt and get the LLM's responses directly in the Windows PowerShell terminal.
Future work: run this as an API endpoint from Google Colab, inside VS Code, and through Ollama's own chat interface.
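As a starting point for that API-endpoint direction, the Ollama server can be bound to all interfaces instead of localhost so external tools can reach it. A sketch, assuming the default port 11434 and that your firewall permits the connection:

OLLAMA_HOST=0.0.0.0:11434 ollama serve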