Uses

A list of the stuff I use.

πŸ–₯️ Operating Systems #


πŸ‘¨πŸΎβ€πŸ’» IDEs & Code Editors #

  • Positron – A next-generation data science IDE from Posit and the natural evolution of RStudio.
  • RStudio – An IDE designed for R programming.
  • Zed Editor – A fast, collaborative code editor written in Rust.
  • VSCode – A powerful code editor with a rich extension ecosystem.

>_ Terminal & CLI Tools #

  • Vim – A highly customizable and efficient text editor.
  • Htop & Bpytop – Interactive system monitoring tools.
  • Nvtop – A real-time GPU usage monitor.
  • Bat – A cat alternative with syntax highlighting.
  • Starship – A minimal, customizable shell prompt.

🌐 Development & Deployment Tools #

  • Hugo – A fast static site generator.
  • Docker – A platform for containerizing applications.
  • GitLab – A DevOps platform for hosting repositories and CI/CD pipelines.

πŸ€– Desktop Applications #

  • Obsidian – A knowledge management and note-taking app using Markdown.
  • Zen Browser – A minimalist web browser based on Firefox.
  • Thunderbird – An open-source email client.
  • Ptyxis – A GNOME terminal emulator with first-class support for container workflows.
  • Buffer – A minimal editing space for all those things that don’t need keeping.

πŸ“± Mobile Applications #

  • Fountain – A podcasting app with value-for-value payments.
  • DAVx⁵ – Sync contacts and calendars with Nextcloud, OwnCloud, and other CalDAV/CardDAV services.
  • Termux – A terminal emulator with a Linux environment for Android.

☁️ Online Services & Self-Hosting #

  • Tailscale – A zero-config VPN for private networking.
  • Home Assistant – An open-source home automation platform.
  • SearXNG – A free and open-source metasearch engine that aggregates results from multiple search services.
  • Jellyfin – A self-hosted media streaming solution.
  • Audiobookshelf – A self-hosted audiobook and podcast server.
  • Nextcloud – A private cloud solution for file sharing and collaboration.
  • Proton Mail – An encrypted email service focused on privacy.
  • Bitwarden – An open-source password manager.
  • Stirling-PDF – A self-hosted PDF processing server.
  • Ollama – A local LLM (large language model) runner for AI applications.