## General notes on shadcn registry

For setting up the registry, I highly recommend going through the official shadcn guide, which does a great job of explaining the different steps. Check it out here: [Getting Started](https://ui.shadcn.com/docs/registry/getting-started).

### registry-item.json

- The `registry-item.json` schema is used to define your custom registry items.

- You can see the JSON Schema for `registry-item.json` [here](https://ui.shadcn.com/schema/registry-item.json).

- **registryDependencies**: Use this to declare other registry items your component, page, or hook depends on. If it's a shadcn/ui registry item, refer to it by name (e.g., `button`, `select`, `input`). For items from other registries, use the full URL (e.g., `https://hookcn.ouassim.tech/r/use-boolean`). See the combined example after this list.

- **devDependencies**: You can also specify development dependencies if needed.

- **files**: You can list multiple files in the `files` property. This is useful for pages that may have multiple components, utilities, or hooks.

  ```json
  {
    "$schema": "https://ui.shadcn.com/schema/registry-item.json",
    "name": "hello-world",
    "title": "Hello World",
    "type": "registry:block",
    "description": "A complex hello world component",
    "files": [
      {
        "path": "registry/hello-world/page.tsx",
        "type": "registry:page",
        "target": "app/hello/page.tsx"
      },
      {
        "path": "registry/hello-world/components/hello-world.tsx",
        "type": "registry:component"
      },
      {
        "path": "registry/hello-world/components/formatted-message.tsx",
        "type": "registry:component"
      },
      {
        "path": "registry/hello-world/hooks/use-hello.ts",
        "type": "registry:hook"
      },
      {
        "path": "registry/hello-world/lib/format-date.ts",
        "type": "registry:utils"
      },
      {
        "path": "registry/hello-world/hello.config.ts",
        "type": "registry:file",
        "target": "~/hello.config.ts"
      }
    ]
  }
  ```
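To see how those dependency fields fit together, here is a minimal sketch of a hook item that combines `dependencies`, `devDependencies`, and `registryDependencies` in one `registry-item.json`. The `use-fetch` name, the npm packages, and the file path are placeholders for illustration, not items from a real registry:

```json
{
  "$schema": "https://ui.shadcn.com/schema/registry-item.json",
  "name": "use-fetch",
  "title": "Use Fetch",
  "type": "registry:hook",
  "description": "A hook for fetching data (illustrative example).",
  "dependencies": ["zod"],
  "devDependencies": ["@types/node"],
  "registryDependencies": [
    "button",
    "https://hookcn.ouassim.tech/r/use-boolean"
  ],
  "files": [
    {
      "path": "registry/use-fetch/use-fetch.ts",
      "type": "registry:hook"
    }
  ]
}
```

When someone adds an item like this with the shadcn CLI, the npm packages are installed through the package manager and the referenced registry items are pulled in alongside the hook's own file.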
### tailwind.config.ts

- If you're placing registry components in a custom directory, update `tailwind.config.ts` to include that directory:

  ```ts
  export default {
    content: ["./registry/**/*.{js,ts,jsx,tsx}"],
  }
  ```

### build command

- Once you've added the build script to your `package.json`, running `pnpm run build` will, by default, look for `registry.json`, which contains the list of your registry items.

- If you want to specify a custom path instead of using the default `registry.json`, pass it as an argument:

  ```bash
  pnpm run build /registry/registry.json
  ```

- Output files for each item are generated in `public/r` by default. To change the output directory, use the `--output` option:

  ```bash
  pnpm run build --output /public/r/hooks
  ```

- More details on the build command can be found in the [shadcn build command docs](https://ui.shadcn.com/docs/cli#build).

### Imports

- Always use the `@/registry` path for imports:

  ```tsx
  import { HelloWorld } from "@/registry/hello-world/hello-world"
  ```

### Publish your registry

- To make your registry available to other developers, publish it by deploying your project to a public URL.

### next.config.ts

- When serving your registry components, your URLs will likely follow this pattern:

  ```txt
  https://[DOMAIN_NAME]/r/[NAME].json
  ```

- Typically, you place the registry item JSON files in the `public` directory, but instead of referencing full paths, you can simplify things with Next.js [redirects](https://nextjs.org/docs/app/api-reference/config/next-config-js/redirects). Here's an example from my [hookcn](https://github.com/strlrd-29/hookcn/blob/main/next.config.ts) project:

  ```ts
  import type { NextConfig } from "next"

  const nextConfig: NextConfig = {
    /* config options here */
    async redirects() {
      return [
        {
          source: "/r/:name((?!index\\.json|hooks/).*)",
          destination: "/r/hooks/:name.json",
          permanent: true,
          missing: [
            {
              type: "query",
              key: "_redirected",
              value: undefined,
            },
          ],
        },
      ]
    },
  }

  export default nextConfig
  ```

- This redirect ensures that any request to `/r/[NAME]` automatically resolves to the corresponding JSON file inside the `public` directory.

## Example repositories using the shadcn registry

Exploring real-world examples is the best way to see the shadcn registry in action. Here are some open-source projects that use it:

- [shadcn/ui](https://github.com/shadcn-ui/ui)
- [hookcn](https://github.com/strlrd-29/hookcn)
- [magicui](https://github.com/magicuidesign/magicui)

## Conclusion

Although the shadcn registry is still an experimental feature, it's definitely worth using if you need a shared component registry across multiple projects. Plus, shadcn is just awesome.

These are my general thoughts after building [hookcn](https://github.com/strlrd-29/hookcn) with it. Let me know what you think!

Peace ✌️.
What started as a casual look into logging quickly turned into a two-day deep dive. I found myself nerding out over logging best practices, structured logs, and how modern applications handle observability. That’s what led me to write this blog post, to share what I’ve learned about structured logging and how you can implement it in your [FastAPI](https://fastapi.tiangolo.com/) applications using structlog.\n\nIn this blog post, we’ll focus on structured logging, what it is, why it’s better than traditional logging, and how to set it up in FastAPI using structlog. We’ll go step by step, from installation to advanced configurations, ensuring that by the end of this post, you’ll have a solid structured logging setup for your FastAPI application.\n\n## Understanding Structured Logging\n\nTraditional logging often produces unstructured text, making it harder to parse and analyze logs at scale. Structured logging, on the other hand, formats logs as key-value pairs or JSON objects, making them machine-readable and easier to filter, search, and analyze.\n\nInstead of logging plain strings like this:\n\n```txt\n[INFO] User logged in: JohnDoe\n```\n\nStructured logging would store it in a structured format:\n\n```json\n{ \"level\": \"info\", \"event\": \"User logged in\", \"username\": \"JohnDoe\" }\n```\n\nThis approach makes logs more useful, especially when working with log aggregators, monitoring tools, or cloud services\n\n## Introducing structlog\n\nThe standard [logging](https://docs.python.org/3/library/logging.html) module in Python is powerful, but adding context to log messages can be tedious and inconsistent. structlog simplifies this by providing structured, contextual logging out of the box.\n\nConsider these log examples:\n\n```txt\nDEBUG:__main__:User example triggered deletion of hero 18bcb9d1-9ecd-4297-a7cc-591d0ededf7a\nINFO:__main__:Hero 18bcb9d1-9ecd-4297-a7cc-591d0ededf7a deleted successfully\n```\n\nWith structlog, we can achieve much more informative and structured logs:\n\n```json\n{\"timestamp\": \"2024-05-03T18:58:21.443989Z\", \"level\": \"debug\", \"event\": \"Triggering deletion of hero\", \"request_id\": \"97133231229d4af2a0588e468049261c\", \"hero_id\": \"18bcb9d1-9ecd-4297-a7cc-591d0ededf7a\", \"user_id\": \"example\"}\n{\"timestamp\": \"2024-05-03T18:58:21.509100Z\", \"level\": \"info\", \"event\": \"Hero deleted successfully\", \"request_id\": \"97133231229d4af2a0588e468049261c\", \"hero_id\": \"18bcb9d1-9ecd-4297-a7cc-591d0ededf7a\", \"user_id\": \"example\"}\n```\n\nThis structured approach makes it easy to track requests, correlate logs across services, and ingest logs into observability platforms.\n\nOne of the key features of structlog is [bound loggers](https://www.structlog.org/en/stable/bound-loggers.html). 
Instead of passing extra context with every log statement, you can bind contextual data once and reuse it:\n\n```python\nimport structlog\n\nlogger = structlog.get_logger().bind(service=\"user-service\", request_id=\"abc123\")\nlogger.info(\"User logged in\", username=\"JohnDoe\")\n```\n\nWhich outputs:\n\n```json\n{\n \"service\": \"user-service\",\n \"request_id\": \"abc123\",\n \"level\": \"info\",\n \"event\": \"User logged in\",\n \"username\": \"JohnDoe\"\n}\n```\n\nThis eliminates repetitive code and ensures every log contains the necessary context.\n\n## Integrating Structlog with FastAPI\n\nNow that we've covered the basics of structured logging and introduced structlog, it's time to dive into setting it up in a FastAPI application.\n\n### Install `structlog`\n\n```sh\npython -m pip install structlog\n```\n\n \n\n\n If you want pretty exceptions in development (you know you do!), additionally\n install either [Rich](https://github.com/Textualize/rich) or\n [better-exceptions](https://github.com/qix-/better-exceptions), Head to\n structlog's [installation\n guide](https://www.structlog.org/en/stable/getting-started.html#installation)\n for more info.\n\n\n### Settings\n\nTo make your logging configuration flexible, extend your settings with logging parameters using `pydantic-settings`.\n\n```py title=\"config.py\"\nfrom enum import StrEnum\n\nfrom pydantic_settings import BaseSettings, SettingsConfigDict\n\nclass Environment(StrEnum):\n development = \"development\"\n production = \"production\"\n # testing = \"testing\" - I don't write tests that's why this is commented out...\n\nclass Settings(BaseSettings):\n ENV: Environment = Environment.development\n DEBUG: bool = False\n LOG_LEVEL: str = \"DEBUG\"\n```\n\n### Configuring the logger\n\nThe core of our logging setup revolves around Structlog, which uses [processors](https://www.structlog.org/en/stable/processors.html) to format, filter, and enrich log entries. Processors are functions that take a log event, modify it, and pass it along to the next processor in the chain. This modular approach allows us to customize the logging pipeline to suit our needs.\n\nHere’s the complete `logging.py` file for reference:\n\n\n `Generic` type is only available in `3.12` and up.\n\n\n \n\n```python title=\"logging.py\"\n\nimport logging.config\nimport uuid\nfrom typing import Any, Generic, TypeVar\n\nimport structlog\nfrom structlog.typing import EventDict\n\nfrom src.config import settings\n\nRendererType = TypeVar(\"RendererType\")\n\n\nLogger = structlog.stdlib.BoundLogger\n\n\ndef get_level() -> str:\n return settings.LOG_LEVEL\n\n\ndef drop_color_message_key(_, __, event_dict: EventDict) -> EventDict:\n \"\"\"\n Uvicorn logs the message a second time in the extra `color_message`, but we don't\n need it. 
This processor drops the key from the event dict if it exists.\n \"\"\"\n event_dict.pop(\"color_message\", None)\n return event_dict\n\n\nclass Logging(Generic[RendererType]):\n \"\"\"Customized implementation inspired by the following documentation:\n\n https://www.structlog.org/en/stable/standard-library.html#rendering-using-structlog-based-formatters-within-logging\n \"\"\"\n\n timestamper = structlog.processors.TimeStamper(fmt=\"iso\")\n shared_processors = [\n structlog.contextvars.merge_contextvars,\n structlog.stdlib.add_log_level,\n structlog.stdlib.add_logger_name,\n structlog.stdlib.PositionalArgumentsFormatter(),\n drop_color_message_key,\n timestamper,\n structlog.processors.UnicodeDecoder(),\n structlog.processors.StackInfoRenderer(),\n ]\n\n @classmethod\n def get_processors(cls) -> list[Any]:\n if settings.is_production():\n cls.shared_processors.append(structlog.processors.format_exc_info)\n\n return cls.shared_processors + [\n structlog.stdlib.ProcessorFormatter.wrap_for_formatter\n ]\n\n @classmethod\n def get_renderer(cls) -> RendererType:\n raise NotImplementedError()\n\n @classmethod\n def configure_stdlib(\n cls,\n ) -> None:\n level = get_level()\n\n if settings.is_production():\n cls.shared_processors.append(structlog.processors.format_exc_info)\n\n logging.config.dictConfig(\n {\n \"version\": 1,\n \"disable_existing_loggers\": True,\n \"formatters\": {\n \"myLogger\": {\n \"()\": structlog.stdlib.ProcessorFormatter,\n \"processors\": [\n structlog.stdlib.ProcessorFormatter.remove_processors_meta,\n cls.get_renderer(),\n ],\n \"foreign_pre_chain\": cls.shared_processors,\n },\n },\n \"handlers\": {\n \"default\": {\n \"level\": level,\n \"class\": \"logging.StreamHandler\",\n \"formatter\": \"myLogger\",\n },\n },\n \"loggers\": {\n \"\": {\n \"handlers\": [\"default\"],\n \"level\": level,\n \"propagate\": False,\n },\n # Propagate third-party loggers to the root one\n **{\n logger: {\n \"handlers\": [],\n \"propagate\": True,\n }\n for logger in [\n \"uvicorn\",\n \"sqlalchemy\",\n \"arq\",\n ]\n },\n },\n }\n )\n\n @classmethod\n def configure_structlog(cls) -> None:\n structlog.configure_once(\n processors=cls.get_processors(),\n logger_factory=structlog.stdlib.LoggerFactory(),\n wrapper_class=structlog.stdlib.BoundLogger,\n cache_logger_on_first_use=True,\n )\n\n @classmethod\n def configure(cls) -> None:\n cls.configure_stdlib()\n cls.configure_structlog()\n\n\nclass Development(Logging[structlog.dev.ConsoleRenderer]):\n @classmethod\n def get_renderer(cls) -> structlog.dev.ConsoleRenderer:\n return structlog.dev.ConsoleRenderer(colors=True)\n\n\nclass Production(Logging[structlog.processors.JSONRenderer]):\n @classmethod\n def get_renderer(cls) -> structlog.processors.JSONRenderer:\n return structlog.processors.JSONRenderer()\n\n\ndef configure() -> None:\n if settings.is_development():\n Development.configure()\n else:\n Production.configure()\n\n\ndef generate_correlation_id() -> str:\n return str(uuid.uuid4())\n```\n\nIn our `logging.py` file, we define a set of shared processors that are used across both development and production environments, in production, we also add the `format_exc_info` processor to include exception tracebacks in the logs.\n\n#### Integrating with Python's Logging Module\n\nTo ensure compatibility with third-party libraries that use Python's standard `logging` module, we configure a custom formatter (`ProcessorFormatter`) that converts traditional logs into Structlog's structured format. 
This is done using the `logging.config.dictConfig` method, which allows us to define:\n\n- A custom formatter (`myLogger`) that uses Structlog's ProcessorFormatter to process logs.\n\n- A `StreamHandler` to output logs to the console.\n\n- Loggers for third-party libraries like Uvicorn, SQLAlchemy, and Arq, which are configured to propagate their logs to the root logger.\n\nThis setup ensures that all logs, whether they originate from our application or third-party libraries, are formatted consistently and enriched with structured information.\n\n#### Development vs. Production Logging\n\nWe’ve implemented two logging configurations: one for development and one for production.\n\n- **Development**: Uses `ConsoleRenderer` to output colorful, human-readable logs to the console. This is ideal for debugging and local development.\n\n- **Production**: Uses `JSONRenderer` to output logs in JSON format. This makes it easier to parse and analyze logs using tools like ELK Stack, Datadog, or Splunk.\n\n#### Correlation IDs for Traceability\n\nTo improve traceability across distributed systems, we’ve added a utility function (`generate_correlation_id`) to generate unique correlation IDs. These IDs can be added to the logging context using Structlog's [contextvars](https://www.structlog.org/en/stable/contextvars.html) integration, ensuring that all logs related to a single request share the same correlation ID.\n\n### Adding Correlation ID Middleware\n\nA correlation ID is a unique identifier that is passed along with a request and included in all logs related to that request. This allows us to trace the flow of a request through the system, even when it spans multiple services.\n\nIn our FastAPI application, we achieve this by creating a custom middleware called `LogCorrelationIdMiddleware`. This middleware generates a unique correlation ID for each incoming HTTP request and binds it to the logging context using Structlog's `contextvars` integration. Here's how it works:\n\n```python title=\"middlewares.py\"\nimport structlog\nfrom starlette.types import ASGIApp, Receive, Scope, Send\n\nfrom src.logging import generate_correlation_id\n\n\nclass LogCorrelationIdMiddleware:\n def __init__(self, app: ASGIApp) -> None:\n self.app = app\n\n async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:\n if scope[\"type\"] != \"http\":\n return await self.app(scope, receive, send)\n\n structlog.contextvars.bind_contextvars(\n correlation_id=generate_correlation_id(),\n method=scope[\"method\"],\n path=scope[\"path\"],\n )\n\n await self.app(scope, receive, send)\n\n structlog.contextvars.unbind_contextvars(\"correlation_id\", \"method\", \"path\")\n```\n\n### Bring it all together\n\nTo use the middleware in your FastAPI application, simply add it to your app:\n\n```python title=\"app.py\"\nfrom fastapi import FastAPI\nfrom src.middlewares import LogCorrelationIdMiddleware\nfrom src.loggins import configure as configure_logging\n\nconfigure_logging()\n\napp = FastAPI()\napp.add_middleware(LogCorrelationIdMiddleware)\n```\n\n## Using the logger\n\nNow that we have everything configured, we can finally go ahead and use our new logger. 
Here is a simple example that shows the general usage pattern:\n\n```python {5, 7, 11, 21, 25, 29}\nimport uuid\n\nfrom fastapi import APIRouter, HTTPException\nfrom fastapi_async_sqlalchemy import db\nimport structlog\n\nfrom src.logging import Logger\nfrom src.models import models, schemas\nfrom src.crud import crud_hero\n\nlogger: Logger = structlog.get_logger()\n\nrouter = APIRouter(prefix=\"/hero\", tags=[\"heroes\"])\n\n\n# Demonstrate logger usage\n@router.get(\"/{hero_id}/ability\", response_model=schemas.HeroSchema)\nasync def get_hero_ability(hero_id: uuid.UUID):\n\n # Bind requested log ID\n logger.bind(requested_hero_id=hero_id)\n hero = await crud_hero.get(db.session, hero_id)\n\n if not hero:\n logger.warning(\"Hero not found\")\n raise HTTPException(status_code=404, detail=\"Hero not found\")\n await db.session.refresh(hero, attribute_names=[\"ability\"])\n\n logger.info(\"Successfully fetched hero ability\")\n return hero.ability\n```\n\n## Conclusion\n\nSetting up structured logging with Structlog and FastAPI is a game-changer for debugging, monitoring, and maintaining your application. While there’s more to explore, like exception handling, integrating observability tools such as [Logfire](https://logfire.pydantic.dev/), and other advanced techniques, this guide serves as a solid introduction to logging. With the basics in place, you’re well-equipped to build on this foundation and tailor your logging setup to your needs.\n\nThanks for reading this far! Autobots, Roll Out."],"filePath":[0,"src/content/notes/setting-up-structured-logging-in-fastapi-with-structlog.mdx"],"digest":[0,"ab8ff07f1fd77251"],"deferredRender":[0,true],"collection":[0,"notes"]}],[0,{"id":[0,"syncing_and_backing_up_obsidian_notes_with_syncthing_and_github"],"data":[0,{"title":[0,"Syncing and Backing Up Obsidian Notes with Syncthing and GitHub"],"excerpt":[0,"Sync your Obsidian notes across devices with Syncthing and back them up securely with GitHub in this step-by-step guide."],"publishDate":[3,"2025-01-18T00:00:00.000Z"],"tags":[1,[[0,"tools"],[0,"productivity"]]],"image":[0,"../../assets/syncing_and_backing_up_obsidian_notes_with_syncthing_and_gitHub.webp"]}],"body":[0,"import { Image } from \"astro:assets\"\n\nimport image1 from \"../../assets/syncing_and_backing_up_obsidian_notes_with_syncthing_and_gitHub.webp\"\n\n\n\n## Introduction\n\nI’ve made [Obsidian](https://obsidian.md/) my go-to tool for organizing thoughts, capturing ideas, and maintaining a robust knowledge management system. Obsidian stores all your notes within a Vault which is basically the main folder containing subfolders, markdown files, images, and other assets. While Obsidian offers options like iCloud or their paid Obsidian Sync service for syncing and storing vaults, I wanted a setup that gives me more control over how and where my notes are managed.\n\nIn this article, I’ll share how I synchronize my notes across devices and ensure they’re securely backed up. 
My hope is to inspire you to create a system that works seamlessly for your needs.\n\n## Overview of the Setup\n\nTo synchronize notes across devices and back them up securely, I rely on two tools:\n\n- [**Syncthing**](https://syncthing.net/): A peer-to-peer file synchronization tool that ensures real-time syncing across all your devices.\n- [**GitHub**](https://github.com/): A version control platform that keeps your notes backed up with a complete history of changes, providing an added layer of security and accessibility.\n\n## Syncthing\n\n### Setting Up Syncthing on Laptop\n\nTo run Syncthing, I chose to use [Docker Compose](https://docs.docker.com/compose/) instead of installing it directly from the Arch Linux repositories. While Syncthing is readily available in the official package repositories, I wanted to avoid potential version mismatches across my devices. Using Docker ensures that I can run the same version consistently on both my desktop and server, simplifying maintenance and compatibility.\n\nHere’s the Docker Compose file I use:\n\n```yml title=\"docker-compose.yml\"\nservices:\n syncthing:\n image: syncthing/syncthing\n container_name: syncthing\n hostname: my-syncthing\n environment:\n - PUID=1000\n - PGID=1000\n volumes:\n - /home/ouassim/st-sync:/var/syncthing\n - /home/ouassim/ouassim-notes:/var/syncthing/ouassim-notes # path to your obsidian vault\n ports:\n - 8384:8384 # Web UI\n - 22000:22000/tcp # TCP file transfers\n - 22000:22000/udp # QUIC file transfers\n - 21027:21027/udp # Receive local discovery broadcasts\n restart: unless-stopped\n healthcheck:\n test: curl -fkLsS -m 2 127.0.0.1:8384/rest/noauth/health | grep -o --color=never OK || exit 1\n interval: 1m\n timeout: 10s\n retries: 3\n```\n\nThis configuration mounts two volumes:\n\n- `/home/ouassim/st-sync` for Syncthing's configuration and data.\n- `/home/ouassim/ouassim-notes` for the Obsidian Vault, making it available for syncing.\n\nIt also exposes the necessary ports for the Web UI, file transfers, and local discovery. The health check ensures Syncthing remains operational, with automatic retries if issues occur.\n\nThe `restart: unless-stopped` directive ensures that Syncthing will automatically restart if it crashes or if the device (server or laptop) is rebooted. The container will only stop if you explicitly stop it with `docker compose down`.\n\nTo start up the container, simply run the following command in the directory where your `docker-compose.yml` is located:\n\n```sh\ndocker compose up -d\n```\n\nThis will start the Syncthing container in detached mode, allowing it to run in the background. Once Syncthing is running, you can access the Web UI by navigating to [http://localhost:8384](http://localhost:8384) in your browser.\n\n\n Make sure you setup password authentication, since by default the web\n interface isnt't protected by anything.\n\n\n### Creating a Shared Folder for Your Obsidian Vault\n\nOnce Syncthing is up and running, the next step is to create a shared folder for your Obsidian Vault. You can either select an existing Vault or create a new one specifically for syncing.\n\n#### Steps to Set Up the Shared Folder\n\n1. Open the Syncthing Web UI at [`http://localhost:8384`](http://localhost:8384).\n2. 
Click on **Add Folder** and configure the following:\n\n - **Folder Label**: A descriptive name for your Obsidian Vault (e.g., \"Obsidian Notes\").\n - **Folder Path**: The location of your Vault on the device (e.g., `/home/ouassim/ouassim-notes`).\n > Note: This path must match the folder you mounted in the Docker Compose volumes we shared earlier.\n\n3. Set up ignore rules to exclude unnecessary files. Navigate to the Ignore Patterns tab for the folder and add the following rules:\n\n ```txt title=\".stignore\"\n // WARNING: Syncthing does not synchronise this file (.stignore)\n // During setup on a new device, create a copy of the .stignore file located under \"50 Resources/\"\n\n // most important one. this keeps track of your open panes and files in the app\n .obsidian/workspace\n .obsidian/workspace.json\n\n // vault stats are not useful\n .vault-stats\n\n // Ignore the Git repository\n .git\n .gitignore\n ```\n\n These patterns exclude temporary files, Git-related files, and Obsidian-specific workspace configurations that don’t need to be synced.\n\n\n Syncthing supports file versioning to retain previous file versions, but I\n chose not to enable it as GitHub already handles version control and backups,\n which I'll cover later.\n\n\n### Setting Up Syncthing on Android\n\nTo synchronize my Obsidian Vault with my Android phone, I used [Syncthing-Fork](https://github.com/Catfriend1/syncthing-android), an optimized version of Syncthing available on the [Play Store](https://play.google.com/store/apps/details?id=com.github.catfriend1.syncthingandroid&hl=en). It’s lightweight and has a much lower impact on battery life, making it my top recommendation for Android users.\n\n#### Connecting the Android Phone to the PC\n\n1. **Install Syncthing-Fork** on your phone from the Play Store.\n2. Add your PC as a remote device:\n\n - On your phone, go to **Devices > Add > Scan QR Code**.\n - On your PC, open the Syncthing Web UI and click **Actions > Show ID**.\n - Scan the QR code from your phone.\n\n3. Accept the connection on both the phone and PC.\n\nThis process works seamlessly when both devices are on the same WiFi network.\n\n#### Sharing the Obsidian Vault Folder\n\n1. Create an empty folder at the root of your phone’s file system to store the synchronized files (e.g., `/storage/emulated/0/ObsidianVault`).\n2. On your PC, share the Obsidian Vault folder with your Android phone:\n - Open the folder settings in the Syncthing Web UI and click **Edit > Sharing**.\n - Select your phone as the target device.\n3. Accept the share notification on your phone.\n\n#### Configuring the Shared Folder on Android\n\n1. On your phone, set the newly created folder (e.g., `/storage/emulated/0/ObsidianVault`) as the destination for the shared files.\n2. Once synchronization starts, copy the `.stignore` file from your Vault to the root of the synced folder on your phone to apply the same ignore rules.\n\n#### Opening the Folder in Obsidian\n\nFinally, open the synchronized folder as a Vault in Obsidian on your phone. Once configured, Syncthing will automatically keep your notes synchronized across devices!\n\n## Backing Up Your Notes to GitHub\n\nWhile Syncthing keeps my notes synchronized across devices, I rely on GitHub for backups and version control. By using Git, I can track changes, revert to previous versions, and have an offsite backup in case of data loss.\n\n### Step 1: Initialize a Git Repository\n\n1. Navigate to your Obsidian Vault folder on your PC.\n2. 
Open a terminal and run the following commands:\n\n ```sh\n git init\n git remote add origin https://github.com//.git\n ```\n\n Replace `` and `` with your GitHub username and repository name.\n\n### Step 2: Create a `.gitignore` File\n\nTo avoid pushing unnecessary files to GitHub, create a `.gitignore` file in your Vault with the following content:\n\n```txt title=\".gigignore\"\n# Ignore vault stats\n.vault-stats\n\n# Ignore Syncthing versioned files\n\n.stversions\n```\n\n### Step 3: Commit and Push Changes\n\n1. Stage all files for the initial commit:\n\n ```sh\n git add .\n ```\n\n2. Commit the changes:\n\n ```sh\n git commit -m \"Initial commit of Obsidian Vault\"\n\n ```\n\n3. Push the changes to GitHub:\n\n ```sh\n git branch -M main\n git push -u origin main\n ```\n\n## Conclusion\n\nBy combining Syncthing and GitHub, I’ve set up a robust system for syncing and backing up my Obsidian notes across devices.\n\nThis setup might not be the simplest, but it’s tailored to my needs for privacy, flexibility, and control. If you’re looking for a similar solution, I hope this guide inspires you to take control of your own note synchronization and backup workflow.\n\nIf you have any questions or alternative approaches, feel free to share, I’d love to hear how others tackle this challenge!\n\nThanks for reading, Happy note-taking! 🚀"],"filePath":[0,"src/content/notes/syncing_and_backing_up_obsidian_notes_with_syncthing_and_gitHub.mdx"],"digest":[0,"8af15e93ba4421ad"],"deferredRender":[0,true],"collection":[0,"notes"]}],[0,{"id":[0,"boost_your_tailwind_workflow_with_eslint_and_prettier"],"data":[0,{"title":[0,"Boost Your Tailwind Workflow with ESLint and Prettier"],"excerpt":[0,"Consistency is key in any project, and Tailwind CSS is no exception. Learn how to set up your tools to enforce class name order and keep everything neat using ESLint and Prettier plugins tailored for Tailwind CSS."],"publishDate":[3,"2025-01-12T00:00:00.000Z"],"tags":[1,[[0,"tailwind"],[0,"setup"],[0,"tools"]]],"image":[0,"../../assets/boost-your-tailwind-workflow-with-eslint-and-prettier.webp"]}],"body":[0,"## Introduction\n\n[Tailwind CSS](https://tailwindcss.com/) is an essential tool for building modern UIs, but let’s be real, managing its class names can get messy fast. Ever found yourself staring at an HTML tag loaded with an endless string of utility classes? Or maybe you’ve caught yourself wondering, “Did I already add ‘px-4’ here?” Keeping class names consistent and ordered isn’t just about aesthetics; it’s about maintaining sanity in collaborative projects.\n\nIf you’ve ever struggled with disorganized Tailwind classes, you’re in the right place. In this post, I’ll walk you through how to make your Tailwind CSS setup strict and efficient. Using the [Prettier Tailwind CSS plugin](https://github.com/tailwindlabs/prettier-plugin-tailwindcss), you’ll learn how to sort class names following the official Tailwind team’s recommendations. Additionally, we’ll explore the [ESLint Tailwind CSS plugin](https://www.npmjs.com/package/eslint-plugin-tailwindcss), which not only enforces class order but offers even more capabilities to improve your workflow. 
By the end, you’ll have a clean, consistent codebase that’s easier to read, debug, and scale.\n\nLet’s dive in and clean up that class name chaos.\n\n## Tutorial: Setting Up a Strict Tailwind CSS Workflow\n\n\n I previously wrote\n [this](/notes/setting-up-a-nextjs-project-with-essential-best-practices) where\n I went into more detail on setting up both ESLint and Prettier. Feel free to\n check it out if you're interested!\n\n\n### Step 1: Install Required Dependencies\n\nTo get started, you’ll need to install some packages that will make this setup possible. Run the following command to add them to your project:\n\n> I will be using [`pnpm`](https://pnpm.io/) as my package manager, but feel free to use whatever works for you.\n\n```bash\npnpm install -D prettier prettier-plugin-tailwindcss eslint eslint-plugin-tailwindcss\n```\n\nIn addition to core ESLint and Prettier packages themselves, this installs both the [Prettier Tailwind CSS plugin](https://github.com/tailwindlabs/prettier-plugin-tailwindcss) for sorting class names and the [ESLint Tailwind CSS plugin](https://www.npmjs.com/package/eslint-plugin-tailwindcss) for validating and enforcing best practices.\n\n### Step 2: Configure Prettier\n\nNext, let’s set up Prettier to automatically sort Tailwind CSS class names. Open your `.prettierrc` (or [equivalent](https://prettier.io/docs/en/configuration.html)) file (or create one if it doesn’t exist) and add the following configuration:\n\n```txt title=\".prettierrc\" ins={2}\n{\n \"plugins\": [\"prettier-plugin-tailwindcss\"]\n}\n```\n\nPrettier will now automatically sort Tailwind CSS class names every time you format your code.\n\n### Step 3: Configure ESLint\n\n\n The steps below are for ESLint 8, which is soon to be deprecated, so you might\n consider upgrading your version. Check our this [migration\n guide](https://eslint.org/docs/latest/use/configure/migration-guide) for that.\n\n\nNow, let’s configure ESLint to enhance your Tailwind CSS setup. Open your `.eslintrc.json` file and update it with these settings:\n\n```json title=\".eslintrc.json\" ins={2,3}\n{\n \"plugins\": [\"tailwindcss\"],\n \"extends\": [\"plugin:tailwindcss/recommended\"]\n}\n```\n\n \n\n\nIf you’re using utility functions like cn from [shadcn/ui](https://ui.shadcn.com/) or cva from the [`class-variance-authority`](https://cva.style/docs) package, you’ll need to update your ESLint configuration to recognize these functions. Add the following to your `.eslintrc.json` file:\n\n```json title=\".eslintrc.json\"\n\"settings\": {\n \"tailwindcss\": {\n \"callees\": [\"cn\", \"cva\"],\n \"config\": \"tailwind.config.mjs\" // Update this path to point to your tailwind config.\n },\n},\n```\n\nThis is particularly useful when you dynamically generate class names, ensuring even complex class strings follow Tailwind's rules.\n\n\n\n \n\n\n The steps below are for ESLint 9.\n\n\nIn ESLint 9, they moved to use a new flat config format (typically configured in an `eslint.config.js` file), and they provide a very helpful [configuration migrator](https://www.npmjs.com/package/@eslint/migrate-config) that we can run on our existing config file to convert it to the new format. 
To use it do the following to your existing configuration file (`.eslintrc`, `.eslintrc.json`, `.eslintrc.yml`):\n\n```bash\npnpm dlx @eslint/migrate-config .eslintrc.json\n```\n\nThis will create a starting point for your `eslint.config.js` file but is not guaranteed to work immediately without further modification.\n\nIf you want to do it manually, you can create a new `eslint.config.js` file and paste the following to it.\n\n```js title=\".eslint.config.js\"\nimport path from \"node:path\"\nimport { fileURLToPath } from \"node:url\"\nimport js from \"@eslint/js\"\nimport { FlatCompat } from \"@eslint/eslintrc\"\n\nconst __filename = fileURLToPath(import.meta.url)\nconst __dirname = path.dirname(__filename)\nconst compat = new FlatCompat({\n baseDirectory: __dirname,\n recommendedConfig: js.configs.recommended,\n allConfig: js.configs.all,\n})\n\nexport default [\n ...compat.extends(\"plugin:tailwindcss/recommended\"),\n {\n settings: {\n tailwindcss: {\n callees: [\"cn\", \"cva\"],\n config: \"tailwind.config.mjs\", // Update this path to point to your tailwind config.\n },\n },\n },\n]\n```\n\n### Step 5: Verify the setup\n\nWith everything configured, it’s time to test the setup. Create a file with some unordered Tailwind CSS classes, like this:\n\n```jsx\n
\n```\n\nRun Prettier to format the file, and ESLint to validate it. After formatting, the class names will be ordered automatically:\n\n```jsx\n
\n```\n\n### Step 6: Add to Your Workflow\n\nTo make the most of this setup, integrate these tools into your workflow:\n\n- Use Prettier with your code editor (e.g., enable auto-formatting in VS Code).\n- Add ESLint to your CI/CD pipeline or Git hooks using a tool like Husky to enforce rules before committing code.\n\n> Check our my [previous blog post](/notes/setting-up-a-nextjs-project-with-essential-best-practices) for a detailed guide on optimizing your workflow.\n\n## Conclusion\n\nKeeping class names organized and consistent is crucial for maintaining a clean and readable codebase. With a little effort upfront, you can simplify your workflow, avoid common pitfalls, and create a more collaborative coding environment. Consistency makes all the difference.\n\nThanks for reading this far."],"filePath":[0,"src/content/notes/boost_your_tailwind_workflow_with_eslint_and_prettier.mdx"],"digest":[0,"872106d5ef19eaf3"],"deferredRender":[0,true],"collection":[0,"notes"]}],[0,{"id":[0,"ghostty-my-new-daily-driver"],"data":[0,{"title":[0,"Ghostty, My new daily driver (sorry Alacritty)"],"excerpt":[0,"After years of exploring different terminal emulators, I’ve finally found one that stands out, Ghostty. With its intuitive features, sleek design, and impressive performance, it’s now my daily driver. Here’s why Ghostty might just be the terminal emulator you’ve been waiting for."],"publishDate":[3,"2024-12-27T00:00:00.000Z"],"tags":[1,[[0,"linux"],[0,"ricing"]]],"image":[0,"../../assets/ghostty-my-new-daily-driver-sorry-alacritty.webp"]}],"body":[0,"import { Image } from \"astro:assets\"\n\nimport image1 from \"../../assets/ranger-ghostty.webp\"\nimport image2 from \"../../assets/ghostty-pywal.webp\"\n\n## Introduction\n\nOver the years, I’ve had the chance to use a wide range of terminal emulators, [Windows Terminal](https://github.com/microsoft/terminal), [Alacritty](https://alacritty.org/), [st](https://st.suckless.org/), and many others. Each had its quirks, strengths, and weaknesses, but none ever felt like the perfect fit.\n\nThen came the public release of [Ghostty 1.0](https://ghostty.org/). Seeing the buzz on Twitter and the excitement from developers raving about its features, I couldn’t resist giving it a try.\n\nAnd let me tell you, it delivered.\n\nAfter just a few hours of testing, Ghostty convinced me to make the switch from Alacritty, something I didn’t expect to happen anytime soon. It’s sleek, intuitive, and packed with features I didn’t even realize I needed. In this post, I’ll explore what makes Ghostty special, highlight some of my favorite features, and explain why it’s now my terminal of choice.\n\n## What is Ghostty?\n\nGhostty is a terminal emulator designed to be fast, feature-rich, and cross-platform, utilizing platform-native user interfaces and GPU acceleration.\n\nDeveloped by [Mitchell Hashimoto](https://mitchellh.com/), known for creating tools like [Vagrant](https://vagrantup.com/) and [Terraform](https://www.terraform.io/), Ghostty aims to provide a seamless and efficient command-line experience for developers and system administrators.\n\n## Why Ghostty Stands Out\n\nGhostty isn’t just another terminal emulator, it brings features that truly set it apart. Here are some of my favorite features:\n\n- **Simple Configuration**: Setting up Ghostty is refreshingly straightforward. 
Unlike some emulators that require fiddling with complex files, Ghostty’s configuration process is intuitive and beginner-friendly while still offering power for advanced users.\n\n here is my config for my current setup:\n\n ```ini title=\"config\"\n theme=\"/home/ouassim/.cache/wal/colors-ghostty\"\n font-size=22\n window-decoration=false\n\n confirm-close-surface = false\n clipboard-paste-protection = false\n unfocused-split-opacity = 1.0\n copy-on-select = clipboard\n\n background-opacity=0.8\n\n cursor-style = block\n cursor-style-blink = false\n\n shell-integration-features = no-cursor\n\n term = kitty\n ```\n\n \n \n For a full reference for my dotfiles you can find them\n [here](https://github.com/strlrd-29/.dotfiles).\n \n\n- **Terminal Splitting:** Need to run multiple commands in parallel? Ghostty’s native splitting feature allows you to divide your workspace into multiple panes effortlessly, keeping everything within a single window without the need for external tools like tmux.\n\n- **KiTTY Image Protocol Support:** Ghostty supports KiTTY’s image protocol, allowing you to view images directly in the terminal, no need for a separate viewer. This flexibility even extends to running graphical applications like [DOOM in the terminal](https://x.com/mitchellh/status/1813417577630068827), showcasing Ghostty’s impressive performance.\n\n## Tips\n\n### Use kitty to preview images with ranger\n\n1. Install [pillow](https://pypi.org/project/pillow/).\n2. Update your [ranger](https://github.com/ranger/ranger) config (it should be under `~/.config/ranger/rc.conf`) to include: `set preview_images true` and `set preview_images_method kitty`.\n3. Update your Ghostty config to include `term=kitty`.\n\nand voilà! now you should be able to preview images directly in your terminal.\n\n\n\n\n Use `zi` to toggle image preview in ranger.\n\n\n### Use Pywal with Ghostty\n\n1. Create a new template file for ghostty ([reference](https://ghostty.org/docs/features/theme#example)), and place it under `~/.config/wal/templates/colors-ghostty`\n\n```ini title='colors-ghostty'\npalette = 0={color0}\npalette = 1={color1}\npalette = 2={color2}\npalette = 3={color3}\npalette = 4={color4}\npalette = 5={color5}\npalette = 6={color6}\npalette = 7={color7}\npalette = 8={color8}\npalette = 9={color9}\npalette = 10={color10}\npalette = 11={color11}\npalette = 12={color12}\npalette = 13={color13}\npalette = 14={color14}\npalette = 15={color15}\nbackground = {background}\nforeground = {foreground}\ncursor-color = {cursor}\nselection-background = {background}\nselection-foreground = {foreground}\n```\n\n2. Update you ghostty config to include `theme=\"/home/<$USER>/.cache/wal/colors-ghostty\"`. Make sure to replace `$USER` with your user name.\n\n3. Run Pywal on a wallpaper.\n\n4. Restart Ghostty and it should work 🚀.\n\n\n\n## Conclusion\n\nFor now, I’m making Ghostty my daily driver, and I’m genuinely excited to see how it evolves with future updates. If you’re curious to try it yourself, I highly recommend checking it out.\n\nYou can find more details and join the conversation with other users on [Ghostty’s Discord](https://discord.gg/ghostty) or dive straight into it by visiting their official [website](https://ghostty.org/). 
Happy hacking!"],"filePath":[0,"src/content/notes/ghostty-my-new-daily-driver.mdx"],"digest":[0,"4227ebb5c90d6450"],"deferredRender":[0,true],"collection":[0,"notes"]}],[0,{"id":[0,"how-i-use-gnu-stow-to-manage-my-dotfiles"],"data":[0,{"title":[0,"How I use GNU Stow to organize my dotfiles"],"excerpt":[0,"Discover how GNU Stow simplifies the process of managing your dotfiles with symlinks, allowing for clean organization and seamless synchronization across machines."],"publishDate":[3,"2024-11-28T00:00:00.000Z"],"tags":[1,[[0,"dotfiles"],[0,"linux"],[0,"ricing"]]],"image":[0,"../../assets/how-i-use-gnu-stow-to-organize-my-dotfiles.webp"]}],"body":[0,"import { Image } from \"astro:assets\"\n\nimport image1 from \"../../assets/how-i-use-gnu-stow-to-organize-my-dotfiles.webp\"\n\n\n\n## Introduction\n\nOver the past year, I’ve gone through multiple solutions for managing my dotfiles. From using [bare repositories](https://www.atlassian.com/git/tutorials/dotfiles) to tools like [yadm](https://yadm.io). However, none felt efficient enough for my needs. Knowing how [symlinks](https://en.wikipedia.org/wiki/Symbolic_link) work, I remembered reading about using them to manage dotfiles and decided to explore this idea further. That's how I discovered [GNU stow](https://www.gnu.org/software/stow/).\n\n## Dotfiles\n\nAs an average Arch Linux enjoyer, I do have some configs laying around in my `home` directory, you can find my [dotfiles repository](https://github.com/strlrd-29/.dotfiles) on GitHub. Here’s how the repository is organized:\n\n```sh\n$ tree ~/.dotfiles\n/home/ouassim/.dotfiles\n├── alacritty\n├── flameshot\n├── git\n├── i3\n├── lazygit\n├── nvim\n├── picom\n├── polybar\n├── ranger\n├── rofi\n├── tmux\n├── wal\n├── zed\n└── zsh\n```\n\nEach directory groups related configurations, but the structure doesn’t directly affect the output.\n\n## About GNU Stow\n\nGNU Stow might not immediately seem like a natural fit for managing dotfiles, but it’s surprisingly effective. According to its official description:\n\n> GNU Stow is a symlink farm manager which takes distinct packages of software and/or data located in separate directories on the filesystem, and makes them appear to be installed in the same place.\n\nIn simpler terms, Stow mirrors the structure of one directory into another by creating symbolic links, making it perfect for managing a version-controlled directory of dotfiles and linking them to their appropriate locations.\n\n## How GNU Stow Works\n\nTo use Stow effectively, it’s helpful to understand these concepts:\n\n- **Package**: A set of files or directories to be \"installed.\" For dotfiles, this is a folder containing related configuration files.\n- **Stow directory**: The parent directory holding one or more packages.\n- **Target directory**: The location where symlinks will be created.\n\nWhen you \"stow\" a **package**, it creates symlinks in the **target directory** that point into the **package**.\n\nLet's say I have my dotfiles repository located in `~/.dotfiles`. Within this repository, I have a `zsh` **package**, containing the `.zshrc` dotfile:\n\n```sh\n$ pwd\n/home/ouassim/.dotfiles # <- repository\n\n$ find zsh\nzsh # <- package\nzsh/.zshrc # <- dotfile\n```\n\nThe **target directory** is my `home` directory, as this is where the symlinks need to be created. 
I can now stow the `zsh` package into the target directory like so:\n\n```sh\nstow --target=/home/ouassim zsh\n```\n\nStow will now create appropriate symlinks of the package into the target directory:\n\n```sh\n$ ls -l ~/.zshrc\nlrwxrwxrwx 1 ... /home/ouassim/.zshrc -> .dotfiles/zsh/.zshrc\n```\n\n \n\n\nThe default value for the `--target` option is the directory above the stow directory, in our case since our stow directory is `/home/ouassim/.dotfiles`, the target directory is `/home/ouassim`. Thus, the command can be simplified:\n\n```sh\nstow zsh\n```\n\n\n\nNote that you can stow packages that contain several files or even a complex directory structure. Let's look at the configuration for neovim which lives below `~/.config/nvim`:\n\n```sh\n$ pwd\n/home/ouassim/.dotfiles\n\n$ find nvim\nnvim\nnvim/.config\nnvim/.config/nvim\nnvim/.config/nvim/init.lua\n...\n```\n\nTo stow the neovim package:\n\n```sh\nstow nvim\n```\n\nVerify the symlink:\n\n```sh\n$ ls -l ~/.config/nvim\nlrwxrwxrwx 1 ... nvim -> ../.dotfiles/nvim/.config/nvim\n```\n\n## Removing Symlinks\n\nTo remove a package’s symlinks (unstow), use the `-D` or `--delete` option:\n\n```sh\n# unstow zsh package\nstow -D zsh\n```\n\nThis removes the symlink:\n\n```sh\n$ ls -l ~/.zshrc\nls: cannot access '/home/ouassim/.zshrc': No such file or directory\n```\n\n## Ignoring files and directories\n\nStow, by default, ignores certain files and directories, as defined in its [built-in ignore list](https://www.gnu.org/software/stow/manual/stow.html#Types-And-Syntax-Of-Ignore-Lists). You can customize this behavior by adding a `.stow-local-ignore` file in your stow directory.\n\nThe default ignore file includes:\n\n```txt\n# Comments and blank lines are allowed.\n\nRCS\n.+,v\n\nCVS\n\\.\\#.+ # CVS conflict files / emacs lock files\n\\.cvsignore\n\n\\.svn\n_darcs\n\\.hg\n\n\\.git\n\\.gitignore\n\\.gitmodules\n\n.+~ # emacs backup files\n\\#.*\\# # emacs autosave files\n\n^/README.*\n^/LICENSE.*\n^/COPYING\n```\n\n \n\n\n When you use a custom ignore file, it overrides the default settings. Be sure\n to include any defaults (e.g., `.git`, `README.md`) you still want to ignore.\n\n\n## Automating with a Makefile\n\nTo simplify the process, I use a `Makefile` in my dotfiles repository:\n\n```zsh\nall:\n stow --verbose --target=$$HOME --restow */\n\ndelete:\n stow --verbose --target=$$HOME --delete */\n```\n\nThe `--restow` flag ensures outdated symlinks are removed before creating new ones. Updating or cleaning up my dotfiles has become as simple as:\n\n```zsh\nmake # Update symlinks\nmake delete # Remove all symlinks\n```\n\n## Version Controlling Your Dotfiles\n\nUsing Git to track your dotfiles helps keep changes organized and enables easy synchronization across machines:\n\n```zsh\ngit init\ngit add .\ngit commit -m \"storing initial dotfiles\"\n```\n\nAdd a `.gitignore` to exclude unwanted files and a `README.md` to document your setup.\n\n## Conclusion\n\nI hope this has helped you understand better how to manage your dotfiles. While this method may not be for everyone, it’s my preferred approach.\n\nYou can check out my dotfiles repository [here](https://github.com/strlrd-29/.dotfiles). 
It’s a work in progress, but it might give you some inspiration.\n\nThanks for reading, PEACE ✌️."],"filePath":[0,"src/content/notes/how-i-use-gnu-stow-to-manage-my-dotfiles.mdx"],"digest":[0,"65ce5124029e10af"],"deferredRender":[0,true],"collection":[0,"notes"]}],[0,{"id":[0,"setting-up-a-nextjs-project-with-essential-best-practices"],"data":[0,{"title":[0,"Setting up a Next.js Project with Essential Best practices"],"excerpt":[0,"Learn to set up a Next.js project with essential best practices that enhance code quality, maintainability, and team productivity. This guide covers Prettier, ESLint, Git hooks, and more."],"publishDate":[3,"2024-11-14T00:00:00.000Z"],"tags":[1,[[0,"nextjs"],[0,"developer-experience"],[0,"tools"]]],"image":[0,"../../assets/setting-up-a-nextjs-project-with-essential-best-practices.webp"]}],"body":[0,"import { PinIcon } from \"lucide-react\"\nimport { Image } from \"astro:assets\"\n\nimport image1 from \"../../assets/rest-here-meme.webp\"\n\n## Introduction\n\nSetting up a Next.js project isn’t just about installing dependencies, it’s about creating a maintainable, high-quality foundation. In this guide, we’ll cover a proven setup for essential tools like Prettier, ESLint, Husky, and Commitlint to ensure consistency, catch issues early, and improve team workflows, helping you stay organized and productive.\n\n## Project Setup\n\n\n I'm using node `20.18.0` and{\" \"}\n pnpm `9.12.3` for this guide.\n\n\nStart by creating a new Next.js project:\n\n```bash\npnpm create next-app@latest\n```\n\nName your project when prompted, and select `Yes` for ESLint setup.\n\n```txt frame=\"terminal\"\n✔ What is your project named? … next-template\n✔ Would you like to use TypeScript? … No / Yes\n✔ Would you like to use ESLint? … No / Yes\n✔ Would you like to use Tailwind CSS? … No / Yes\n✔ Would you like your code inside a `src/` directory? … No / Yes\n✔ Would you like to use App Router? (recommended) … No / Yes\n✔ Would you like to use Turbopack for next dev? … No / Yes\n✔ Would you like to customize the import alias (@/* by default)? … No / Yes\nCreating a new Next.js app in /home/ouassim/next-template.\n```\n\nAfter the prompts, `create-next-app` creates a folder named after your project and installs the required dependencies.\n\n\nUse the `--yes` option to skip prompts and apply default values, which work well for most setups:\n\n```bash\npnpm create next-app@latest --yes next-template\n```\n\nSee the full [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app) CLI reference.\n\n\n\n## Engine locking\n\nEnsuring consistent Node.js versions across environments is essential for predictable functionality in both development and production. Specify the supported Node.js version in your `package.json`:\n\n> At the time of writing this, Next.js requires Node.js version `18.18.x` or later.\n\n```json title=\"package.json\"\n{\n ...\n \"engines\": {\n \"node\": \">18.8.x\"\n }\n ...\n}\n```\n\nSecond, add a `.npmrc` file next to `package.json`. This `.npmrc` configuration enforces the Node.js version specified in `package.json`, preventing incompatible Node.js versions from being used.\n\n```ini title=\".npmrc\"\nengine-strict=true\n```\n\nThe [`engine-strict`](https://docs.npmjs.com/cli/v7/using-npm/config#engine-strict) setting tells your package manager to stop with an error on unsupported versions. 
This looks like:\n\n```txt frame=\"terminal\"\n ERR_PNPM_UNSUPPORTED_ENGINE Unsupported environment (bad pnpm and/or Node.js version)\n\nYour Node version is incompatible with \"next@15.0.3(react-dom@19.0.0-rc-66855b96-20241106(react@19.0.0-rc-66855b96-20241106))(react@19.0.0-rc-66855b96-20241106)\".\n\nExpected version: ^18.18.0 || ^19.8.0 || >= 20.0.0\nGot: v18.12.1\n\nThis is happening because the package's manifest has an engines.node field specified.\nTo fix this issue, install the required Node version.\n```\n\n> Let's create our first commit:\n>\n> ```bash\n> git commit -m 'build: added engine locking'\n> ```\n>\n> Ignore the commit message structure for now, since we will be talking about it later on in this guide. Or jump directly to it [Commitlint](#commitlint).\n\n## ESLint\n\n[ESLint](https://eslint.org/) already comes installed and pre-configured when creating a new Next.js project.\n\nLet's add a bit of extra configuration to make it stricter than the default settings. If you disagree with any of the rules it sets, no need to worry, it's very easy to disable any of them manually. Use `.eslintrc.json` in the root directory to configure these additional rules.\n\n```json title=\".eslintrc.json\"\n{\n \"extends\": [\"next/core-web-vitals\", \"next/typescript\"],\n \"rules\": {\n // don't allow console.log statements and only allow info, warn and error logs\n \"no-console\": [\"error\", { \"allow\": [\"info\", \"warn\", \"error\"] }]\n }\n}\n```\n\nNext, we will add [`eslint-plugin-tailwindcss`](https://github.com/francoismassart/eslint-plugin-tailwindcss) for linting our tailwind classes (check for correct order, check for contradicting classnames...).\n\n```bash\npnpm add -D eslint-plugin-tailwindcss @typescript-eslint/parser\n```\n\nThen update your `.eslintrc.json` file:\n\n```json title=\".eslintrc.json\" ins={5,9,11-16}\n{\n \"extends\": [\n \"next/core-web-vitals\",\n \"next/typescript\",\n \"plugin:tailwindcss/recommended\"\n ],\n \"rules\": {\n \"no-console\": [\"error\", { \"allow\": [\"info\", \"warn\", \"error\"] }],\n \"tailwindcss/classnames-order\": \"error\"\n },\n \"overrides\": [\n {\n \"files\": [\"*.ts\", \"*.tsx\", \"*.js\"],\n \"parser\": \"@typescript-eslint/parser\"\n }\n ]\n}\n```\n\n> Let's commit our changes\n>\n> ```bash\n> git commit -m 'build: added eslint rule + tailwindcss eslint plugin'\n> ```\n\n## Prettier\n\n[Prettier](https://prettier.io/) is an opinionated code formatter that automatically formats our files based on a predefined set of rules.\n\nIt is only used during development, so I'll add it as a `devDependency`:\n\n```bash\npnpm add --save-dev --save-exact prettier\n```\n\nNext, create a config file `.prettierrc` with the following content:\n\n```json title=\".prettierrc\"\n{\n \"endOfLine\": \"lf\",\n \"semi\": false,\n \"singleQuote\": false,\n \"tabWidth\": 2,\n \"trailingComma\": \"es5\"\n}\n```\n\n> For the full configuration reference, check out the [official documentation](https://prettier.io/docs/en/options).\n\nNext, create a `.prettierignore` file that lists the different directories/files we don't want prettier to format:\n\n```txt frame=\"code\" title=\".prettierignore\"\nnode_modules\n.next\n```\n\nAdd the following scripts to format files manually or check formatting status in CI environments:\n\n```json title=\"package.json\"\n{\n ...\n \"scripts\": {\n ...\n // format all files\n \"format\": \"prettier --write .\",\n // check if files are formatted, this is useful in CI environments\n \"format:check\": \"prettier --check 
.\"\n },\n ...\n}\n```\n\nYou can now run:\n\n```bash\n# format files\npnpm run format\n\n# check if files are formatted\npnpm run format:check\n```\n\n### Tailwind CSS\n\n\n If you are not using Tailwind CSS in your project you can skip to the [Sort\n imports](#sort-imports) ⬇️ section.\n\n\nIn order to automatically sort tailwind classes following the class order recommended by the tailwind team, we will be adding a new plugin to prettier called [`prettier-plugin-tailwindcss`](https://github.com/tailwindlabs/prettier-plugin-tailwindcss):\n\n```bash\npnpm add -D prettier-plugin-tailwindcss\n```\n\nAnd then update your `.prettierrc` file and add `plugins` property to it:\n\n```json title=\".prettierrc\" ins={7}\n{\n \"endOfLine\": \"lf\",\n \"semi\": false,\n \"singleQuote\": false,\n \"tabWidth\": 2,\n \"trailingComma\": \"es5\",\n \"plugins\": [\"prettier-plugin-tailwindcss\"]\n}\n```\n\n### Sort imports\n\nNext up, we will be adding [`@ianvs/prettier-plugin-sort-imports`](https://github.com/IanVS/prettier-plugin-sort-imports) to our prettier config , this will allow us to sort import declarations using RegEX order.\n\nFirst, install it as a `devDependency`:\n\n```bash\npnpm add -D @ianvs/prettier-plugin-sort-imports\n```\n\nThen, update your `.prettierrc` file to be as follows:\n\n```json title=\".prettierrc\" ins={9-28}\n{\n \"endOfLine\": \"lf\",\n \"semi\": false,\n \"singleQuote\": false,\n \"tabWidth\": 2,\n \"trailingComma\": \"es5\",\n \"plugins\": [\n \"prettier-plugin-tailwindcss\",\n \"@ianvs/prettier-plugin-sort-imports\"\n ],\n \"importOrder\": [\n \"^(react/(.*)$)|^(react$)\",\n \"^(next/(.*)$)|^(next$)\",\n \"\",\n \"\",\n \"^types$\",\n \"^@/types/(.*)$\",\n \"^@/config/(.*)$\",\n \"^@/lib/(.*)$\",\n \"^@/hooks/(.*)$\",\n \"^@/components/ui/(.*)$\",\n \"^@/components/(.*)$\",\n \"^@/styles/(.*)$\",\n \"^@/app/(.*)$\",\n \"\",\n \"^[./]\"\n ],\n \"importOrderParserPlugins\": [\"typescript\", \"jsx\", \"decorators-legacy\"]\n}\n```\n\n> Let's commit our changes\n>\n> ```bash\n> git commit -m 'build: added prettier setup + tailwindcss plugin + sort imports plugin'\n> ```\n\n## Git Hooks\n\n[Git hooks](https://git-scm.com/book/ms/v2/Customizing-Git-Git-Hooks) are scripts triggered at various stages in the Git workflow, ideal for enforcing code quality checks.\n\n### Husky 🐶\n\nWe are going to use a tool called [husky](https://typicode.github.io/husky/).\n\nHusky is a tool that makes it easier to use Git hooks, it provides a unified interface for managing hooks.\n\n#### Install husky\n\n```bash\npnpm add --save-dev husky\n```\n\n#### `husky init` command\n\nThe `init` command is the new and recommended way of setting up husky in your project. 
> Let's commit our changes:
>
> ```bash
> git commit -m 'build: added prettier setup + tailwindcss plugin + sort imports plugin'
> ```

## Git Hooks

[Git hooks](https://git-scm.com/book/ms/v2/Customizing-Git-Git-Hooks) are scripts triggered at various stages in the Git workflow, ideal for enforcing code quality checks.

### Husky 🐶

We are going to use a tool called [husky](https://typicode.github.io/husky/). Husky makes it easier to use Git hooks by providing a unified interface for managing them.

#### Install husky

```bash
pnpm add --save-dev husky
```

#### `husky init` command

The `init` command is the new and recommended way of setting up husky in your project. It creates a `pre-commit` script in `.husky/` and updates the `prepare` script in your `package.json`.

```bash
pnpm exec husky init
```

Your `package.json` should now contain a `prepare` script:

```json title="package.json"
{
  ...
  "scripts": {
    ...
    "prepare": "husky"
  },
  ...
}
```

#### Adding a New Hook

Adding a new hook is as simple as creating a file. But first, delete the existing hook (`.husky/pre-commit`), since we will be adding our own later on.

Add a `pre-push` hook for building the code:

```bash
echo "pnpm run build" > .husky/pre-push
```

> Most Git commands include a `-n/--no-verify` option to skip hooks:
>
> ```bash
> git commit -m "..." -n # Skips Git hooks
> ```
>
> Don't do it in front of your boss 🤫.

### Commitlint

As you may have noticed from the previous commit messages, they follow a specific standard called [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/), which is a lightweight convention on top of commit messages.

For more details, see [Why Use Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/#why-use-conventional-commits).

To ensure that our commit messages follow this standard, we will use a tool called [commitlint](https://commitlint.js.org/), which acts as a linter for commit messages.

#### Install commitlint

```bash
pnpm add --save-dev @commitlint/{cli,config-conventional}
```

#### Configure commitlint to use conventional config

```bash
echo '{ "extends": ["@commitlint/config-conventional"] }' > .commitlintrc
```

Since we want to lint commit messages before they are created, we will use [Husky's](#husky-) `commit-msg` hook:

```bash
echo "pnpm dlx commitlint --edit \$1" > .husky/commit-msg
```

Let's test the hook by committing something that doesn't follow the rules. If everything works, you should see something like this:

```bash
git commit -m "foo: this will fail"

⧗ input: foo: this will fail
✖ type must be one of [build, chore, ci, docs, feat, fix, perf, refactor, revert, style, test] [type-enum]

✖ found 1 problems, 0 warnings
✖ Get help: https://github.com/conventional-changelog/commitlint/#what-is-commitlint

husky - commit-msg script failed (code 1)
```

### Lint staged

[Lint staged](https://github.com/lint-staged/lint-staged) optimizes Git workflows by running linters only on staged files, improving speed and ensuring the relevant files are checked before committing.

Install lint-staged:

```bash
pnpm add -D lint-staged
```

Then create a new file named `.lintstagedrc.js` in the root of your project and paste the following code into it:

```js
const path = require("path")

const buildEslintCommand = (filenames) =>
  `next lint --fix --file ${filenames
    .map((f) => path.relative(process.cwd(), f))
    .join(" --file ")}`

const prettierCommand = "prettier --write"

module.exports = {
  "*.{js,jsx,ts,tsx}": [prettierCommand, buildEslintCommand],
  "*.{json,css,md}": [prettierCommand],
}
```

The configuration above tells `lint-staged` to run `next lint --fix` and `prettier --write` on files that match the specified patterns.

Finally, let's add our `pre-commit` hook so that this runs every time a new commit is created:

```bash
echo "pnpm lint-staged" > .husky/pre-commit
```
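For reference, the `.husky/` directory now contains three one-line hooks, shown here together (contents exactly as created by the `echo` commands above):

```bash
# .husky/pre-commit: lint and format the staged files
pnpm lint-staged

# .husky/commit-msg: validate the commit message
pnpm dlx commitlint --edit $1

# .husky/pre-push: make sure the project still builds
pnpm run build
```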
> Let's commit our changes:
>
> ```bash
> git commit -m 'build: added husky + commitlint + lint-staged setup'
> ```

## VS Code setup

> I chose [VS code](https://code.visualstudio.com/) for this guide since it is the most used editor out there (I use [neovim](https://neovim.io/) btw...).

### Extensions

Make sure you have these extensions installed:

- [Prettier](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode)
- [ESLint](https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint)
- [Tailwind CSS IntelliSense](https://marketplace.visualstudio.com/items?itemName=bradlc.vscode-tailwindcss)
- [PostCSS Language Support](https://marketplace.visualstudio.com/items?itemName=csstools.postcss)

These VS Code extensions streamline code formatting, linting, and Tailwind utility usage directly in your editor.

### Workspace Settings

After configuring ESLint and Prettier, it is time to make VS Code use them automatically. To do this, first create a `.vscode` directory in the root of your project, then add a `settings.json` file with the following content:

```json title="settings.json"
{
  "editor.defaultFormatter": "esbenp.prettier-vscode",
  "editor.formatOnSave": true
}
```

This tells VS Code to use the Prettier extension as the default formatter and to format your files automatically every time you save.

> Rest here, warrior. It has been a long ride, but don't forget to commit your changes:
>
> ```bash
> git commit -m 'build: added vscode workspace settings'
> ```

## Recap

To summarise this whole article, here are the key points to keep in mind:

- A solid setup accelerates software development, allowing you to focus more on business goals.
- In team environments, these tools ensure consistency and uniformity, making collaboration smoother and more efficient.

That was it for me, PEACE ✌️.