wrf-rust community guide
Last verified: 2026-04-01
CPU-first · WSL 2 for Windows users · 3 km severe-weather sweet spot · ECMWF / GFS / HRRR / RAP

Get WRF running without burning a week on the wrong path.

This guide is written for weather-community users, especially people on Windows with gaming-PC specs who want a practical convection-allowing setup, good forcing options, realistic domain sizes, and a real escape route when the model, WPS, or WSL starts acting hostile. You can follow it manually from top to bottom or use the optional Codex skill as a copilot.

No AI required

The manual path is first-class. Codex is optional and should speed up the same workflow, not replace it.

Recommended first win

Single-domain 3 km, CPU-first, around 200 to 250 grid points each direction, short sanity run, then scale up.

Biggest beginner mistake

Running WRF from /mnt/c and leaving WSL at its default half-RAM setting.

Biggest hidden limiter

Output files. A big domain can fill your SSD faster than the compile ruins your mood.

Entry Point

Choose your path

This guide is fully usable without AI. The manual route and the Codex-assisted route both land on the same WSL, build, init-data, run-order, and troubleshooting sections below.

Do it manually

Use the page itself as the workflow. Follow the stable first path, size WSL correctly, build CPU WRF first, choose your forcing, run a short sanity case, then branch into troubleshooting only if needed.

  • Best for users who want plain instructions and commands.
  • Best for Discord handholding where you want everyone reading the same page.
  • No Codex install, prompts, or AI dependency required.

Use Codex as copilot

skills/wrf-community-onboarding

Use this if you want Codex to walk a user through the same workflow, help size hardware, explain forcing choices, and branch through common WRF and WSL failure cases.

mkdir -p ~/.codex/skills
cp -R skills/wrf-community-onboarding ~/.codex/skills/

After that, tell Codex to use the wrf-community-onboarding skill for WRF setup or troubleshooting help.

Manual Path

If this is your first run, do this exact manual path

  1. If you are on Windows, install WSL 2 with Ubuntu and keep all work under /home/<user>.
  2. Give WSL explicit RAM in %UserProfile%\.wslconfig. Do not trust the default.
  3. Build CPU WRF first. Build WPS second.
  4. Start with a single-domain 3 km run around 200 to 250 grid points each direction.
  5. Use ECMWF open data if you want the Euro path. Use GFS if you want the simplest public fallback.
  6. Run a short sanity case before any large event simulation.
  7. Validate the output with wrf-rust before scaling the domain, output frequency, or runtime.
  8. Only after that should you try larger domains or inner nests.

Why 3 km first

It is the right compromise for severe-weather users who want convection-allowing behavior without the brutal stability and storage cost of going straight to 1 km or finer.

Why CPU first

A stable CPU baseline removes half the ambiguity from every later problem. If you skip that, every crash turns into a platform argument.

Why small first

Domain ambition grows faster than confidence. A short working run teaches more than a giant dead run with three broken stages.

Optional

Use the Codex skill as a copilot, not a prerequisite

The repo includes a dedicated skill so Codex can guide users through WSL, WRF/WPS setup, domain sizing, data-source choice, and failure triage without re-deriving the whole workflow every time. It should reinforce the manual path above, not replace it.

What the skill covers

skills/wrf-community-onboarding

It is self-contained and ships with reference files for Windows + WSL, hardware tiers, build order, init-data choices, and a branching troubleshooting guide.

Install the optional skill

mkdir -p ~/.codex/skills
cp -R skills/wrf-community-onboarding ~/.codex/skills/

Tell Codex to use the wrf-community-onboarding skill when you want help with setup, sizing, data choice, or issue triage.

Sizing

Hardware and WSL calculator

Estimate a sane first target

This is a community recommendation layer on top of local WRF evidence. It is designed to keep Windows users out of the common "big domain, not enough headroom" trap.

Enter your exact host RAM in GB. For the default 32 GB starter preset, the calculator recommends:

  • WSL memory: 24 GB
  • WSL processors: 12 threads
  • WSL swap: 16 GB
  • Default 3 km size: 220 to 300 grid points each direction
  • Aggressive ceiling: around 300

Good first target: usable starter tier. Stay CPU-first and keep output intervals conservative.

Put this in C:\Users\YOU\.wslconfig

In File Explorer, click the address bar, type %UserProfile%, and press Enter. If the file does not exist yet, create a new text file, rename it to .wslconfig, and make sure it is not really .wslconfig.txt.

[wsl2]
memory=24GB
processors=12
swap=16GB

Starter presets

Click one to repopulate the calculator, the preflight checker, and the wizard below with a proven starting point.

32 GB 3 km starter: the repo default for a short convection-allowing first run.

Readiness

Run the self-reported preflight before you burn time on a dead setup

This is intentionally honest. A GitHub Pages site cannot inspect your machine, so this checker is questionnaire-driven. Treat it like a launch checklist: answer it honestly, then fix the blockers it surfaces before you compile or stage data.

Expected first-time result: Build next or Needs work. That is normal. The point is to find the cheap fixes before WRF makes them expensive.

Answer the checklist inputs honestly. For a typical fresh setup the readiness result reads "Build next": the platform looks sane enough, so finish the missing prep stages before you try a real case.

Generated .wslconfig

Save it at C:\Users\YOU\.wslconfig or %UserProfile%\.wslconfig. If Windows hides file extensions, turn them on first so you do not accidentally create .wslconfig.txt.

[wsl2]
memory=24GB
processors=12
swap=16GB

Discord help template

OS: Windows + WSL 2
Host RAM: 32 GB
Free disk: 200 GB
Workspace: /home/<user> Linux filesystem
Source: ECMWF open data
Target domain: 200 x 200 at 3 km
WRF built: no
WPS built: no
WPS geog present: no
Last successful step:
Exact error:

Windows

WSL 2 setup for weather-model users

Microsoft documents the current wsl --install flow for Windows 11 and Windows 10 version 2004+ (build 19041+). For WRF work, WSL 2 is the path you want.

Admin PowerShell

wsl --install -d Ubuntu
wsl --update
wsl --status
wsl -l -v

If installation stalls at 0.0%, Microsoft also documents:

wsl --install --web-download -d Ubuntu

Do not build here

/mnt/c/Users/you/wrf-work

Build here instead:

/home/you/wrf-work

Microsoft explicitly recommends keeping Linux-side project files in the Linux filesystem for the best performance.
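A quick way to check which side of that boundary you are on from inside Ubuntu (the workspace path below is illustrative):

pwd                              # a path starting with /mnt/ means Windows-backed storage: slow for builds
mkdir -p ~/wrf-work && cd ~/wrf-work
df -h .                          # should show a Linux filesystem, not a Windows drive mount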

Where does .wslconfig go?

It is a Windows-side file in your user profile, not inside Ubuntu. The normal path is C:\Users\YOUR_NAME\.wslconfig.

Easiest way to get there: open File Explorer, click the address bar, type %UserProfile%, and press Enter.

What if the file is missing?

That is normal on a new machine. Create it yourself, then paste the config block from this page into it and save.

notepad $env:UserProfile\.wslconfig

Most common beginner mistake

Windows creates .wslconfig.txt because file extensions were hidden. Turn on View > Show > File name extensions in File Explorer, then rename the file so it is exactly .wslconfig.

Default WSL memory

50%

WSL 2 defaults to about half your Windows RAM unless you override it.

Default WSL swap

25%

Swap defaults to about a quarter of host RAM, rounded up, which is not always enough for WRF prep + run spikes.

Best file bridge

\\wsl$

Use Windows Explorer through \\wsl$ instead of building inside the Windows filesystem.

Fast BIOS / virtualization triage

If you hit 0x80370102, the first suspects are BIOS virtualization, Virtual Machine Platform, or hypervisor launch state.

systeminfo.exe
bcdedit /enum | findstr -i hypervisorlaunchtype

When WSL refuses to behave

If you see 0x80070003, Microsoft documents that WSL must live on the system drive, usually C:. Also check that the distro storage is not compressed or encrypted on NTFS.

Why did my .wslconfig change do nothing?

Because WSL was never fully stopped. After editing %UserProfile%\.wslconfig, run wsl --shutdown and wait a few seconds before reopening Ubuntu.

More Windows-side stuck points

If you see 0x8007019e, the WSL Windows feature itself is usually not enabled yet. If WSL has no internet, restart it first, then check Microsoft's DNS/firewall troubleshooting guidance.

I cannot find where to put the .wslconfig file

Fastest path

  1. Open File Explorer.
  2. Click the address bar.
  3. Type %UserProfile% and press Enter.
  4. Look for a file named .wslconfig.
  5. If it is not there, create it yourself.

One-command shortcut

notepad $env:UserProfile\.wslconfig

If the file does not exist, Notepad will offer to create it.

I saved it, but Windows made a .txt file

Likely root cause

File extensions were hidden, so the real filename became .wslconfig.txt.

Fastest actions

  • In File Explorer, turn on View > Show > File name extensions.
  • Rename the file so it is exactly .wslconfig.
  • Reopen it and confirm the contents still look correct.

I saved the file, but WSL still uses the old memory limit

Likely root cause

WSL was not fully stopped, so the new config never got applied.

Fastest actions

wsl --shutdown

Wait a few seconds, then reopen Ubuntu and continue. If it still does not apply, confirm the filename is really .wslconfig and that it lives in %UserProfile%.
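To confirm the new limits actually took, check from inside Ubuntu after the restart; the numbers should line up with your .wslconfig values, give or take minor rounding:

free -h    # Mem total should sit near the memory= value; Swap near swap=
nproc      # should match processors=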

Build

Build WRF and WPS the low-drama way

Stable recommendation: build CPU WRF first, then WPS, then run a short case.

Community fast path on Ubuntu / WSL

sudo apt update
sudo apt install -y \
  build-essential gcc g++ gfortran make m4 perl csh tcsh flex bison \
  curl file git cmake \
  libnetcdf-dev libnetcdff-dev \
  libopenmpi-dev openmpi-bin \
  zlib1g-dev libpng-dev

This is the practical shortcut. If the compile becomes weird, fall back to the stricter official library-build workflow from the WRF compile tutorial.
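Before configuring WRF, it is worth a ten-second check that the shortcut actually put the toolchain where the build can see it. All four of these ship with the packages above:

gfortran --version
mpif90 --version
nc-config --version    # netCDF-C
nf-config --version    # netCDF-Fortran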

WRF first

git clone https://github.com/wrf-model/WRF.git
cd WRF
export NETCDF=/usr
export WRFIO_NCD_LARGE_FILE_SUPPORT=1
./configure
./compile em_real -j "$(nproc)"

Choose the GNU serial or GNU dmpar option that matches your intent. Do not rely on fixed menu numbers; they vary between versions and environments.
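The compile scrolls fast and can take a while, so a common community pattern (not required by the build system) is to capture the output while it runs and grep it afterward:

./compile em_real -j "$(nproc)" 2>&1 | tee log.compile
grep -n -m1 "Error" log.compile    # jump straight to the first real failure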

WPS second

git clone https://github.com/wrf-model/WPS.git
cd WPS
export WRF_DIR=/path/to/WRF
./configure --build-grib2-libs
./compile

For WPS v4.4+, the internal GRIB2-libs build path is the simplest official modern route.

Do not forget static geography

Download the WPS geographical static data from the official NCAR download page before you expect geogrid.exe to be useful. The highest-resolution mandatory package is the normal choice and is about 29 GB uncompressed.

Put it somewhere stable and point geog_data_path in namelist.wps at that directory.
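For example, if the static data was unpacked under /home/you/wrf-data/WPS_GEOG (an illustrative path, not a requirement), the &geogrid block in namelist.wps would point at it like this:

&geogrid
 geog_data_path = '/home/you/wrf-data/WPS_GEOG',
/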

Success conditions

Do not move on until you have:

  • main/wrf.exe
  • main/real.exe
  • geogrid.exe
  • ungrib.exe
  • metgrid.exe
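A one-shot existence check, assuming WRF and WPS were cloned side by side under ~/wrf-work:

ls -l ~/wrf-work/WRF/main/{wrf.exe,real.exe} \
      ~/wrf-work/WPS/{geogrid.exe,ungrib.exe,metgrid.exe}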

First WPS gate

Once WPS is built and the geography is in place, the very next thing is getting geogrid.exe to run cleanly. Do not skip that and jump straight to GRIB input.

Generator

Use the starter wizard to generate a sane first case

This wizard is deliberately narrow. It only targets the same single-domain, CPU-first, convection-allowing `3 km` baseline the rest of this guide recommends. The point is to remove decisions, not invent a thousand knobs.

The default ECMWF preset here is designed to match the repo starter pack exactly. If you change source, domain size, or `num_metgrid_*`, the output becomes your custom case and you should treat it that way.

Case inputs

Default preset: ECMWF open data, 32 GB 3 km starter.

Case summary

6-hour ECMWF case, 200 x 200 at 3 km, centered on 35.0, -97.0.

Projection: Lambert, truelat1 30.0, truelat2 60.0, stand_lon -97.0.

Validation

The wizard flags errors and warnings here before it generates files.

Source and Vtable plan

Use: Vtable.ECMWF_opendata

Generated namelist.wps

&share
 wrf_core               = 'ARW',
 max_dom                = 1,
 start_date             = '2026-04-03_00:00:00',
 end_date               = '2026-04-03_06:00:00',
 interval_seconds       = 3600,
 io_form_geogrid        = 2,
/

Generated namelist.input

&time_control
 run_days                            = 0,
 run_hours                           = 6,
 run_minutes                         = 0,
 run_seconds                         = 0,
/

Starter Pack

Start from proven file patterns and adapt them carefully

The repo now includes a starter file pack built from two things: a locally verified single-domain `3 km` CPU baseline, and a community member's durable ECMWF/open-data workflow patterns. The point is to reuse the good structure without pretending one exact case file is universal. The default ECMWF wizard preset above is designed to render these starter files exactly.

namelist.wps.ecmwf-starter

Single-domain `3 km` WPS starter with the community split-prefix `metgrid` pattern for pressure, surface, and soil streams.

namelist.input.ecmwf-starter

Single-domain `3 km` convection-allowing example with community-style severe-weather physics and conservative CPU-first sizing.

Vtable.ECMWF_opendata

Custom `ungrib.exe` table for ECMWF open-data GRIB fields. This is one of the highest-value community files because the wrong `Vtable` can quietly wreck the whole prep chain.

Read the starter-files README before using them. Edit the dates, domain center, geog path, and `num_metgrid_*` counts first. These files are a strong starting point, not a promise that your case can skip thinking. If you want the same defaults without hand-editing first, use the wizard above and then copy or download the generated files.
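A quick way to surface every line you are expected to touch before running anything (the key names below are the usual suspects; adjust the pattern if the starter files differ):

grep -nE "start_date|end_date|ref_lat|ref_lon|geog_data_path|num_metgrid" \
  namelist.wps.ecmwf-starter namelist.input.ecmwf-starter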

Forcing

Pick initialization data without getting lost in portal hell

Best default · Global · As of 2026-04-01

ECMWF open data

Best choice when someone says "I want Euro data." The current real-time open-data entry point is the Free & Open Data Portal at data.ecmwf.int/forecasts/.

API key? Usually no for the normal hobbyist real-time open-data path. That is the most important clarification on this page.

The API-key confusion usually comes from mixing the open-data portal with CDS archive-style access. Those are different workflows.

Universal fallback · Global · No key

GFS via NOMADS

If you want the simplest public fallback that works almost everywhere, use GFS from NOMADS.

Common files are in the gfs.tCCz.pgrb2.0p25.fFFF family.
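A minimal pull of a recent 0.25-degree cycle might look like the sketch below. The directory layout reflects the public NOMADS structure at the time of writing, so confirm it against the NOMADS index before scripting around it:

cyc=00
day=$(date -u +%Y%m%d)
base="https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod"
for fhr in 000 003 006; do
  curl -fO "${base}/gfs.${day}/${cyc}/atmos/gfs.t${cyc}z.pgrb2.0p25.f${fhr}"
done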

Fast refresh · CONUS · No key

HRRR and RAP

Use HRRR for CONUS storm work when you want a fast-refresh model. Use RAP when you want a broader regional fast-refresh option.

HRRR file families commonly include wrfprsf, wrfnatf, and wrfsfcf.

Useful community pattern for ECMWF + WPS: if you split ECMWF forcing into pressure, surface, and soil GRIB streams, a common WPS pattern is to run ungrib.exe separately with prefixes like FILE, SFILE, and SOILFILE, then point metgrid.exe at them with fg_name = 'SFILE', 'SOILFILE', 'FILE'.
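A minimal sketch of that three-pass pattern, assuming the streams were downloaded into separately named files (the *_pl, *_sfc, and *_soil globs are placeholders) and that you edit the prefix value in &ungrib between passes:

./link_grib.csh /data/ecmwf/*_pl.grib2     # prefix = 'FILE' in &ungrib
./ungrib.exe
./link_grib.csh /data/ecmwf/*_sfc.grib2    # prefix = 'SFILE'
./ungrib.exe
./link_grib.csh /data/ecmwf/*_soil.grib2   # prefix = 'SOILFILE'
./ungrib.exe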

Also expect some ECMWF GRIB2 downloads to arrive with CCSDS compression that can trip up ungrib.exe. If that happens, a practical workaround is decompressing them first with cdo -f grb2 copy input.grib2 output.grib2.

If you find community Python snippets for this workflow, treat them as patterns, not drop-in tooling. They often depend on a larger project wrapper, custom helper functions, and local path conventions.

Official 2024+ ECMWF note: if recent ECMWF GRIB2 files still fail in WPS, the WRF input-data FAQ documents a conversion step with grib_set -r -w packingType=grid_ccsds -s packingType=grid_simple input.grib2 output.grib2, then use Vtable.ECMWF from WPS v4.6+.

ECMWF open data
  Best for: global synoptic setup and "Euro" forcing.
  Key question: do I need an API key? For the open real-time portal, usually no. For CDS/archive workflows, yes.

GFS
  Best for: the simplest public fallback.
  Key question: can I get this easily? Yes. NOMADS is the default public route.

HRRR
  Best for: CONUS short-range storm work.
  Key question: should I start here? Only if your use case is regional and fast-refresh focused.

RAP
  Best for: broader North American fast refresh.
  Key question: when do I choose it? When HRRR is too narrow and you still want fast-refresh behavior.

3 km

How big a domain should you actually try?

16 GB host RAM
  Recommended first 3 km domain: 180 to 220 grid points. Aggressive ceiling: not a happy place.
  Reality check: below the pleasant range for big real-data convection-allowing WRF on Windows.

24 to 32 GB host RAM
  Recommended first 3 km domain: 220 to 300 grid points. Aggressive ceiling: around 300.
  Reality check: usable starter tier if WSL memory is set and output is restrained.

48 to 64 GB host RAM
  Recommended first 3 km domain: 300 to 350 grid points. Aggressive ceiling: around 350.
  Reality check: best value tier for a hobbyist 3 km single-domain setup.

96 GB and up
  Recommended first 3 km domain: 350 to 400 grid points. Aggressive ceiling: around 400.
  Reality check: disk and output management now become just as important as RAM.

Verified local 3 km anchor

200 x 200 x 80

Allocated about 1.86 GB for the domain in a working local baseline.

Verified local 1 km anchor

400 x 400 x 80

Allocated about 6.99 GB in a very aggressive local stress case.

Verified storage warning

56 GB

A single idealized stress output file reached this size. Output can become the real limiter.

Translation: the giant 3 km domain is usually not the first thing that should happen. The first thing that should happen is a clean small run, because then you know your pipeline is real.

Run Order

The only pipeline that matters

geogrid.exe -> ungrib.exe -> metgrid.exe -> real.exe -> wrf.exe -> wrf-rust sanity check
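In command form, one plausible pass through that chain looks like this. Paths assume WRF and WPS live side by side under ~/wrf-work; a dmpar build is shown, while a serial build drops mpirun and logs to the terminal instead of rsl files:

cd ~/wrf-work/WPS
./geogrid.exe
./ungrib.exe
./metgrid.exe
cd ~/wrf-work/WRF/test/em_real
ln -sf ~/wrf-work/WPS/met_em.d01.* .
mpirun -np 4 ./real.exe
mpirun -np 4 ./wrf.exe
tail rsl.out.0000    # dmpar builds write per-rank rsl logs here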

After real.exe

  • wrfinput_d01 exists
  • wrfbdy_d01 exists
  • Dates match the forcing window
  • No silent missing-file or bad-path issue

After wrf.exe

  • rsl.out.0000 ends with success
  • CFL is not exploding from the start
  • Output files are appearing at the interval you expected

Fast structural validation with wrf-rust

pip install wrf-rust
python -m wrf info wrfout_d01_YYYY-MM-DD_HH:MM:SS
python -m wrf stats wrfout_d01_YYYY-MM-DD_HH:MM:SS sbcape slp temp

What this does not prove

A file opening successfully does not prove the meteorology is good. It proves the pipeline produced a coherent output artifact. That is still a valuable first gate.

Disk

Output budgeting matters earlier than people think

Start with longer intervals

For the first real-data test, use conservative history_interval values. You do not need five-minute output to prove the pipeline works.

Keep the first run short

Run a short window first. A six-hour good run teaches more than a dead 24-hour giant run.

Watch free space live

Check df -h before launch. If the domain or output interval is aggressive, keep watching while the run is alive.

Local evidence from the workspace includes a single idealized output file near 56 GB. Do not wait until the SSD is almost full to think about output strategy.
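If you want a rough feel for output cost before launching, a crude per-frame estimate is grid points times variables times 4 bytes. The variable counts below are placeholder assumptions, not your registry's real counts, so treat the answer as order-of-magnitude only:

nx=200; ny=200; nz=50
v3=20    # assumed number of 3-D history variables
v2=40    # assumed number of 2-D history variables
echo "$(( (nx*ny*nz*v3 + nx*ny*v2) * 4 / 1000000 )) MB per history frame (rough)"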

Branching

Troubleshooting tree

0x80370102
  Most likely problem: virtualization / hypervisor path is not actually ready.
  First thing to do: check BIOS virtualization and hypervisorlaunchtype.

Problem opening file Vtable
  Most likely problem: wrong or missing `Vtable` symlink.
  First thing to do: link the correct file to the literal name Vtable.

unable to open GRIBFILE.AAA
  Most likely problem: GRIB files were not linked the way ungrib.exe expects.
  First thing to do: use link_grib.csh.

mandatory field TT was not found
  Most likely problem: bad Vtable, wrong source family, or wrong fg_name.
  First thing to do: recheck the upstream data family and intermediate prefixes.

real.exe made no wrfinput_d01
  Most likely problem: date window or met_em* mismatch.
  First thing to do: match dates all the way from forcing to namelist.input.

Build fails before you ever get wrf.exe

Likely root cause

The first real compile error matters. Most later errors are just fallout.

Fastest checks

  • Search the compile log for the first capital-E Error.
  • If you see Corrupt or Old Module file, treat that as the root, not the later missing objects.
  • Confirm compiler choice and netCDF path are consistent.
  • If netcdf.inc is missing, make sure netCDF-Fortran is really installed in the prefix you are using.

Fastest actions

  • Run ./clean -a.
  • If it still looks polluted, rebuild from a fresh tree.
  • Do not start debugging WPS until WRF itself builds.
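
A quick way to jump straight to the first real failure instead of scrolling, assuming you captured the build output to log.compile as suggested in the build section:

grep -n -m1 -B2 -A4 "Error" log.compile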

WPS behaves strangely or geogrid.exe / ungrib.exe / metgrid.exe are missing

Likely root cause

WRF did not actually build cleanly, or WPS was built with a different compiler or different netCDF path.

Fastest checks

  • Make sure WRF already built successfully.
  • Make sure WRF and WPS use the same compiler family.
  • Make sure WRF_DIR points at the compiled WRF tree.
  • For WPS v4.4+, prefer ./configure --build-grib2-libs.
  • If unresolved GOMP_* or __kmpc_* symbols appear, simplify the build path before doing anything exotic.

geogrid.exe fails with missing static data or bad nest dimensions

Common signatures

  • Could not open .../WPS_GEOG/.../index
  • dimension errors involving parent_grid_ratio

Fastest actions

  • Use an absolute geog_data_path.
  • Confirm the WPS geog directory actually exists and is complete.
  • Recheck nest sizes and parent ratios before touching anything else.

ungrib.exe fails with GRIBFILE.AAA, missing Vtable, or Data not found

Common signatures

  • edition_num: unable to open GRIBFILE.AAA
  • Problem opening file Vtable
  • Data not found: YYYY-MM-DD_HH:MM:SS.0000

Fastest actions

  • Use link_grib.csh so the files are linked as GRIBFILE.AAA, GRIBFILE.AAB, and so on.
  • Symlink the correct file to the literal name Vtable.
  • Confirm the GRIB files actually contain the dates you asked for.
  • If this is recent ECMWF GRIB2, try the official grib_set conversion rule before doing anything exotic.
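
A typical GFS-flavored prep sequence showing the first two fixes together (the data path is illustrative; the Vtable path is where the WPS repo ships its tables):

cd ~/wrf-work/WPS
./link_grib.csh /data/gfs/gfs.t00z.pgrb2.0p25.f*
ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
./ungrib.exe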

metgrid.exe fails with missing TT or Screwy NDATE

Common signatures

  • The mandatory field TT was not found in any input data
  • ERROR: Screwy NDATE: 0000-00-00_00:00:00

Fastest actions

  • Confirm the fg_name prefixes match the files produced by ungrib.exe.
  • Confirm ungrib.exe produced the expected intermediate files for every needed field family.
  • Recheck domain dates in namelist.wps.
  • Use rd_intermediate.exe if you need to inspect what WPS actually produced.

real.exe did not make wrfinput_d01 or wrfbdy_d01

Likely root cause

Missing met_em* files, wrong dates, bad run directory, or inconsistent I/O environment.

Fastest checks

  • Confirm met_em* exists where real.exe expects it.
  • Confirm your start_* and end_* values fit inside the forcing window.
  • Confirm the run directory contains the right tables and executables.

wrf.exe dies immediately

Known local pattern

There is local evidence of ierr=-1021 while opening wrfinput_d01, which is the kind of error that usually means the run directory or prep artifacts are wrong before the model ever gets to the interesting part.

Fastest checks

  • Make sure wrfinput_d01 and wrfbdy_d01 are present and readable.
  • Make sure you are running from the intended directory.
  • Check disk space and WSL memory before assuming the namelist is cursed.

wrf.exe runs, but CFL explodes

How to read this

If tens of thousands of points exceed CFL from the start, a completed run can still be garbage. Local 1 km evidence showed immediate severe w-cfl spikes even though the run technically finished.

Fastest actions

  • Reduce time_step.
  • Shrink the domain.
  • Return to a simpler, known-good physics setup.
  • Shorten the forecast window until the setup is stable.
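
On the time_step point, the commonly cited rule of thumb is roughly 6 times dx in kilometers, trimmed down toward 3 to 4 times dx while w-cfl warnings persist. For the 3 km baseline, that suggests a &domains entry something like:

&domains
 time_step = 15,
/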

Disk space vanishes while the run is alive

Likely root cause

Output cadence is too aggressive for the domain size and run length.

Fastest actions

  • Increase history_interval.
  • Shorten the run.
  • Reduce fields if you know how to use I/O control.
  • Check df -h before launching the next attempt.

FAQ

Fast answers to the questions people actually ask

Do I need an ECMWF API key?

As of 2026-04-01, not usually for the common real-time open-data portal path. API keys still matter for CDS or archive-style access.

Can I run 3 km WRF on 16 GB RAM?

You can run something, but it is below the pleasant range for the kind of severe-weather setup most people here want.

Should I start with HRRR instead?

Only if your problem is specifically CONUS and short-range. HRRR is not the universal default.

Should I build from /mnt/c?

No. Keep the whole workflow inside the Linux filesystem and access it from Windows through \\wsl$.

Where do I put .wslconfig?

In your Windows user folder, usually C:\Users\YOUR_NAME\.wslconfig. It does not live inside Ubuntu.

Why does ungrib.exe say GRIBFILE.AAA or Vtable is missing?

Because it expects GRIB files to be linked to GRIBFILE.* names and it expects a file literally named Vtable in the WPS run directory.

What should I use this repo for after the run?

Use wrf-rust to inspect the output fast, confirm the file is healthy, and pull the fields you care about without a heavyweight Python stack.

Sources

Primary references used for the current parts of this guide

  1. Microsoft Learn: Install WSL
  2. Microsoft Learn: Set up a WSL development environment
  3. Microsoft Learn: Advanced settings configuration in WSL
  4. Microsoft Learn: Basic commands for WSL
  5. Microsoft Learn: Troubleshooting WSL
  6. Microsoft Learn: Hyper-V host hardware requirements
  7. WRF Model docs: compiling WRF and WPS
  8. WRF Users Guide
  9. NCAR: WPS geographical static data downloads
  10. WRF Model docs: WPS users guide
  11. WRF Model docs: input data FAQ
  12. WRF Model docs: complete installation of WRF and WPS
  13. WRF tutorial: troubleshooting exercise
  14. WRF Model docs: namelist.wps best practices
  15. ECMWF Free & Open Data Portal
  16. ECMWF open data: real-time forecasts from IFS and AIFS
  17. ECMWF: registration vs anonymous access
  18. ECMWF: entire real-time catalogue open from 2025-10-01
  19. ECMWF CDS API setup
  20. NOAA NOMADS
  21. NCEP GFS products
  22. NCEP HRRR products
  23. NCEP RAP products
  24. NOAA Rapid Refresh / HRRR