22 commits
9c64e7a
Initialize DRC CI Flow
AL-255 Apr 29, 2026
cbe5942
Initialize DRC CI Flow: Minor fixes
AL-255 Apr 29, 2026
ad0708a
Initialize DRC CI Flow: Minor fixes, adding ci default parameters
AL-255 Apr 29, 2026
099bb78
Adding fixes for current_mirror_interdigitized_netlist's missing fing…
AL-255 Apr 29, 2026
120b7c5
Improving DRC CI flow to save the correct artifacts for the future LV…
AL-255 Apr 30, 2026
abfcc91
Initialize LVS CI Flow
AL-255 Apr 30, 2026
42e2b23
Improving DRC CI flow: Adding support for MAGIC (not recommended)
AL-255 Apr 30, 2026
149b19e
Enabling GF180 LVS CI; Fixing DRC/LVS low hanging fruits
AL-255 Apr 30, 2026
764e5ca
Fixing remaining opamp DRC violations
AL-255 Apr 30, 2026
9822c60
Adding missing p-guardring connections for LVS
AL-255 Apr 30, 2026
c0f7694
Fixing LVCM and FVF DRC violation in gf180
AL-255 Apr 30, 2026
efbd268
Minor: LVS run should produce a report, fixing permission error
AL-255 Apr 30, 2026
1cffda7
Bug fixes, diff_pair_cmirror_bias now passes LVS
AL-255 May 3, 2026
1bd12b0
apt can sometimes fail, in this case, the CI should auto retry
AL-255 May 3, 2026
af68670
apt can sometimes fail, in this case, the CI should auto retry (Minor…
AL-255 May 3, 2026
01402ef
Deadsnakes is too unreliable, switch drc.yml + lvs.yml to uv-installe…
AL-255 May 3, 2026
41833ca
Improving CI flow
AL-255 May 3, 2026
bff17d5
Major opamp fix, drc clean in gf180 and sky130
AL-255 May 4, 2026
56168a2
Fixing all the remaining LVS errors in the opamp for sky130, now sky1…
AL-255 May 4, 2026
b1e785f
Moving GF180 LVS to kLayout instead of MAGIC
AL-255 May 4, 2026
9a3fcf3
Preparing for GF180 opamp LVS fix
AL-255 May 4, 2026
7cf1d1e
Fixing GF180 LVS flow on Github Actions
AL-255 May 4, 2026
148 changes: 148 additions & 0 deletions .github/workflows/drc.yml
@@ -0,0 +1,148 @@
name: Cell DRC

on:
push:
branches: [main]
pull_request:
workflow_dispatch:

# Cancel superseded runs on the same branch — pushing N commits in a row
# shouldn't keep N CI runs going.
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true

jobs:
drc:
name: DRC (${{ matrix.pdk }})
runs-on: ubuntu-22.04
timeout-minutes: 30

# iic-osic-tools ships klayout, magic, netgen, and the sky130A / gf180mcuD
# PDKs pre-installed under /foss/pdks. The image is Ubuntu 24.04 with only
# Python 3.12, but glayout pins gdsfactory<=7.7.0 / numpy<=1.24, so we
# install Python 3.10 via uv (python-build-standalone, hosted on GitHub
# releases) and run glayout in a venv. We previously used the deadsnakes
# PPA but ppa.launchpadcontent.net was too flaky for CI.
#
# See https://github.com/iic-jku/iic-osic-tools.
container:
image: hpretl/iic-osic-tools:latest
options: --user root
# The image's entrypoint launches a UI manager; bypass it.
env:
PDK_ROOT: /foss/pdks
DEBIAN_FRONTEND: noninteractive
PYTHONUNBUFFERED: "1"
# The image sets PYTHONPATH to its 3.12 site-packages, which breaks
# python3.10 if inherited.
PYTHONPATH: ""
# GitHub Actions overrides the image's ENTRYPOINT with `tail -f`, so
# the iic-osic-tools entrypoint that normally enriches PATH with
# /foss/tools/{bin,klayout,...} never runs. Set it explicitly here
# so klayout/magic/etc. are on PATH for every step.
PATH: /foss/tools/bin:/foss/tools/sak:/foss/tools/kactus2:/foss/tools/klayout:/foss/tools/osic-multitool:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

strategy:
fail-fast: false
matrix:
pdk: [sky130, gf180]

# The iic-osic-tools image's default container shell is dash (`sh`), so
# `set -o pipefail` blows up. Force bash for every `run:` step.
defaults:
run:
shell: bash

steps:
- uses: actions/checkout@v4

- name: Cache uv + CPython 3.10
id: cache-uv
uses: actions/cache@v4
with:
# iic-osic-tools sets HOME=/headless even when running as root, so
# uv installs land here regardless of who launched the container.
path: |
/headless/.local/bin/uv
/headless/.local/bin/uvx
/headless/.local/share/uv
key: uv-py310-${{ runner.os }}-v1

- name: Install Python 3.10 (uv)
run: |
set -euxo pipefail
# uv installs CPython from python-build-standalone (GitHub releases),
# bypassing launchpad PPAs entirely. Skip the curl install when the
# cache already restored uv. `uv python install 3.10` is idempotent
# (no-op if 3.10 is already present in $UV_PYTHON_INSTALL_DIR).
if [ ! -x "$HOME/.local/bin/uv" ]; then
curl -LsSf https://astral.sh/uv/install.sh | sh
fi
echo "$HOME/.local/bin" >> "$GITHUB_PATH"
export PATH="$HOME/.local/bin:$PATH"
uv python install 3.10
# Resolve the absolute path so later steps don't depend on PATH.
echo "PYTHON310=$(uv python find 3.10)" >> "$GITHUB_ENV"

- name: Show tool versions
run: |
set -euxo pipefail
klayout -v
"$PYTHON310" --version
ls "$PDK_ROOT"

- name: Cache python venv
id: cache-venv
uses: actions/cache@v4
with:
path: ${{ github.workspace }}/.venv
# Bust the cache when setup.py changes (deps) or the bundled DRC
# decks change (paths embedded in glayout's editable install).
# v2 = switched interpreter from deadsnakes to uv; venvs cached under
# the v1 key symlink to a now-absent /usr/bin/python3.10.
key: drc-venv-py310-${{ runner.os }}-${{ hashFiles('setup.py', 'src/glayout/**/*.py') }}-v2
restore-keys: |
drc-venv-py310-${{ runner.os }}-

- name: Create venv and install glayout (cache miss)
if: steps.cache-venv.outputs.cache-hit != 'true'
run: |
set -euxo pipefail
rm -rf "$GITHUB_WORKSPACE/.venv"
"$PYTHON310" -m venv "$GITHUB_WORKSPACE/.venv"
. "$GITHUB_WORKSPACE/.venv/bin/activate"
# uv pip install is ~3-5x faster than pip for cold installs and
# picks up $VIRTUAL_ENV automatically after `activate`.
uv pip install -e .

# No "refresh editable install" step on cache hit: glayout's .pth points
# to $GITHUB_WORKSPACE which is stable across runs, so the restored venv
# imports the freshly checked-out source as-is.

- name: Run cell DRC
run: |
set -euxo pipefail
. "$GITHUB_WORKSPACE/.venv/bin/activate"
python tests/drc/run_cell_drc.py \
--pdk ${{ matrix.pdk }} \
--out-dir drc_results/${{ matrix.pdk }}

- name: Upload DRC artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: drc-${{ matrix.pdk }}
path: drc_results/${{ matrix.pdk }}
retention-days: 14

- name: Publish JUnit summary
# Skip when junit.xml wasn't produced (e.g. setup died before the
# runner could write it) — otherwise the publisher emits a misleading
# second red check on top of the real failure.
if: ${{ always() && hashFiles(format('drc_results/{0}/junit.xml', matrix.pdk)) != '' }}
uses: mikepenz/action-junit-report@v4
with:
report_paths: drc_results/${{ matrix.pdk }}/junit.xml
check_name: DRC report (${{ matrix.pdk }})
require_tests: true
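The DRC runner itself (`tests/drc/run_cell_drc.py`) is outside this diff. As a hypothetical sketch — the function name and result shape are assumptions, not the PR's code — the junit.xml contract that the publish step above depends on can be produced with nothing but the stdlib:

```python
import xml.etree.ElementTree as ET

def write_junit(results, suite_name="drc"):
    """Serialize [(cell, passed, message), ...] into JUnit XML.

    Illustrative helper only: emits the minimal shape that
    mikepenz/action-junit-report consumes (testsuite/testcase/failure).
    """
    failures = sum(1 for _, ok, _ in results if not ok)
    suite = ET.Element(
        "testsuite",
        name=suite_name,
        tests=str(len(results)),
        failures=str(failures),
    )
    for cell, ok, message in results:
        case = ET.SubElement(suite, "testcase", classname=suite_name, name=cell)
        if not ok:
            # A <failure> child marks the testcase red in the report.
            ET.SubElement(case, "failure", message=message).text = message
    return ET.tostring(suite, encoding="unicode")

xml_out = write_junit([("opamp", True, ""), ("fvf", False, "2 DRC violations")])
print(xml_out)
```

Because the workflow's `hashFiles(...)` guard only checks that junit.xml exists, the runner must write it even when every cell fails — only a crash before the results loop should leave it absent.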
174 changes: 174 additions & 0 deletions .github/workflows/lvs.yml
@@ -0,0 +1,174 @@
name: "Automated: Cell LVS"

# Triggered automatically when the DRC workflow finishes (success OR failure).
# DRC produces the GDS + reference netlists, so LVS does not have to rebuild
# the cells — it just downloads the DRC artifacts and runs netgen.
#
# Also runnable on demand to re-run LVS against the latest DRC artifact.
on:
workflow_run:
workflows: ["Cell DRC"]
types: [completed]
workflow_dispatch:
inputs:
drc_run_id:
description: "GitHub Actions run id of the DRC workflow whose artifacts to consume (defaults to latest successful run)."
required: false

# Cancel superseded LVS runs on the same source branch. workflow_run-triggered
# runs report github.ref as the default branch (where this file lives), so we
# fall back to the triggering DRC run's head_branch when present.
concurrency:
group: ${{ github.workflow }}-${{ github.event.workflow_run.head_branch || github.ref }}
cancel-in-progress: true

jobs:
lvs:
name: LVS (${{ matrix.pdk }})
runs-on: ubuntu-22.04
timeout-minutes: 30
# Run even when the triggering DRC workflow concluded with failure — its
# artifacts may be partial, but we still want LVS on the cells that
# passed DRC. Only a cancelled DRC run skips LVS entirely.
if: ${{ github.event_name != 'workflow_run' || github.event.workflow_run.conclusion != 'cancelled' }}

# download-artifact across workflow runs needs actions:read; the JUnit
# publisher (mikepenz/action-junit-report) needs checks:write to post the
# report check. workflow_run uses the parent's token, which is read-only
# by default once any `permissions:` block is declared, so checks:write
# must be granted explicitly — otherwise the publish step errors with
# "Resource not accessible by integration" and no report shows in the UI.
permissions:
contents: read
actions: read
checks: write

container:
image: hpretl/iic-osic-tools:latest
options: --user root
env:
PDK_ROOT: /foss/pdks
DEBIAN_FRONTEND: noninteractive
PYTHONUNBUFFERED: "1"
PYTHONPATH: ""
PATH: /foss/tools/bin:/foss/tools/sak:/foss/tools/kactus2:/foss/tools/klayout:/foss/tools/osic-multitool:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

strategy:
fail-fast: false
matrix:
# sky130 uses magic+netgen (via `pdk.lvs_netgen`); gf180 uses the
# bundled gf180mcu klayout LVS deck — magic+netgen mis-extracts the
# gf180 substrate (NMOS bulks merge into VDD via the n-well), so we
# drive the PDK's own run_lvs.py instead. The dispatch happens in
# tests/lvs/run_cell_lvs.py based on --pdk; both branches use the
# same DRC artifact and write the same summary.json/junit.xml shape.
pdk: [sky130, gf180]

defaults:
run:
shell: bash

steps:
- uses: actions/checkout@v4

- name: Download DRC artifact (drc-${{ matrix.pdk }})
uses: actions/download-artifact@v4
with:
name: drc-${{ matrix.pdk }}
path: drc_inputs/${{ matrix.pdk }}
# When triggered by workflow_run, pull the artifact from that run.
# Falls back to the current run when triggered manually.
run-id: ${{ github.event.workflow_run.id || inputs.drc_run_id || github.run_id }}
github-token: ${{ secrets.GITHUB_TOKEN }}

- name: Cache uv + CPython 3.10
id: cache-uv
uses: actions/cache@v4
with:
path: |
/headless/.local/bin/uv
/headless/.local/bin/uvx
/headless/.local/share/uv
key: uv-py310-${{ runner.os }}-v1

- name: Install Python 3.10 (uv)
run: |
set -euxo pipefail
# uv installs CPython from python-build-standalone (GitHub releases),
# bypassing launchpad PPAs entirely. Skip the curl install when the
# cache already restored uv. `uv python install 3.10` is idempotent.
if [ ! -x "$HOME/.local/bin/uv" ]; then
curl -LsSf https://astral.sh/uv/install.sh | sh
fi
echo "$HOME/.local/bin" >> "$GITHUB_PATH"
export PATH="$HOME/.local/bin:$PATH"
uv python install 3.10
echo "PYTHON310=$(uv python find 3.10)" >> "$GITHUB_ENV"

- name: Show tool versions
run: |
set -euxo pipefail
klayout -v
magic -d null -noconsole -T minimum </dev/null 2>&1 | head -5 || true
netgen -batch lvs -version 2>&1 | head -3 || true
"$PYTHON310" --version
ls "$PDK_ROOT"
# gf180 only: surface the resolved klayout LVS deck path so a
# PDK install hiccup shows up in the log instead of a cryptic
# FileNotFoundError later in the run step.
if [ "${{ matrix.pdk }}" = "gf180" ]; then
ver=$(cat "$PDK_ROOT/ciel/gf180mcu/current")
ls "$PDK_ROOT/ciel/gf180mcu/versions/$ver/gf180mcuD/libs.tech/klayout/tech/lvs/run_lvs.py"
fi

- name: Cache python venv
id: cache-venv
uses: actions/cache@v4
with:
path: ${{ github.workspace }}/.venv
# v2 matches drc.yml: switched interpreter from deadsnakes to uv.
key: drc-venv-py310-${{ runner.os }}-${{ hashFiles('setup.py', 'src/glayout/**/*.py') }}-v2
restore-keys: |
drc-venv-py310-${{ runner.os }}-

- name: Create venv and install glayout (cache miss)
if: steps.cache-venv.outputs.cache-hit != 'true'
run: |
set -euxo pipefail
rm -rf "$GITHUB_WORKSPACE/.venv"
"$PYTHON310" -m venv "$GITHUB_WORKSPACE/.venv"
. "$GITHUB_WORKSPACE/.venv/bin/activate"
uv pip install -e .

# No "refresh editable install" step on cache hit — see drc.yml.

- name: Sanity-check DRC inputs
run: |
set -euxo pipefail
ls -la drc_inputs/${{ matrix.pdk }}/gds || { echo "no gds/ in DRC artifact"; exit 1; }
ls -la drc_inputs/${{ matrix.pdk }}/netlists || { echo "no netlists/ in DRC artifact"; exit 1; }

- name: Run cell LVS
run: |
set -euxo pipefail
. "$GITHUB_WORKSPACE/.venv/bin/activate"
python tests/lvs/run_cell_lvs.py \
--pdk ${{ matrix.pdk }} \
--inputs-dir drc_inputs/${{ matrix.pdk }} \
--out-dir lvs_results/${{ matrix.pdk }}

- name: Upload LVS artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: lvs-${{ matrix.pdk }}
path: lvs_results/${{ matrix.pdk }}
retention-days: 14

- name: Publish JUnit summary
if: ${{ always() && hashFiles(format('lvs_results/{0}/junit.xml', matrix.pdk)) != '' }}
uses: mikepenz/action-junit-report@v4
with:
report_paths: lvs_results/${{ matrix.pdk }}/junit.xml
check_name: LVS report (${{ matrix.pdk }})
require_tests: true
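The per-PDK backend split described in the matrix comment (magic+netgen for sky130, the bundled klayout deck for gf180) lives in `tests/lvs/run_cell_lvs.py`, which this diff does not show. A hypothetical sketch of that dispatch — command names, flags, and the deck path are illustrative assumptions, not the PR's implementation:

```python
def lvs_command(pdk, gds, netlist, pdk_root="/foss/pdks"):
    """Return the LVS invocation for a given PDK.

    Sketch only: sky130 goes through netgen, gf180 through the PDK's
    own klayout run_lvs.py; real flags and deck paths will differ.
    """
    if pdk == "sky130":
        return ["netgen", "-batch", "lvs", gds, netlist]
    if pdk == "gf180":
        # Illustrative path; the real deck sits deeper, under
        # $PDK_ROOT/ciel/gf180mcu/versions/<ver>/gf180mcuD/libs.tech/...
        deck = f"{pdk_root}/ciel/gf180mcu/run_lvs.py"
        return ["python3", deck, "--layout", gds, "--netlist", netlist]
    raise ValueError(f"unknown pdk: {pdk}")

print(lvs_command("sky130", "opamp.gds", "opamp.spice"))
```

Keeping both branches behind one entry point is what lets drc.yml and lvs.yml share the same artifact layout and junit.xml shape.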
15 changes: 15 additions & 0 deletions .gitignore
@@ -244,3 +244,18 @@ cython_debug/
# refer to https://docs.cursor.com/context/ignore-files
.cursorignore
.cursorindexingignore
/.drc-cache/
.cursor/
/.claude/

# magic / netgen extract artifacts that get dropped in the cwd by lvs_netgen
*.ext
*.nodes
*.res.ext
*.lvsmag
*.sim
.*.*
.*/
_*.json
out/
drc_results/
9 changes: 9 additions & 0 deletions setup.py
@@ -22,6 +22,15 @@
"pandas>1.3.0,<=2.3.0",
"matplotlib>3.4.0,<=3.10.0",
"klayout>0.28.0,<=0.29",
# `docopt` is imported by gf180mcu's bundled `run_lvs.py` (under
# `$PDK_ROOT/ciel/gf180mcu/versions/<hash>/gf180mcuD/libs.tech/
# klayout/tech/lvs/run_lvs.py`). The gf180 LVS dispatch in
# `tests/lvs/klayout_gf180.py` execs that script via the active
# python3, so docopt must be importable from the venv that runs
# LVS — otherwise every gf180 LVS report contains only
# `ModuleNotFoundError: No module named 'docopt'` and the deck
# never runs.
"docopt",
"prettyprint",
"prettyprinttree",
"gdstk",
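Since a missing `docopt` only surfaces as a one-line `ModuleNotFoundError` inside every gf180 LVS report, a cheap local check before pushing is worthwhile. A hypothetical pre-flight snippet, not part of the PR:

```shell
# Report whether docopt is importable from the active interpreter;
# prints "docopt: ok" or "docopt: MISSING" without failing either way.
python3 - <<'EOF'
import importlib.util
found = importlib.util.find_spec("docopt") is not None
print("docopt:", "ok" if found else "MISSING")
EOF
```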