Replace Selenium and Playwright with Natural Language

Every team has one. A folder full of Selenium or Playwright scripts that nobody touches because:

  • They break every time the UI changes
  • They require an engineer to write and maintain
  • They need a local dev environment, Node.js, Python, package managers
  • They use selectors like #root > div:nth-child(3) > form > button.submit-btn that nobody can read
  • When they fail, the error message tells you what broke, not why

Product managers know what needs testing. They wrote the acceptance criteria. But there’s a wall between “user can complete checkout as a guest” and a 200-line Playwright test that implements it.

Gasoline removes that wall.

A text file. That’s it.

Instead of this:

// 47 lines of Playwright that break next sprint
const { test, expect } = require('@playwright/test');

test('guest checkout', async ({ page }) => {
  await page.goto('https://shop.example.com');
  await page.click('[data-testid="search-input"]');
  await page.fill('[data-testid="search-input"]', 'wireless headphones');
  await page.press('[data-testid="search-input"]', 'Enter');
  await page.click('.product-card:first-child .product-link');
  await page.click('#add-to-cart-button');
  await page.click('.cart-icon-wrapper > button');
  await expect(page.locator('.cart-count')).toHaveText('1');
  await page.click('[data-testid="checkout-btn"]');
  await page.fill('#shipping-name', 'Jane Doe');
  await page.fill('#shipping-address', '123 Main St');
  // ... 30 more lines
});

You write this:

Test: Guest Checkout
1. Go to the shop homepage
2. Search for "wireless headphones"
3. Click the first product in the results
4. Click "Add to Cart"
5. Open the cart
6. Verify the cart shows 1 item
7. Click "Checkout"
8. Fill in the shipping form: Jane Doe, 123 Main St, Portland, OR, 97201
9. Click "Continue to Payment"
10. Verify no errors on the page
11. Verify the URL contains "/checkout/payment"

Hand that to your AI (Claude Code, Cursor, Windsurf — any MCP-compatible tool) with Gasoline connected. The AI reads it, drives the browser, and reports the results.

No dev environment. All you need is a browser with the Gasoline extension and an AI tool. No npm install, no pip install, no package.json.

No selectors. You write “Click the Add to Cart button.” The AI figures out the selector. Gasoline supports semantic selectors — text=Add to Cart, label=Email, aria-label=Close — that target what elements mean. The AI picks the right one.
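To make the idea concrete, here is a minimal sketch of what "targeting what elements mean" looks like. The element list and the resolveSemantic helper are illustrative mock data, not Gasoline's implementation — the point is that text=, label=, and aria-label= match by meaning, not by DOM position.

```javascript
// Mock page elements (hypothetical data, for illustration only)
const elements = [
  { tag: "button", text: "Search", label: null, ariaLabel: null },
  { tag: "button", text: "Add to Cart", label: null, ariaLabel: null },
  { tag: "input", text: "", label: "Email", ariaLabel: null },
  { tag: "button", text: "", label: null, ariaLabel: "Close" },
];

// Resolve a semantic selector like "text=Add to Cart" against the elements.
// This is a sketch of the concept, not Gasoline's matching logic.
function resolveSemantic(selector, els) {
  const [kind, value] = selector.split("=");
  return els.find((el) => {
    if (kind === "text") return el.text === value;
    if (kind === "label") return el.label === value;
    if (kind === "aria-label") return el.ariaLabel === value;
    return false;
  });
}

console.log(resolveSemantic("text=Add to Cart", elements).tag);   // "button"
console.log(resolveSemantic("label=Email", elements).tag);        // "input"
```

A brittle positional selector like .product-card:first-child breaks when layout changes; a semantic selector survives anything that keeps the element's meaning intact.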

No maintenance. When the design team renames a button from “Checkout” to “Proceed to Checkout,” your test still works — the AI adapts. If the text changes so much that the AI can’t find the element, it tells you in plain English: “I couldn’t find a button matching ‘Checkout’. I see ‘Proceed to Checkout’ instead. Should I use that?”

No debugging. When a traditional test fails, you get TimeoutError: locator.click: Timeout 30000ms exceeded. When a natural language test fails, the AI tells you: “Step 7 failed — when I clicked Checkout, the page showed an error: ‘Your cart is empty.’ It looks like the Add to Cart action didn’t persist after navigation.”

Step 1: Open your AI tool with Gasoline configured.

Step 2: Paste your test script and ask the AI to run it:

Run this acceptance test against our staging environment and report results:
Test: Guest Checkout
1. Go to https://staging.shop.example.com
...

Step 3: Watch the browser. The AI navigates, clicks, types, and verifies — you see every action happen live.

Step 4: Read the report. The AI tells you which steps passed, which failed, and why.

Save your scripts in version control. They’re plain text files — they diff cleanly, they’re reviewable in PRs, and they live next to the code.

Use state checkpoints. Before running a test, save the browser state:

Before running: save the current browser state as "pre-test"
After the test: report results and restore "pre-test" state

The AI calls save_state and load_state behind the scenes.
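Conceptually, a checkpoint is a named snapshot that can be restored after the test mutates things. The sketch below models that with a plain object — the state shape and function bodies are assumptions for illustration; the real save_state and load_state tools operate on actual browser state.

```javascript
// Hypothetical model of state checkpoints (not Gasoline's internals)
const checkpoints = new Map();
let browserState = { url: "https://staging.shop.example.com", cartItems: 0 };

// Snapshot the current state under a name
function saveState(name) {
  checkpoints.set(name, { ...browserState });
}

// Restore a previously saved snapshot
function loadState(name) {
  browserState = { ...checkpoints.get(name) };
}

saveState("pre-test");
browserState.cartItems = 1; // the test adds an item to the cart
loadState("pre-test");      // restore the clean state afterward
console.log(browserState.cartItems); // 0
```

This is why the checkpoint instructions belong in the test script itself: the AI runs the save before step 1 and the restore after the report, so a failed run never leaves the browser dirty for the next one.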

Add verification steps. Don’t just check that clicks work — verify outcomes:

7. Click "Checkout"
8. Verify no console errors appeared
9. Verify the API call to /api/checkout returned 200
10. Verify the page shows the order confirmation number

The AI uses observe({what: "errors"}) and observe({what: "network_bodies"}) to verify these — checks that would require extra instrumentation to express as Selenium assertions.
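The verification logic the AI applies to that observed data is straightforward. In the sketch below, the observed object is mock data — the actual return shape of Gasoline's observe tool may differ — but it shows the kind of check steps 8 and 9 describe: no console errors, and a 200 from the checkout endpoint.

```javascript
// Mock observe() results (assumed shape, for illustration only)
const observed = {
  errors: [],
  network_bodies: [
    { url: "/api/checkout", status: 200, body: { orderId: "A-1001" } },
  ],
};

// Verify the outcomes, not just that the clicks happened
function verifyCheckout(obs) {
  const noErrors = obs.errors.length === 0;
  const call = obs.network_bodies.find((r) => r.url === "/api/checkout");
  return noErrors && call !== undefined && call.status === 200;
}

console.log(verifyCheckout(observed)); // true
```

The difference from a UI-only assertion: a page can look correct while the backend returned a 500 and the UI swallowed it. Checking the network body catches that.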

Selenium/Playwright vs. Gasoline + natural language:

  • Who writes tests: engineers → anyone who can describe the workflow
  • Language: JavaScript, Python, Java → English
  • Maintenance: constant (selectors break) → rare (semantic selectors adapt)
  • Setup: Node/Python + browser driver + CI config → browser extension + AI tool
  • Error messages: TimeoutError: selector not found → “The submit button isn’t visible because the form has a validation error on the email field”
  • Network verification: requires intercept setup → observe({what: "network_bodies"}) built in
  • WebSocket testing: not supported → observe({what: "websocket_events"}) built in
  • Visual verification: screenshot comparison libraries → AI sees the screenshot and reasons about it
  • Accessibility checks: separate library (axe) → analyze({what: "accessibility"}) built in
  1. Install Gasoline (Getting Started)
  2. Enable AI Web Pilot in the extension popup
  3. Write your first test — start with something you already test manually
  4. Paste it into your AI tool and say “run this test”
  5. Watch it work

You’ve been writing acceptance criteria in Jira your whole career. Now those criteria are the tests.