How We Review Blocks and Plugins

Last updated:

Short version: we only publish hands‑on reviews. We download the product, install it on a WordPress site, put it through real‑world tasks, and record what works, what doesn’t, and whether it’s worth your time and money.

Our review principles

  1. Hands‑on, not hand‑wavy. Every review starts with a fresh install and real usage. No “impressions” from marketing pages.
  2. Real‑world first. We build something practical (a demo site or a small feature for a real project) to see how the product behaves beyond the homepage.
  3. Replicable tests. We document the versions, environment, and steps we took so other developers can reproduce results.
  4. Independent & transparent. We don’t sell positive coverage. If we use affiliate links or receive a license from a vendor, we say so—our scores aren’t for sale.
  5. Actionable takeaways. You’ll always get a clear verdict, pros/cons, and who the product is (and isn’t) for.

Our test environment

We test in two setups:

  • Sandbox: A clean WordPress site that mirrors our typical production stack.
  • Real‑world: When relevant, we trial the product in a real workflow to verify reliability over longer periods. Having built WordPress sites for the past 10 years, we have often already used a plugin in client projects before we review it.

Baseline setup we document in each review (a capture sketch follows the list):

  • WordPress version, PHP version
  • Plugin version(s) tested
  • Any special configuration relevant to the product
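
For illustration, here is the kind of WP‑CLI run we use to capture that baseline (a sketch, not our exact tooling):

  # Record core, PHP, and plugin versions for the review notes
  wp core version                               # WordPress version
  wp --info                                     # PHP version, OS, WP-CLI details
  wp plugin list --fields=name,version,status   # plugin version(s) tested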

How we test

1) Install & baseline

  • Fresh install, verify versions, snapshot the environment (see the sketch after this list).
  • Note any required dependencies or account connections.
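
Where useful, the fresh install and snapshot can be scripted with WP‑CLI, roughly like this (a sketch; the URL and credentials are placeholders):

  # Stand up a clean site
  wp core download
  wp config create --dbname=sandbox --dbuser=wp --dbpass=secret
  wp db create
  wp core install --url=example.test --title=Sandbox \
    --admin_user=admin --admin_password=secret --admin_email=admin@example.test
  # Snapshot the baseline database to diff against after testing
  wp db export baseline.sql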

2) Build something real

  • Recreate a common task the product claims to solve (e.g., SEO setup, forms, filtering, speed optimizations, design layout).
  • Record time‑to‑first‑success and any blockers or workarounds.

3) Stress & edge cases

  • Try unusual inputs, toggle key settings, switch themes, and test compatibility with a few popular plugins.
  • Update the product to the latest release to catch any regression or migration issues (see the sketch below).
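
Much of this pass is scriptable with WP‑CLI; a sketch (the theme and plugin slugs are examples, not an endorsement):

  # Switch themes to surface styling and markup conflicts
  wp theme activate twentytwentyfour
  # Update the product itself to check for upgrade regressions
  wp plugin update plugin-under-review
  # Toggle a popular neighboring plugin to look for conflicts
  wp plugin deactivate woocommerce && wp plugin activate woocommerce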

4) Performance spot checks

  • Observe editor responsiveness and page load impact.
  • Check request count/asset size changes and any obvious database bloat (see the sketch below).
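
A quick command‑line spot check looks roughly like this (a sketch; the URL is a placeholder, and the greps are a crude proxy for request count):

  # Load time and payload size of the front page, before vs. after activating the product
  curl -so /dev/null -w 'time: %{time_total}s  size: %{size_download} bytes\n' https://example.test/
  # Crude count of scripts and stylesheets in the rendered HTML
  curl -s https://example.test/ | grep -o '<script' | wc -l
  curl -s https://example.test/ | grep -o 'stylesheet' | wc -l
  # Watch for obvious database bloat
  wp db size --tables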

5) Support & docs quick‑scan

  • Search the docs for common tasks and errors; note clarity and depth.
  • When appropriate, submit a support query to evaluate responsiveness/accuracy.

6) Scoring & verdict

  • Fill in our rubric, compare to peers, and write plain‑English recommendations: who should use it, and who shouldn’t.

What we always disclose in a review

  • Any vendor involvement: free license provided, demo access, or early builds.
  • Affiliate links: whether we may earn a commission, at no extra cost to you.
  • Conflicts of interest: if we’ve worked with the vendor or a direct competitor.

Updates & re‑reviews

  • We monitor major releases and update reviews when changes affect our verdict (e.g., new features, performance improvements, pricing changes).
  • If a significant update lands, we add an “Update” note at the top with the date and what changed.
  • If a product backtracks on promises or introduces regressions, we lower the score and explain why.

Our rating scale (1–10)

  • 9–10 – Excellent: Best‑in‑class for its category. Few compromises; easy recommendation.
  • 7–8 – Very good: Strong choice with minor drawbacks or a few missing features.
  • 5–6 – Good: Solid for specific needs; some trade‑offs to be aware of.
  • 3–4 – Limited: Works, but with notable issues or better options available.
  • 1–2 – Not recommended: Major gaps, reliability problems, or poor value.

How we choose alternatives

When we suggest alternatives, we pick products that:

  • Target the same job‑to‑be‑done
  • Offer a meaningfully different trade‑off (price, performance, features)
  • Have active development and a history of updates

Corrections & feedback

We welcome vendor and reader feedback. If we get something wrong—or if the product changes—contact us and we’ll verify and update the review with a dated correction note.

Want us to review your product?

We’re happy to test new plugins, themes, and tools. Fill in this short form.

We don’t guarantee inclusion or a positive score, but we do promise a fair, hands‑on evaluation.

FAQ

Do you accept paid reviews?

No. We don’t sell positive coverage.

Do affiliate links affect scores?

No; see our disclosure.

Will you re-review after major updates?

Yes; we add a dated update note.

Can vendors preview drafts?

No, but we try to correct factual errors quickly.

How long does a review take?

Typically 1–2 months, depending on complexity and our current queue.
