Absortio

Email → Summary → Bookmark → Email

summarize

http://summarize.sh/ Jan 26, 2026 21:56

Excerpt

Fast summaries, in the CLI and the Chrome Side Panel.

Summary

Main Summary

"Summarize" is an innovative and highly versatile tool designed to turn links, files, and assorted media formats into concise, accurate summaries. Its core is an advanced extraction pipeline that ensures the quality and relevance of the processed content. The solution comes in two key interfaces: a **Command-Line Interface (CLI)** and a Chrome Side Panel extension.

Contenido

summarize

CLI + Chrome Extension

Summaries that live where you work.

Summarize turns links, files, and media into sharp summaries with a real extraction pipeline. Use the CLI for automation or the Chrome Side Panel for one-click summaries of the current tab. Supports local, paid, and free models.

Quickstart

npm i -g @steipete/summarize
summarize "https://example.com/article"

CLI

Fast summaries, scripted or interactive.

Built for automation: extract clean text, summarize with your model, and output JSON or Markdown. Works with URLs, PDFs, images, audio/video, YouTube, and podcasts.

  • Extract + summarize with Firecrawl fallback.
  • Media pipeline with transcript-first flow and Whisper fallback.
  • Scriptable output via --json, --extract, --metrics.
summarize "https://example.com" --length long
summarize "https://youtu.be/..." --youtube auto
summarize "/path/report.pdf" --model google/gemini-3-flash-preview

Chrome Extension

Summaries in the Side Panel, one click away.

A real Chrome Side Panel with a tiny local daemon. It streams Markdown summaries for the active tab, with auto-summary on navigation.

  1. Install the CLI + daemon.
  2. Load the unpacked extension.
  3. Open Side Panel and connect with the token.

Runs locally on your machine. The daemon is localhost-only and token-protected.
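
Since the daemon is meant to be reachable only from the Side Panel, a quick check that it listens on the loopback interface can be reassuring; the daemon's port is not documented here, so this is only a sketch:

# Hedged sketch: the daemon's port is unknown; this just lists loopback-bound listeners.
lsof -nP -iTCP -sTCP:LISTEN | grep 127.0.0.1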

Extraction

HTML → clean text → summary

Readability, markitdown, and Firecrawl fallback when sites fight back.
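
To see what the pipeline recovers before any model runs, the --extract flag listed under the CLI can be used on its own; writing the cleaned text to a file is an assumed usage, shown as a sketch:

# Hedged sketch: --extract is a documented flag; saving its output to a file is an assumption.
summarize "https://example.com/article" --extract > article.md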

Media

Podcast + YouTube aware

Prefers published transcripts, then Whisper when needed.
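
Local audio and video are listed as supported inputs, so a podcast episode on disk should flow through the same transcript-first path; the file path below is a placeholder:

# Hedged sketch: local audio/video is a documented input type; this path is illustrative.
summarize "/path/episode.mp3" --length long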

Models

Provider-agnostic

Local OpenAI-compatible gateways, paid providers, and OpenRouter free models.
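
Model selection goes through the --model flag shown earlier; the provider prefixes and model IDs below are illustrative assumptions, not a documented list:

# Hedged sketch: --model is documented; these identifiers are assumptions.
summarize "https://example.com/article" --model openrouter/meta-llama/llama-3.3-70b-instruct:free
summarize "https://example.com/article" --model ollama/llama3.1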

Outputs

Readable + scriptable

Streaming TTY output, ANSI Markdown, JSON diagnostics, and metrics.
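
The --metrics and --json flags listed under the CLI cover the scriptable side; their exact output shape is not documented on this page:

# Hedged sketch: combining the flags and the shape of the output are assumptions.
summarize "https://example.com/article" --json --metrics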

How it works

01. Fetch + extract

Pull the source, clean HTML, normalize, or convert to Markdown.

02. Transcript when needed

Use published transcripts, then Whisper fallback for media.

03. Summarize + format

LLM output, streaming in the CLI or Side Panel, with metrics.
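
Putting the three steps together on a media URL, reusing only flags shown above (combining them this way is an assumption):

# Hedged sketch: fetch/extract, transcript fallback, then a streamed summary with metrics.
summarize "https://youtu.be/..." --youtube auto --length long --metrics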
