
Open Graph Extractor


A simple tool for scraping Open Graph and Twitter Card info from HTML.

API / Cloud Hosted Service

We offer this URL Scraping & Metadata Service as part of our scalable cloud API offering. You can try it here: URL Scraping & Metadata Service

Self-hosting - installation and usage instructions

Installation

Install the module with Yarn:

yarn add @devmehq/open-graph-extractor

Or with npm:

npm install @devmehq/open-graph-extractor

Examples

// Use your favorite request library; this example uses axios to fetch the HTML
import axios from 'axios';
import { extractOpenGraph } from '@devmehq/open-graph-extractor';

const { data: html } = await axios.get('https://ogp.me');
const openGraph = extractOpenGraph(html);

Results JSON

{
  ogTitle: 'Open Graph protocol',
  ogType: 'website',
  ogUrl: 'https://ogp.me/',
  ogDescription: 'The Open Graph protocol enables any web page to become a rich object in a social graph.',
  ogImage: {
    url: 'http://ogp.me/logo.png',
    width: '300',
    height: '300',
    type: 'image/png'
  }
} 
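The camelCase `og*` keys in the result mirror the `og:` meta properties in the page's HTML. As a rough standalone sketch of that mapping (illustrative only, not the library's actual implementation; `sketchExtract` is a made-up name):

```typescript
// Sketch: scan <meta property="og:..."> tags and collect them into an object,
// converting "og:title" into the camelCase key "ogTitle".
// Illustrative only -- not this library's implementation.
function sketchExtract(html: string): Record<string, string> {
  const result: Record<string, string> = {};
  const metaRe = /<meta\s+property=["']og:([^"']+)["']\s+content=["']([^"']*)["']\s*\/?>/gi;
  let m: RegExpExecArray | null;
  while ((m = metaRe.exec(html)) !== null) {
    const key = 'og' + m[1].charAt(0).toUpperCase() + m[1].slice(1);
    result[key] = m[2];
  }
  return result;
}

const sampleHtml = `
  <meta property="og:title" content="Open Graph protocol" />
  <meta property="og:type" content="website" />
`;
console.log(sketchExtract(sampleHtml));
// { ogTitle: 'Open Graph protocol', ogType: 'website' }
```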

Configuration options

customMetaTags

Here you can define custom meta tags you want to scrape. Default: [].

allMedia

By default, the extractor only returns the first image/video it finds; set this to true to return all of them. Default: false.

onlyGetOpenGraphInfo

Only fetch open graph info and don't fall back on anything else. Default: false.

ogImageFallback

Fetch other images if no open graph ones are found. Default: false.
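To illustrate what `ogImageFallback` means in practice, here is a standalone sketch of the fallback idea (an illustration of the behavior described above, not this library's code; the helper name `pickImage` is made up):

```typescript
// Sketch of the ogImageFallback idea: prefer an og:image meta tag;
// otherwise, only when the option is enabled, fall back to the first
// <img> tag found in the page. Illustrative only.
function pickImage(html: string, ogImageFallback: boolean): string | null {
  const og = /<meta\s+property=["']og:image["']\s+content=["']([^"']+)["']/i.exec(html);
  if (og) return og[1];
  if (!ogImageFallback) return null;
  const img = /<img\s[^>]*src=["']([^"']+)["']/i.exec(html);
  return img ? img[1] : null;
}

const noOg = '<img src="/banner.png" alt="banner">';
console.log(pickImage(noOg, false)); // null
console.log(pickImage(noOg, true));  // /banner.png
```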

Testing

yarn test

Contributing

Feel free to open an issue or create a pull request to fix bugs or add features. All contributions are welcome. Thank you!

License: MIT