One of my daily morning routines is to scan several news sites and read the headlines to see if anything piques my interest. Or at least to keep a vague sense of what's going on nowadays.
I go to Hacker News, The Verge, The Guardian… and others. A mixed bag, really, of tech and world news.
Having the mind of an engineer… I wanted to optimize the time spent doing this. How? Glad you asked. Using the magic of RSS 🥰 I love RSS. Google tried to end RSS by killing Google Reader, but it is still going strong. I use it to follow the blogs I like and now the news I want to consume.
💡 Idea
Have a simple HTML page containing news headlines, with links to the articles, pulled from different sites (RSS feeds). For instance, two of the feeds I use:
Scraping the sites would be another approach, but it would complicate things so much more. If they ever change classes or markup, your selectors might stop working. With RSS you know what you're getting. As long as they keep updating the content on their feeds, we're good!
🙅‍♂️ I don't want a backend
I don't want any backend, database or API service. Just give me the stuff on a plain HTML page. Fast.
To do that, we need a CI to build this HTML, since there is no server to render it on demand. Better yet, why don't we schedule the build to run before I wake up?
🤖 NodeJS and GitHub Actions
We will use NodeJS because I know JavaScript but you could use another language. As long as you can fetch the RSS feeds and parse them, you're good.
With NodeJS I used this cool NPM package:
npm install --save rss-parser
Then something similar to this:
let Parser = require('rss-parser');
let parser = new Parser();

(async () => {
  let feed = await parser.parseURL('url-of-the-feed.xml');
  console.log(feed.title);

  feed.items.forEach(item => {
    console.log(item.title + ':' + item.link);
  });
})();
In my case I will loop through my RSS feed URLs which I have stored in a JSON file so I can easily add/remove items without changing the actual implementation.
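For illustration, such a JSON file could be as simple as a plain array of feed URLs (the `feeds.json` name and shape are my assumptions, not necessarily what the original repo uses; any structure you can iterate over works):

```json
[
  "https://news.ycombinator.com/rss",
  "https://www.theguardian.com/world/rss"
]
```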
I'm building an array of promises and using Promise.all to fetch all the feeds in parallel.
I'll parse all the contents and build up an HTML page. A list of links, really.
GitHub Actions
I'm in love with the simplicity of GitHub Actions, so easy to implement, so little friction, and yet so powerful.
I'm going to create an action that runs my NodeJS script, takes the generated HTML, and deploys it to GitHub Pages, which will simply host the HTML and CSS.
In my package.json I have a single build command:
{
  "name": "news-digest",
  "version": "1.0.0",
  "description": "A single place to check all my daily news sites in 1 go",
  "dependencies": {
    "rss-parser": "^3.12.0"
  },
  "scripts": {
    "build": "node index.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
Then in the GitHub Actions workflow:
name: Build and Deploy
on:
  push:
    branches:
      - main
  schedule:
    - cron: "0 6 * * *"
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout 🛎️
        uses: actions/checkout@v2
        with:
          persist-credentials: false

      - name: Install and Build
        run: |
          npm ci
          npm run build

      - name: Deploy 🚀
        uses: JamesIves/github-pages-deploy-action@releases/v4
        with:
          BRANCH: gh-pages # The branch the action should deploy to.
          FOLDER: dist # The folder the action should deploy.
Don't forget to set up GitHub Pages in the repository settings. You have to point it at the gh-pages branch.
👀 See it in action
Here's the live site, which I check every morning ☕️ 🥱 Also have a look on GitHub.
Would you do this differently? If so, how?