Introduction
If you have been writing articles, you know the pain of getting some attention, and if you have been cross-posting your articles, you know that usually takes some time too. This task can be automated with a shell script. If you have been cross-posting articles on medium.com, dev.to and hashnode.com, then I have a treat for you.
Introducing crossposter.sh !!
What is Crossposter.sh?
Crosspost to dev.to/hashnode/medium from the command line.
Crossposter.sh is a shell script (Bash) that automates cross-posting to platforms like dev.to, medium.com and hashnode.com. The script takes in the markdown version of your post, along with a few inputs from you, and posts it to those platforms. You will require a token/key for each of those platforms to post from the command line. You can check out the official repository of Crossposter.
The actual script is still not perfect (it has a few bugs). Though it posts to dev.to and medium.com easily, posting to hashnode.com is buggy: it passes the raw markdown into the post, which doesn't render as desired. So, it's an under-development script; feel free to raise any issues or PRs on the official GitHub repo.
Run the script on a bash interpreter with the command:
bash crosspost.sh
For posting the article you need to provide the following details:
Front-Matter
Metadata about the post:
- Title of the post
- Subtitle of the post
- Publish status of the post (true or false)
- Tags for the post (comma-separated values)
- Canonical URL (original URL of the post)
- Cover image (URL of the post's image/thumbnail)
This information is a must for dev.to, especially the title. It should be provided in the same order as given below:
---
title: The title of the post
subtitle: The description of your article
published: true
tags: programming, anythingelse
canonical url: url of your original blog
cover_image: coverimage_url
---
There is no need to enclose any of the values in quotation marks. The published argument should be true if you want to publish the post and false if you want to keep it in your drafts.
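As an aside, pulling these fields out of a post is straightforward in Bash. Here is a minimal sketch (not the script's exact code; the file name post.md and the sed patterns are assumptions) that creates a sample post and extracts two fields:

```shell
# Sample post with front matter (a stand-in for your real markdown file).
cat > post.md <<'EOF'
---
title: My post
subtitle: A short description
published: true
tags: bash, cli
---
Hello from the CLI.
EOF

# Pull individual fields out of the front matter with sed:
# print only lines matching "key: " with the prefix stripped off.
title=$(sed -n 's/^title: //p' post.md | head -n 1)
published=$(sed -n 's/^published: //p' post.md | head -n 1)

echo "Posting \"$title\" (published=$published)"
# prints: Posting "My post" (published=true)
```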
In the demonstrations, we just need to enter the tokens once. The tokens are stored locally in the keys.txt file and retrieved later within the script.
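For reference, once everything is configured, keys.txt ends up with one credential per line, in the order described in the sections below (the values here are placeholders):

```
dev.to:<your dev.to API key>
medium:<your Medium Integration Token>
medium_id:<your Medium user id, fetched by the script>
hashnode:<your Hashnode Personal Access Token>
hashnode_id:<your Hashnode username>
```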
Posting on dev.to:
Posting on dev.to requires their API key, which can be generated on the Dev Community API Keys page. There you can generate a new key with any name you like. You just need to enter the key into the CLI once, or manually enter it in the keys.txt file in the format dev.to:key on the first line. It will be reused for future cross-posting whenever you execute the shell script (bash crosspost.sh).
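The request the script sends is in the spirit of the following sketch (the exact payload in crossposter.sh may differ; the title and body here are placeholders, and the articles endpoint is the one from the Forem API docs):

```shell
# Read the dev.to key from keys.txt.
key=$(sed -n 's/dev.to://p' keys.txt)

# Create a draft article on dev.to.
curl -X POST "https://dev.to/api/articles" \
  -H "Content-Type: application/json" \
  -H "api-key: $key" \
  -d '{"article": {"title": "My post", "body_markdown": "Hello from the CLI", "published": false}}'
```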
You can provide the front matter manually in your markdown file, or you will be prompted for the input. That is all you will require for posting on dev.to from the command line.
Let's see the script in action.
If you want to add more stuff to the post, you can check out the DEV.to API docs, which are powered by Forem; there are a ton of options you can hook into the front matter in the shell script.
NOTE: There is a limit of 10 requests per 30 seconds, so keep that in mind while testing the script and don't spam.
Posting on hashnode.com:
This part is still under development, as it only displays the raw markdown in the post. Also, tags are too heavy to implement from the API, as the id of every tag is required along with the slug and name. Still, it serves some purpose at least. For posting on hashnode.com, we need a Personal Access Token, which can be generated under Developer Settings. You will also require the user id of your hashnode account; you can get your user id/username from the settings tab under profile information. We require the username for posting to the publication blog, if any. As usual, the Personal Access Token is for interacting with Hashnode's GraphQL API. The API is quite user friendly and provides everything in one place. There are docs for running each and every query and mutation present in the API.
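To give a flavour of what talking to a GraphQL API from curl looks like, here is a rough sketch (the mutation name and fields are illustrative only; check Hashnode's API docs for the real schema, as the API was still evolving at the time of writing):

```shell
# Read the Hashnode token from keys.txt.
token=$(sed -n 's/hashnode://p' keys.txt)

# GraphQL queries/mutations travel as a JSON-wrapped "query" string.
curl -X POST "https://api.hashnode.com" \
  -H "Content-Type: application/json" \
  -H "Authorization: $token" \
  -d '{"query": "mutation { createStory(input: {title: \"My post\", contentMarkdown: \"Hello\"}) { message } }"}'
```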
You can paste the token when prompted by the script, or manually type it into the keys.txt text file as hashnode:token on the 4th line. Yes, it has to be the 4th line; that makes retrieval much easier and safer. Next, input the username when the script asks for it, or again type it on the 5th line of keys.txt as hashnode_id:username. Please enter the credentials from the script prompt, so as to avoid the errors and misconfigurations that can happen when doing it manually.
This will create the post on hashnode with the title, subtitle and cover image correctly, but it will mess up the content. I tried hard, but it's just not happening. There needs to be some character for newlines, as the API rejects the \r\n characters passed in, so I have substituted them with br tags, and the result is raw markdown.
As the Hashnode API is still under development and they keep bringing in changes and new features, the API should improve in its core functionality and make common queries much easier to create. So, I'll create an issue on GitHub for posting the actual content via the script.
So, this is the demonstration of posting on hashnode.
Posting on medium.com:
The Medium API is much more versatile and markdown friendly, though it has some limits on the number of posts you can make in a day. For posting on Medium.com, we will require the Integration Token, which can be generated on the settings tab. Similar to hashnode, you can name the token whatever you like and then get the token. Paste the token when prompted by the script, or manually type it into the keys.txt text file as medium:token on the 2nd line. We also require the Medium id, but we can get that from the token itself: inside the script, once the token is obtained, a curl command is executed to fetch the id, and it is stored on the next (3rd) line of the keys.txt file for actually posting on medium.com. That is all the configuration you need for posting on medium.com.
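The two calls involved look roughly like this (a sketch, not the script's exact code; the endpoints are from Medium's API docs, and parsing the id out of the JSON with sed is my own assumption):

```shell
# Read the Integration Token from keys.txt.
token=$(sed -n 's/medium://p' keys.txt)

# Fetch the user id associated with the token from the /v1/me endpoint.
id=$(curl -s -H "Authorization: Bearer $token" "https://api.medium.com/v1/me" \
  | sed -n 's/.*"id":"\([^"]*\)".*/\1/p')

# Create a draft post under that user.
curl -X POST "https://api.medium.com/v1/users/$id/posts" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $token" \
  -d '{"title": "My post", "contentFormat": "markdown", "content": "Hello", "publishStatus": "draft"}'
```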
There is some documentation on the Medium API; we can even post to a publication, which shall be added in the future. Cover images can also be posted on Medium; that is not currently done, but it can again be a #TODO. Tags are not yet rendered on Medium by the script: the ways we can parse strings are limited in Bash, but this might still be doable later. Most of the checkboxes are ticked, though: title, subtitle, cover image, canonical URL and, importantly, the content.
Let's look at posting on medium from the script.
All platforms:
Now, once you have configured everything, you can opt for choice 4, which posts to all platforms (dev.to, hashnode and medium). But as hashnode is not looking like a good option right now, there is also option 5 for only dev.to and medium.
Why use Crossposter.sh?
This may not be a big issue for most people, but it was a good side project to work on and learn more about how APIs work, and to get some ideas about the design of each platform. It is also quite time saving to cross-post to 3 different platforms within a minute or two. You can tailor your own script as per your specifications and desires.
So, if you are an author on all of the mentioned platforms, please give it a try. Contributions for other platforms are welcome. If you find anything unexpected, please report it in the issues tab.
Script
The script mostly leverages curl, sed, cat and some other basic utilities in Bash.
Using curl for posting the article to the APIs
Curl is a lifesaver of a command for this project; without this tool, the project might not be as great and efficient. Let's see some quick commands used in the script.
curl -H "Content-Type: application/json" -H "api-key: $key" -d '{"content": "'"$body"'"}' "$url"
So, the above command is quite basic; more options are added as per the specifications of each platform. But let us understand the structure of the command we are sending to the APIs. The first part is the header (-H); here we specify the content type that is going to get parsed and the api key to access the API. Next, we have the body or the data (-d); here we pass in the actual contents, which might be the front matter along with the markdown content. Finally, we have the url where we send the POST request, i.e. the API endpoint. The \ is the escape character, used to preserve the literal value of the next character; at the end of a line it escapes the newline, which lets us split a long command across multiple lines.
The weird-looking '"$body"' is used to expand the value of the variable body inside single quotes, since in Bash, variables are only expanded inside double quotes. We are using single quotes because we have to pass a JSON object, which already has double quotes in it.
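A quick way to see this quoting trick in isolation:

```shell
# Demonstrate variable expansion inside a single-quoted JSON template.
body="Hello, world"

# Wrong: single quotes prevent expansion, so the output keeps the literal $body.
echo '{"content": "$body"}'
# prints: {"content": "$body"}

# Right: briefly close the single quotes, expand "$body", then reopen them.
echo '{"content": "'"$body"'"}'
# prints: {"content": "Hello, world"}
```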
Using sed for editing text
Sed is a super-powerful stream editor; it's somewhat similar to Vim without an interface, only commands. We use this tool to manipulate the front matter for posting on the platforms, by parsing it into Bash variables. We also use it to enter the api keys inputted by the user into the file at a specific position, to retrieve later.
sed -i "1a title: $title" file.txt
Here, we are appending (a) the text title: $title after the 1st line; $title is the variable, so we are technically passing in the value of the variable title. We are editing the file file.txt in place (-i), i.e. we are editing it live without creating any temp or backup files.
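Here is the command run end to end on a tiny sample file (a sketch assuming GNU sed; on macOS/BSD sed the -i flag needs an argument, e.g. -i ''):

```shell
# A minimal file standing in for the post's front matter.
printf -- '---\npublished: true\n' > file.txt

# Append "title: ..." after the 1st line, editing file.txt in place.
title="My post"
sed -i "1a title: $title" file.txt

cat file.txt
# prints:
# ---
# title: My post
# published: true
```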
sed -n 's/dev.to://p' keys.txt
Here we are essentially getting the text after a particular pattern. In this case we search the keys.txt file for the string dev.to:, and everything after it until the end of the line is returned; we can then store it in a variable and do all sorts of operations.
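For example, with a keys.txt in the format described earlier (the token here is a placeholder):

```shell
# A sample keys.txt with a placeholder key on the first line.
printf 'dev.to:abc123token\n' > keys.txt

# Strip the "dev.to:" prefix and print only the matching line (-n + p).
key=$(sed -n 's/dev.to://p' keys.txt)
printf '%s\n' "$key"
# prints: abc123token
```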
Using awk for programmatic editing
awk '{print $0"\\r\\n"}' temp.md > temp.txt
AWK is a command-line utility for manipulating text or applying certain operations/patterns programmatically. We use this tool to add a literal \r\n to the end of each line (hence the doubled backslashes in the command); the APIs can't parse the file contents directly, so we have to add certain characters before the end of each line and do further operations.
cat temp.txt | tr -d '\r\n' > file.txt
After we have added the \r\n characters to the end of each line, we can simply use cat and tr to merge all the lines into a single line: tr -d deletes the real carriage return and newline characters, leaving the literal \r\n text as separators. This is how we pass the contents to the API more safely and concisely; of course, we then need to read the file into a variable.
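Putting the awk and tr steps together on a small sample (file names are placeholders):

```shell
# A two-line sample file standing in for the markdown post.
printf 'line one\nline two\n' > temp.md

# Append literal \r\n text to each line (doubled backslashes keep it literal),
# so the API can later interpret the sequences as newlines.
awk '{print $0"\\r\\n"}' temp.md > temp.txt

# Delete the real carriage returns/newlines, collapsing the file to one line.
body=$(cat temp.txt | tr -d '\r\n')
printf '%s\n' "$body"
# prints: line one\r\nline two\r\n
```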
OK, I won't bore anyone with more Bash, but those were some of the important commands in the script that form the backbone of the cross-posting and of handling text with the APIs.
Conclusion
So, we can see crosspost.sh is a Bash script that cross-posts markdown articles, with a bit of input, to 3 different platforms within a couple of minutes. This article was basically to demonstrate the project and its capabilities, while also highlighting the issues present. I hope you liked the project; please do try it and comment your feedback. Thank you for reading. Until next time, Happy Coding :)