LLMs on the command line

Large Language Models (LLMs) are becoming ubiquitous, driving a wide array of products across diverse use cases. This post is about how I’m learning more about their behaviour by integrating them into my workflows on the command line. As a society and as individuals, we’re still in the process of understanding how LLMs can be leveraged to solve real-world problems. I’m going to give a few examples of how I’m using LLMs on the command line to deepen my understanding of their capabilities and explore their practical applications.

I’ll probably keep updating this page with new use cases as I go. I’ve also come across another blog post in a similar vein to this one.

I’ve been using the Python LLM library[1] to expose LLMs on the command line. My reasoning is that there’s nothing more fundamental than working things through on the command line using basic data structures, file formats and Linux utilities.
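
Getting set up only takes a couple of commands. A minimal sketch, assuming you have an OpenAI API key to hand (the final prompt is just a sanity check):

~ pip install llm
~ llm keys set openai   # paste an API key when prompted
~ llm 'three short dog names'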

I’ve used LLM on my personal machine calling out to OpenAI. At work I use local LLMs[2]: proxies, IP and InfoSec are problems only solved when engineering curiosity and business needs align. I’ve used models like Llama 3.1 and Mistral 7B Instruct with varying levels of performance[3].
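
For the local case, LLM’s plugin system covers it. A rough sketch, assuming the llm-gpt4all plugin; the model identifier is illustrative and will depend on what you’ve downloaded:

~ llm install llm-gpt4all
~ llm models            # list the model identifiers now available
~ llm -m mistral-7b-instruct-v0 'hello there'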

Use cases

Reviewing code

It can be used as a budget GitHub Copilot to help review individual files. Here’s an example:

~ cat index.html | llm 'where can I improve this file?'
Your HTML file is well-structured and includes important elements for a personal website.
However, there are several areas where you can improve it to enhance accessibility,
SEO, and overall user experience. Here are some suggestions:

Sometimes I’ll run a file through to check whether I’ve misconfigured something before setting off a long-running process. At this point I’m essentially doing prompt engineering on top of the previous command.

~ cat .github/workflows/deploy-site.yml \
  | llm 'can you spot any inaccuracies or mistakes in this file?'
Your GitHub Actions workflow file looks mostly good,
but here are some points to consider or correct:

1. **Branch Naming**: Make sure that the branches `"gh-actions"`
and `"master"` match exactly with your repository branches. If
your default branch is something other than `"master"` (like `"main"`)
, you should update it accordingly.
...

Also see this article on using ripgrep and llm for understanding how a codebase works.
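
If the same reviewing instructions keep coming up, llm’s --system flag keeps them out of the pipeline itself. A minimal sketch; the prompt wording is my own:

~ cat .github/workflows/deploy-site.yml \
  | llm --system 'You are reviewing CI configuration. List concrete mistakes only, one per line.'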

Summarising files

It’s nice to be able to summarise all the changes made in a given branch. It’s possible to be caught out by the context window of the LLM, so you can either write small PRs or chunk the PR up[4] (see the sketch after the example below).

~ git --no-pager diff master | llm 'concisely summarise this diff'
The diff shows modifications to a blog post titled "LLMs on the command line."
The changes include:

1. **Title and Description**: The title is simplified to "LLMs on the command line,"
and the description is updated to reflect the use of the Python LLM library to
automate developer activities.
...
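
When a diff is too big for the context window, one crude way to chunk it is per file. A sketch, assuming a bash shell; the loop and prompt are my own, and ttok (mentioned in the footnotes) could cap each chunk further:

~ git diff --name-only master | while read -r f; do
    echo "== $f =="
    git --no-pager diff master -- "$f" | llm 'concisely summarise this diff'
  done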

I also use it to write the descriptions for my blog posts:

cat _posts/2023-04-23-personal-blog-deployment.md \
  | llm 'summarise this blog post factually in two sentences in the first person'
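
The -m flag picks the model, so a cheaper one can handle this sort of summarisation; the model name here is just an example:

~ cat _posts/2023-04-23-personal-blog-deployment.md \
  | llm -m gpt-4o-mini 'summarise this blog post factually in two sentences in the first person'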

Parsing web pages

Summarising content from web pages is useful for a range of purposes. In this example I’m fetching the forecast for London, grepping for lines that mention the term forecast and asking for a summary of the weather. This could be useful for screen readers or for aggregating with other forecasts.

~ curl https://www.bbc.co.uk/weather/2643743 | grep 'forecast' | llm 'what is the weather forecast?'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  633k  100  633k    0     0   540k      0  0:00:01  0:00:01 --:--:--  540k
The weather forecast for London indicates the following:

**Tonight (August 10, 2024):**
- **Weather:** A clear sky with light winds.
- **Temperature:** A low of 15°C (59°F).
- **Wind:** Light winds from the west at about 9 mph (15 km/h).
- **Humidity:** 70%.
- **Precipitation:** No precipitation expected.

**Tomorrow (August 11, 2024):**
- **Weather:** Sunny intervals with a gentle breeze.
- **High Temperature:** 27°C (81°F).
- **Low Temperature:** 17°C (63°F).
- **Wind:** Light easterly winds at about 11 mph (18 km/h).
- **Humidity:** 62%.
- **Precipitation:** No precipitation expected.

**Outlook for August 12-14, 2024:**
- **August 12:** A sunny day with temperatures reaching up to 33°C (91°F) in the afternoon.
- **August 13:** Similar conditions with temperatures around 27°C (80°F) and mostly sunny.
- **August 14:** A warm day with light cloud cover but remaining dry.

Overall, the coming days are expected to be warm with mostly clear skies and occasional sunny intervals.

This blog post gives a few examples of how LLMs, when combined with parsers, can summarise a web page more effectively.
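
A rough sketch of that idea using Simon Willison’s strip-tags and ttok tools to drop the HTML and cap the token count before prompting; both are separate installs, and the token limit here is arbitrary:

~ curl -s https://www.bbc.co.uk/weather/2643743 \
  | strip-tags \
  | ttok -t 4000 \
  | llm 'what is the weather forecast?'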

Here’s one for when I’d like a summary of what’s going on with the tube.

~ curl https://tfl.gov.uk/tube-dlr-overground/status/ | llm 'what is the service like on the tube?'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  214k    0  214k    0     0   219k      0 --:--:-- --:--:-- --:--:--  219k
The current service status on the London Tube, Overground, Elizabeth Line, DLR,
and Tram services includes several disruptions. Here are some key updates based
on the provided information:

1. **DLR (Docklands Light Railway)**: There is a part closure this weekend
(10th - 11th August) with no service between Poplar/Stratford International and
Beckton/Woolwich Arsenal. Replacement buses are operating on certain routes.

2. **Jubilee Line**: Minor delays are reported due to train cancellations.

3. **London Overground**: Significant disruptions with no service between Highbury
& Islington and Clapham Junction/Crystal Palace due to closures on the 10th and
11th August. Replacement buses are being provided.

4. **Piccadilly Line**: Severe delays between Acton Town and Uxbridge because of
train cancellations, although there is a good service on other parts of the line.

5. **Victoria Line**: Minor delays due to train cancellations.

6. **Waterloo & City Line**: Service operates only Monday to Friday. No service on
weekends and public holidays.

7. Generally, many lines report a "Good Service," but specific lines such as the
Bakerloo, Central, Circle, District, Hammersmith & City, Metropolitan, and Northern
have no current disruptions.

8. Individual stations may also have various closures or accessibility issues,
particularly affecting step-free access at several locations.

For the most accurate and real-time updates, always check the official Transport
for London (TfL) website or app before traveling.

Analysing running processes

This worked surprisingly well. GPT-4o can understand what the output of top looks like and then answer the question correctly.

$ top -b -n 1 | llm "what is the most intensive process?"
Based on the `top` output you provided, the most CPU-intensive process
at the moment is:

- **PID 1564 (python3)**: It is using **13.3% CPU**.
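
Trimming the output first helps when a local model has a small context window; a minimal sketch:

~ top -b -n 1 | head -n 20 | llm 'what is the most intensive process?'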

Answering and explaining cryptic crossword clues

Sometimes it can be handy to have an explanation of why clues work the way they do. Some models are better than others; anecdotally I’ve found that Microsoft Copilot is better than ChatGPT.

~ llm "Please explain why the cryptic crossword clue 'City celebrity interrupted by steel band at first (6)' is lisbon"
To solve the cryptic crossword clue "City celebrity interrupted by steel band at first (6)", we can break it down:

1. **City**: This suggests that the answer is the name of a city.

2. **celebrity**: In crossword clues, "celebrity" can often refer to a star or someone famous. A common abbreviation for "celebrity" is "C", as in "celeb."

3. **interrupted by**: This indicates that the word or letters are going to be placed inside another word.

4. **steel band at first**: Here, "steel band" could refer to a material "iron," which is often symbolized by the letter "I" (the chemical symbol for iron). "at first" usually means we take the first letter of the next word, which in this case is "band." The first letter of "band" is "B".

Now we can put these clues together:

- The word "celebrity" gives us the abbreviation "C."
- "steel" gives us "I" from iron.
- "band" gives us the first letter "B."

Putting "C" (celebrity) between "I" (for iron) and "B" (for band), we get:

I **C** B, but that doesn’t directly form "Lisbon."

However, the clue mentions "interrupted" indicating "C" is interrupted, but we seem to have missed how "Lisbon" properly connects here, as it doesn't obviously fit given the structure we derived.

Instead, "Lisbon" can work because it is a 6-letter city (the first part of the clue) that has a phonetic link to being "interrupted by" something related to a band, and while it isn’t as direct, cryptic clues do often require a bit of lateral thinking or suspension of strict logical structure for wordplay.

So, to summarize:

- The 6-letter city is "Lisbon."
- The "C" from "celebrity" being placed in "IS" (perhaps shorthand for a city) isn’t entirely clear without other context—but implies phonetics or rearrangement could allude to intelligent wordplay.

Ultimately, the answer to the clue is "Lisbon", and while the exact derivation needs a more creative leap, it fits both the necessary letter count and context of being a city, serving as an example of the layered thinking common in cryptic crosswords.
  1. Simon Willison wrote these tools and has a great blog. Check it out! 

  2. If you’re using local models they’re likely to have lower limits on the number of tokens they can process. ttok is a helpful tool for limiting the number of tokens that are piped to llm. 

  3. Although I’m sure those models will change in the coming weeks as the field advances! 

  4. As an aside, my new favourite git command is git add -p, which lets you interactively stage git hunks. This helps me double-check that I’ve not left embarrassing debug statements in.