I usually have a couple of cron jobs scheduled that execute backup scripts between EC2 instances and store the data on S3 via Python's Boto library. Everything ran smoothly for several months until one day the script started failing to authenticate on one of the EC2 instances, getting a 403 error back from AWS S3...
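As an illustration of the kind of backup job the post refers to, here is a minimal sketch using the classic boto (pre-boto3) S3 API. The bucket name, key and local archive path are made up for the example, and credentials are assumed to come from the environment or ~/.boto; the actual script is not shown in the post.

    import boto
    from boto.s3.key import Key

    # Connect using credentials from environment variables or ~/.boto
    conn = boto.connect_s3()
    bucket = conn.get_bucket('my-backups-bucket')  # hypothetical bucket name

    # Upload a local backup archive produced by the cron job
    key = Key(bucket)
    key.key = 'backups/ec2-backup.tar.gz'  # hypothetical object key
    key.set_contents_from_filename('/tmp/ec2-backup.tar.gz')  # hypothetical local path

With boto, an authentication failure like the 403 described above typically surfaces as an S3ResponseError raised from one of these calls.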
Understanding the apt-cache depends Output
I was using apt-cache in Ubuntu to get a list of dependencies for a certain package and parse the output programmatically; eventually I wanted to download the dependencies and package them within an archive for offline installs later on. I was not really sure about the exact meanings of the output ...
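For a sense of what such parsing might look like, here is a minimal Python sketch that shells out to apt-cache depends and keeps only the hard Depends: entries. The function name and the example package (curl) are purely illustrative, and the post itself may slice the output differently:

    import subprocess

    def direct_depends(package):
        """Return the non-virtual packages listed under 'Depends:' by apt-cache."""
        output = subprocess.check_output(
            ['apt-cache', 'depends', package], universal_newlines=True)
        deps = []
        for line in output.splitlines():
            line = line.strip().lstrip('|')  # a leading '|' marks an alternative dependency
            if line.startswith('Depends:'):
                name = line.split(':', 1)[1].strip()
                if not name.startswith('<'):  # <...> entries are virtual packages
                    deps.append(name)
        return deps

    print(direct_depends('curl'))

The same loop can be extended to PreDepends: or Recommends: lines, and the resulting names fed to apt-get download when building the offline archive.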