Here are some loose thoughts on what I've been tinkering with over the past year or so.
Lately, I've been working on Folio—the content-management system this site runs on—to add new features, fix bugs here and there, and make it easier to deploy across multiple websites, including mine and client sites. The system keeps getting better, and I repeatedly ask myself whether I'm reinventing the wheel in many areas. I make use of great third-party packages developed with care and thoughtfulness by other developers. It's incredible when they work but awful when other developers stop updating them and your code breaks. I now pay more and more attention to those GitHub stars (★) and pick carefully which packages to adopt. Software rots.
I've learned a lot about managing my own Linux machines—both from scratch and from existing images—to have a new site ready within minutes (at least, when I don't hit an unknown problem that steals a few hours from my day). I mainly deploy apps with Nginx and Laravel, but I've also learned to run and deploy Node.js apps (with PM2), serve Docker images, and run Python and Go programs.
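Keeping a Node.js app alive with PM2, for instance, boils down to a few commands. Here's a rough sketch (the script and app names are placeholders):

```bash
pm2 start server.js --name my-app   # run the app under PM2's supervision
pm2 save                            # remember the process list across reboots
pm2 startup                         # print the command that registers PM2 as a boot service
```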
I'm trying to thoroughly document all the troubleshooting I go through so I don't have to dig through the internet to fix a bug I've already fixed before. While it's obvious how to fix a bug you encountered yesterday, some bugs don't show up again for a long time, and you can save hours of work by keeping good notes.
A recent practice I've started playing with is creating automation files. I'm slowly getting acquainted with "Makefiles": text files that define named commands, each a list of shell calls that run when you type `make command-name` in your terminal. These commands run not only on Linux machines but also in the macOS terminal, so I can run most of my automation scripts both on the desktop and on Linux servers. Here's a sample Makefile to set up a Digital Ocean droplet.
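Or, rather, a simplified sketch of one: the host and package list are placeholders, and note that recipe lines must be indented with tabs.

```makefile
# Provision and deploy a droplet over SSH (illustrative only).
HOST = root@your-droplet-ip

.PHONY: provision deploy

provision: # install the basics on a fresh Ubuntu droplet
	ssh $(HOST) "apt-get update && apt-get install -y nginx git"

deploy: # pull the latest code on the server
	ssh $(HOST) "cd /var/www/site && git pull"
```

With that in place, `make provision` readies the server and `make deploy` updates the site.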
I build Folio mainly for myself. There are many systems like it, but this one is entirely built by me, and it helps me learn new language features as well as modern patterns and techniques by browsing other people's code when I use their packages. Many will hate Folio—simply because it runs on PHP—but I believe Laravel is making this programming language great again. (Trust me, I did PHP-Nuke sites back in 2003 and this is light years ahead.) Laravel feels like an updated version of Ruby on Rails.
I'm migrating most of my sites to Digital Ocean. Their droplet system (without hidden costs) is great. I've yet to decide where to host Getting Simple's podcast audio files. A powerful package by the Spatie team makes backing up websites a breeze: I can schedule automatic database and file backups at desired time intervals and even upload them to multiple disks (such as Digital Ocean Spaces, Dropbox, or Amazon S3).
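With Laravel's scheduler, the wiring looks roughly like this (a sketch: the times are arbitrary, and `backup:run` and `backup:clean` are the artisan commands that ship with spatie/laravel-backup):

```php
// app/Console/Kernel.php (excerpt)
protected function schedule(Schedule $schedule)
{
    $schedule->command('backup:clean')->dailyAt('01:00'); // prune old backups
    $schedule->command('backup:run')->dailyAt('02:00');   // dump the database and zip the files
}
```

The disks the archives get copied to are listed in the package's config file, so adding a destination is mostly a config change.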
I've recently started using Imgix to distribute my images and remove that load from my website servers. Their image-processing REST API is flexible and removes many headaches and hours lost manually editing images in Photoshop or other applications, whether to apply simple effects, sharpening, resizing, or even watermarks or padding. And their CDN makes distribution faster.
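Every transformation is a URL parameter. A hypothetical example (the domain and file name are placeholders; `w`, `sharp`, and `auto` are imgix's width, sharpening, and automatic-format parameters):

```
https://example.imgix.net/photo.jpg?w=1200&sharp=15&auto=format
```

Changing a number in the URL re-renders the image, and the result gets cached on their CDN.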
I rely less and less on TypeKit for web fonts, as I either serve the font files myself or use Google Fonts. There are also beautiful typefaces from independent type foundries that I might use soon. (Milieu Grotesque comes to mind.)
A big highlight (that sadly only runs on macOS) is Laravel Valet. After spending months learning how to configure Nginx server blocks and crashing the system multiple times, I found this simple tool that handles everything for you. There's a bit of reading to do to fully understand what it does, but I'd summarize its benefits in two commands: `valet link` and `valet unlink`. The first serves a PHP app at `http://app-name.test` from the directory in which you run it, and the second stops serving it. You can quickly register a folder to serve an app (say, with `valet link nono`) and go to its URL to test the site locally (at `http://nono.test`). Behind the scenes, Valet uses `dnsmasq`, `php`, and `nginx`. Not having to deal with them on a daily basis makes things easy (even though I like to learn what's behind these systems and how to do things the manual way in case there's a need for it).
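In practice, it looks like this (assuming Valet is installed and the app lives at `~/Sites/nono`):

```bash
cd ~/Sites/nono     # the folder that holds the app (path is an example)
valet link nono     # serve it at http://nono.test
valet unlink nono   # stop serving it
```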
Another thing I'm loving is Cron jobs. They can be set up either on Linux (as I do on Digital Ocean) or on macOS with `crontab -e`. You also have to learn a bit about how Cron works, but the short story is that it lets you schedule tasks: command executions at whatever time interval (or time of day) you want. For instance, `* * * * * curl https://google.com` would ping Google every minute, and you can go from every minute to certain times of the day or fractions of the hour. Laravel builds on top of this by letting you schedule commands with a clean, high-level API (for instance, `->everyMinute()` or `->dailyAt('17')`). All you do is set a Cron job to execute the Laravel scheduler every minute, and Laravel decides which commands to run when.
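That single entry is the one the Laravel documentation suggests (the project path is a placeholder):

```bash
# crontab -e: hand control to Laravel's scheduler once a minute
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
```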
Last but not least, I'd highlight the importance of logging. Most environments have ways to log executions and errors, and this is the easiest way to find out what's breaking your system. I've added a route to Folio's administration to visualize logs from anywhere, and, at a lower level, Nginx lets you log access and errors as well.
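On an Ubuntu droplet, for instance, watching Nginx's default logs is often all it takes to spot what's breaking:

```bash
# Follow Nginx's access and error logs in real time (default Ubuntu paths)
tail -f /var/log/nginx/access.log /var/log/nginx/error.log
```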
I'm constantly learning, and Folio and my website are my playground.
As I started saying, these are loose thoughts on much of the tech I've been exploring over the past year. I'm also learning about Docker, TensorFlow, Runway, and much more, and I keep frequent notes on Dropbox Paper. Some of my experiments are in the open on GitHub, and I've recently started sharing what I'm learning on this YouTube playlist.
What are you tinkering with?
It is a safe bet that the highest-earning professions in the year 2050 will depend on automations and machines that have not been invented yet. —Kevin Kelly, The Inevitable
Last month, Jose Luis García del Castillo y López (@garciadelcast) and I (@nonoesp) had the opportunity to lead the Mind Ex Machina cluster at SmartGeometry¹ 2018. (Watch on YouTube.)
This talk summarizes the projects that came out of our workshop, which set out to explore the possibilities of robot-human-AI interactions using machine-learning libraries and the Machina² robotic control framework.
The SmartGeometry workshops and conferences were held May 7–12, 2018, at the John H. Daniels Faculty of Architecture, Landscape, and Design at the University of Toronto, Canada. The Mind Ex Machina cluster worked most of the time at the Autodesk Toronto Technology Office, located in the MaRS Discovery District.
I'm extremely thankful to Marc Webb for the following video, which provides a bit more insight into the things we worked on. (Watch on Vimeo.)
Shout-out to the impressive work of other clusters such as Fresh Eyes and Data Mining the City. See all of the videos here.
I think the whole group had a blast working on these projects—thanks! You can find notes and source code for the projects on GitHub (especially in this repository).
🧠x🤖
1. SmartGeometry is a bi-annual workshop and conference, this year entitled sg2018: Machine Minds, at the University of Toronto, Canada, from May 7–12, 2018. The sg2018 workshop and conference is a gathering of the global community of innovators and pioneers in the fields of architecture, design, and engineering. ↩
2. Machina is an open-source project for action-based, real-time control of mechanical actuators or, in more human terms, it allows you to talk to a robot and tell it what to do. Machina is developed and maintained by Jose Luis García del Castillo y López. ↩
As we continue adopting new digital services and embracing new technologies, our [digital] lives gain more and more complexity. Everything we use gathers data, stores the content produced with it, and, in many cases, learns from the way we behave.
Information, content, reports, and other kinds of media form a large body of data that is often difficult to manage.
How are we expected to handle this situation? How are we supposed to properly manage systems that hadn't even been invented when we were educated? And, even worse, how are people expected to teach us about things they have never used? I firmly believe this will affect our generation, too. We feel like we have the situation under control; just give it a few years, until new services, new technologies, and new things are released.
But there are surely ways of working that may ease the pain. We can implement defined workflows in our use of digital systems. The same way we use workflows to tackle everyday problems outside the digital realm, we can use them in the digital world.
The process you follow when you bring new things into your room (arranging your belongings and your clothes, throwing out old stuff you no longer need) is, for instance, a workflow implemented in your life.
As easily as digital services let us store and manage our data, they can become a mess of unmanageable things.
Digital tidiness should be a subject at school. Everyone needs to be able to correctly manage digital clutter, just as everyone needs to know how to arrange their clothes, keep physical documents in place, or find their belongings when needed.
Every once in a while, I find myself arranging clothes or belongings and giving away things I no longer want—archiving and deleting, basically. At times, there is just too much stuff to handle, or there is no time to invest in the task.
The same problem exists with electronic devices and digital storage systems. When we produce more content than we process, we leave behind a lot of work that will have to be done in the future. For that reason, apart from organizing all our data, we need to be conscious of how much we can handle. The amount of data produced should be limited according to how much can be processed and stored.
One concrete example is taking digital pictures. Thanks to flash-memory-powered DSLR cameras, the giga-sized internal capacities of our portable devices, and on-the-go cloud storage services, pictures and videos can be taken daily without limits. The problem is that these instant shots will later take an enormous amount of digital space and an immeasurable amount of time to process and organize.
Workflows can help reduce the processing and organization time, but awareness of how much we produce is equally or even more important than post-processing. Do you really need to take that picture? Maybe not.
Evgeny Morozov on The Guardian:
In April [2014], Apple patented technology that deploys sensors inside the smartphone to analyze if the car is moving and if the person using the phone is driving; if both conditions are met, it simply blocks the phone's texting feature.
Lisa Rapaport wrote on Reuters that filming teens while they drive and using apps to block cell phone signals have been shown to reduce "high-risk driving events by 80 percent, compared to teens who didn't have cameras or cell phone signal blockers," though senior research scientist David Kidd says there isn't a lot of research yet that proves the effectiveness of cell phone blocking.
Technology has a role to play, but there is no single solution to the problem of distracted driving. — Ellie Pearson from Brake.
It is unclear to what degree these measures contribute to fewer car crashes, but it seems obvious that cell phone use while driving is contributing to more and more accidents.
Another interesting fact: even though a patent Apple filed in 2008 (and published in 2014) could prevent drivers from texting while driving and only let them use their phones via voice control [Siri], the iPhone's operating system imposes restrictions on developers, who, unlike Android developers, haven't yet been able to implement any restrictions for drivers.
After giving some space to Evan Williams and Dick Costolo, Jack Dorsey has returned to Twitter as CEO. He is, as The New York Times put it, the man who sent the first tweet:
As the man who sent the first tweet in 2006 and a product visionary who led the company in its early years, Mr. Dorsey helped make the micromessaging service into a global platform that now has more than 300 million active users, from celebrities like President Obama and Katy Perry whose tweets are followed by millions to ordinary people with just a few dozen followers.
[…]
Mr. Dorsey was the driving force for many of Twitter’s product innovations during that time, like the ability to embed tweets on other sites, but was also a polarizing figure, firing product managers and fostering an atmosphere of secrecy and paranoia, according to current and former associates.
Once again, Apple releases a beautiful video, this time featuring all the possible uses of the iPad. In their own words:
iPad can change the way you do things every day. Take on a new project, pick up a new skill, or start a new hobby. We put together some of our favorite apps and ideas to help you get started.
(via)
The US Federal Aviation Administration (FAA) has recently approved a plan that gives Amazon a green light to test their Amazon Prime Air delivery service — developed to ship packages [under 2.25 kilograms] using drones.
After numerous complaints and warnings about the plans presented by Amazon back in 2013, the FAA established a series of rules and regulations for Amazon to test the viability of their intended services:
Under the provisions of the certificate, all flight operations must be conducted at 400 feet [120 meters] or below during daylight hours in visual meteorological conditions. The UAS must always remain within visual line-of-sight of the pilot and observer. The pilot actually flying the aircraft must have at least a private pilot’s certificate and current medical certification.
The video at the top shows an example of how the delivery service would work — a service that, as Amazon argues, would be extremely useful for shipping to nearby areas and delivering to hard-to-reach locations.
For those who have doubts about this new technology, Amazon provides a Q&A section on their Amazon Prime Air marketing site:
Q: Is this science fiction or is this real?
A: It looks like science fiction, but it's real. One day, seeing Prime Air vehicles will be as normal as seeing mail trucks on the road.
Q: When will I be able to choose Prime Air as a delivery option?
A: We will deploy when and where we have the regulatory support needed to realize our vision. We’re excited about this technology and one day using it to deliver packages to customers around the world in 30 minutes or less.
Q: How are you going to ensure safety?
A: Safety is our top priority, and our vehicles will be built with multiple redundancies.
Q: What will the Prime Air delivery vehicles look like?
A: It is too soon to tell. We are testing many different vehicle components, designs and configurations.
Q: Where are you building and testing?
A: We have Prime Air development centers in the United States, the United Kingdom and Israel, and we are testing in multiple international locations.
Sources: NYTimes | PopularScience