Nono.MA

DECEMBER 21, 2021

Say you have a long list of numbers and you want to add them all together, or maybe multiply them or apply some other series of operations to all of them. You can use a reducer. Here's a sample of how to define a reducer as a simple add-two function to sum an entire list of numbers.

const numbers = [20, 30, 50];
const reducer = (previousValue, currentValue) => {
  return previousValue + currentValue;
}
const result = numbers.reduce(reducer); // 100

You could easily parse a plain-text list as well.

const text = '1.75 Milk\n0.70 Bread\n2.56 Yogurt';
const numbers = text.split('\n').map(s => parseFloat(s.split(' ')[0]));
const reducer = (previousValue, currentValue) => {
  return previousValue + currentValue;
}
const result = numbers.reduce(reducer); // 5.01
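As a side note, Python's standard library offers the same pattern via functools.reduce; here's a rough equivalent of the list-parsing example above (the grocery items are the same made-up data).

```python
from functools import reduce

# Parse the same plain-text list and sum the leading numbers.
text = '1.75 Milk\n0.70 Bread\n2.56 Yogurt'
numbers = [float(line.split(' ')[0]) for line in text.split('\n')]
total = reduce(lambda acc, x: acc + x, numbers)
print(round(total, 2))  # 5.01
```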

DECEMBER 20, 2021

I came across this error when trying to pass a JSON object from Laravel to a Vue component.

[Vue warn]: Invalid prop: type check failed for prop "myProperty". Expected Array, got String with value...

You may have passed a stringified JSON object which Vue is not parsing as an Array.

In my case, the error was that I wasn't binding the property; I was passing a raw String that Vue was meant to evaluate as a JavaScript expression.

I was defining properties in Laravel PHP.

$collection = Post::all();
$numbers = [1,2,3];

Then passing them with the Blade templating syntax to my Vue component.

<issue-cover
  v-bind:collection="{!! $collection->toJson() !!}"
  my-numbers="{!! $numbers !!}"
></issue-cover>

As you can see, my-numbers is not using the v-bind:property-name syntax or its shorthand, :property-name. Binding the property was all I had to do to solve this issue on my end.

<issue-cover
    v-bind:collection="{!! $collection->toJson() !!}"
    v-bind:my-numbers="{!! $numbers !!}"
></issue-cover>

DECEMBER 2, 2021

SageMaker can be quite confusing. Here are some notes I took while learning how the model and output parameters work.

  • model_dir is provided as an Estimator function parameter.
  • output_dir and output_data_dir are provided as Estimator hyperparameters.

(See how to provide these arguments in code below.)

After a successful run, whatever is saved to each of these directories will be uploaded to a specific S3 location within your job's folder and bucket.

  • model.tar.gz will contain files saved to /opt/ml/model
  • output.tar.gz will contain files saved to /opt/ml/output and (inside of the data subfolder) the files saved to /opt/ml/output/data

Here's the sample directory tree with a train.py entry point that saves a text file to each of these locations.

# Files saved to /opt/ml/model/
model.tar.gz
    model.txt

# Files saved to /opt/ml/output/
output.tar.gz
    output.txt
    success
    # Files saved to /opt/ml/output/data/
    data/
        output_data.txt

# Files in the Estimator's source_dir
source/
    sourcedir.tar.gz
        # All files in your source_dir

Here's how you'd override these locations in your Estimator.

# Create a TensorFlow Estimator
estimator = sagemaker.tensorflow.estimator.TensorFlow(
    ...
    model_dir='/opt/ml/model',
    hyperparameters={
        'output_data_dir': '/opt/ml/output/data/',
        'output_dir': '/opt/ml/output/',
    },
    ...
)

And here's how you'd read their values inside your entry point, e.g., train.py. Note that even if you don't pass these three variables to your Estimator and its hyperparameters, you can capture them in your entry-point script by defaulting to the SageMaker environment variables (SM_MODEL_DIR, SM_OUTPUT_DIR, and SM_OUTPUT_DATA_DIR), which point to /opt/ml/model, /opt/ml/output, and /opt/ml/output/data by default.

import argparse
import os

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument("--model_dir", type=str,
                        default=os.environ.get('SM_MODEL_DIR'),
                        help="Directory to save model files.")
    parser.add_argument("--output_dir", type=str,
                        default=os.environ.get('SM_OUTPUT_DIR'),
                        help="Directory to save output artifacts.")
    parser.add_argument("--output_data_dir", type=str,
                        default=os.environ.get('SM_OUTPUT_DATA_DIR'),
                        help="Directory to save output data artifacts.")

    opt = parser.parse_args()
    print(f'model_dir › {opt.model_dir}')
    print(f'output_dir › {opt.output_dir}')
    print(f'output_data_dir › {opt.output_data_dir}')

Testing this functionality, I saved a text file to each of these locations to see what the SageMaker SDK was uploading to S3. (The resulting directory structure can be seen above.)

# Save a text file to model_dir
with open(os.path.join(opt.model_dir, 'model.txt'), 'w') as f:
    f.write('Contents of model.txt!')

# Save a text file to output_dir
with open(os.path.join(opt.output_dir, 'output.txt'), 'w') as f:
    f.write('Contents of output.txt!')

# Save a text file to output_data_dir
with open(os.path.join(opt.output_data_dir, 'output_data.txt'), 'w') as f:
    f.write('Contents of output_data.txt!')

What's the difference between output_dir and output_data_dir?

SageMaker provides two different folders: the parent output folder and the output data subfolder. According to official AWS GitHub samples, output_dir is the directory where training success or failure indications are written (an empty file named either success or failure), while output_data_dir is reserved for non-model artifacts, such as diagrams, TensorBoard logs, or any other artifacts you want to generate during the training process.


I hope the read was helpful!

NOVEMBER 18, 2021

I found this approach to install unzip on SageMaker Studio with yum in the Terminal.

sudo yum install -y unzip

Now you can use unzip --version to verify unzip is installed, and unzip file.zip to extract the contents of a compressed file.


An alternative approach which didn't work for me

Run the following command (note the bang!) inside of a Jupyter notebook or Python console in SageMaker Studio to install unzip.

!conda install -y -c conda-forge unzip

After running that command, I still couldn't run unzip in the Terminal.
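If neither approach works, a workaround is Python's standard-library zipfile module, which can extract archives without installing any system package. Here's a self-contained sketch (the archive and file names are placeholders created on the fly for the demo).

```python
import os
import tempfile
import zipfile

workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, 'file.zip')

# Build a small archive so the example is self-contained.
with zipfile.ZipFile(archive, 'w') as zf:
    zf.writestr('hello.txt', 'Hello, zip!')

# Extract it: the stdlib equivalent of `unzip file.zip -d extracted`.
with zipfile.ZipFile(archive) as zf:
    zf.extractall(os.path.join(workdir, 'extracted'))

print(os.listdir(os.path.join(workdir, 'extracted')))  # ['hello.txt']
```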

NOVEMBER 5, 2021

The Visual Studio Code (VSCode) development team has been working hard to ship vscode.dev, a version of Visual Studio Code that runs entirely in your browser.

OCTOBER 18, 2021

I found the following error while trying to execute PORT=4444 node ./bin/server.js on my Node.js application.

SyntaxError: Cannot use import statement outside a module

I solved it by adding the following into the package.json file of my NPM project.

  "type": "module",

SEPTEMBER 29, 2021

Nesting <a> HTML elements is forbidden. Here's an example:

<a href="https://nono.ma">
  Go to my website, or to my
  <a href="https://nono.ma/about">
    about page
  </a>.
</a>

A link to the about page is nested inside of a link to the root of this site.

SEPTEMBER 27, 2021

From MongoDB's JSON Schema Examples Tutorial page:

JSON Schema is an IETF standard providing a format for what JSON data is required for a given application and how to interact with it. Applying such standards for a JSON document lets you enforce consistency and data validity across similar JSON data.

The purpose of a JSON Schema is to define the allowed property names and data types to facilitate the validation of a given JSON object. This reminds me of TypeScript definitions, where type-checking happens by default after you've instantiated a given interface or class.
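To make the idea concrete, here's a tiny hand-rolled sketch in Python of the kind of check a JSON Schema validator performs. Note this is an illustration, not a real JSON Schema implementation, and the schema and property names are made up.

```python
# Allowed property names mapped to their expected Python types.
schema = {'name': str, 'price': float}

def validate(obj, schema):
    """Return None if obj conforms to the schema, else an error message."""
    for key, expected_type in schema.items():
        if key not in obj:
            return f"missing required property '{key}'"
        if not isinstance(obj[key], expected_type):
            return f"'{key}' should be {expected_type.__name__}"
    return None

print(validate({'name': 'Milk', 'price': 1.75}, schema))  # None (valid)
print(validate({'name': 'Bread'}, schema))  # missing required property 'price'
```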

SEPTEMBER 21, 2021

If you're trying to remove a directory using the os.rmdir function, but it contains other files, you'll probably hit the following error.

OSError: [Errno 66] Directory not empty:

You can ignore this error by using the shutil library instead of os.

import shutil
shutil.rmtree(path)

Note that Python won't prompt you to confirm this deletion action and this may lead to deleting files by mistake.
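Here's a self-contained sketch contrasting the two calls on a throwaway directory; the directory and file names are arbitrary.

```python
import os
import shutil
import tempfile

# Create a throwaway directory containing one file.
path = tempfile.mkdtemp()
with open(os.path.join(path, 'file.txt'), 'w') as f:
    f.write('data')

# os.rmdir refuses to delete a non-empty directory.
try:
    os.rmdir(path)
except OSError as error:
    print(error)  # Directory not empty

# shutil.rmtree deletes the directory and everything inside it.
shutil.rmtree(path)
print(os.path.exists(path))  # False
```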

SEPTEMBER 16, 2021

I was trying to stitch several videos together with FFmpeg using the following command.

ffmpeg -f concat -i list.txt -c:v copy concat.mp4

I came across this error.

[concat @ 0x7fca4281b800] Unsafe file name '2021-09-16 08.13.mov'
list.txt: Operation not permitted

The issue, which I've been able to fix manually other times, is that there's an unsafe character in one or more of the input video names: the space.

As it turns out, we only need to pass the -safe 0 flag in our command to turn off this safety measure and have FFmpeg skip the check.

ffmpeg -f concat -safe 0 -i list.txt -c:v copy concat.mp4

Hope that helps!

JULY 23, 2021

This page is incomplete.

I'd love to be able to export asciinema recordings as GIF animations and MP4 videos. The creators don't see a point in doing this, as converting text-based recordings into image-based animations goes against asciinema's raison d'être, but I would find it super useful to be able to include small snippets of recordings in Keynote or PowerPoint presentation slides.

JULY 15, 2021

Here are a few helper functions to list Lambda functions and layers (and to count them) using the AWS Command Line Interface (AWS CLI) to inspect the serverless resources of your Amazon Web Services (AWS) account.

Listing Lambda Layers of a Function

aws lambda get-function --function-name {name|arn} | \
jq .Configuration.Layers
[
  {
    "Arn": "arn:aws:lambda:us-west-2:00000000:layer:layer-name:1",
    "CodeSize": 1231231
  }
]

Counting Lambda Layers of a Function

aws lambda get-function --function-name {name|arn} | \
jq '.Configuration.Layers | length'
# Returns 1 (or number of layers attached to function)

Counting Lambda Layers in an AWS account

aws lambda list-layers | \
jq '.Layers | length'
# Returns 4 (or number of layers in your account)

Listing All Layers in an AWS account

aws lambda list-layers
{
    "Layers": [
        {
            "LayerName": "layer-name",
            "LayerArn": "arn:aws:lambda:us-west-2:0123456789:layer:layer-name",
            "LatestMatchingVersion": {
                "LayerVersionArn": "arn:aws:lambda:us-west-2:0123456789:layer:layer-name:1",
                "Version": 1,
                "Description": "Layer Description",
                "CreatedDate": "2021-07-14T14:00:27.370+0000",
                "CompatibleRuntimes": [
                    "python3.7"
                ],
                "LicenseInfo": "MIT"
            }
        },
        {
            "LayerName": "another-layer-name",
            "LayerArn": "arn:aws:lambda:us-west-2:0123456789:layer:another-layer-name",
            "LatestMatchingVersion": {
                "LayerVersionArn": "arn:aws:lambda:us-west-2:0123456789:layer:another-layer-name:4",
                "Version": 4,
                "Description": "Layer Description",
                "CreatedDate": "2021-07-14T11:41:45.520+0000",
                "CompatibleRuntimes": [
                    "python3.6"
                ],
                "LicenseInfo": "MIT"
            }
        }
    ]
}

Listing Lambda Functions in an AWS account

aws lambda list-functions
{
    "Functions": [
        {
            "FunctionName": "function-name",
            "FunctionArn": "arn:aws:lambda:us-west-2:0123456789:function:function-name",
            "Runtime": "python3.7",
            "Role": "arn:aws:iam::0123456789:role/role-name",
            "Handler": "lambda_function.lambda_handler",
            "CodeSize": 1234,
            "Description": "Function description.",
            "Timeout": 30,
            "MemorySize": 128,
            "LastModified": "2021-07-14T16:48:19.052+0000",
            "CodeSha256": "28ua8s0aw0820492r=",
            "Version": "$LATEST",
            "Environment": {
                "Variables": {
                }
            },
            "TracingConfig": {
                "Mode": "PassThrough"
            },
            "RevisionId": "1b0be4c3-4eb6-4254-9061-050702646940",
            "Layers": [
                {
                    "Arn": "arn:aws:lambda:us-west-2:0123456789:layer:layer-name:1",
                    "CodeSize": 1563937
                }
            ],
            "PackageType": "Zip"
        }
    ]
}

JULY 8, 2021

This page is incomplete

An option is to set CACHE_DRIVER=array in your .env file.

I need to learn why this happens and what other alternatives there are to avoid this error.

This happened to me when installing laravel-geoip: https://github.com/Torann/laravel-geoip/issues/123

It makes sense that you wouldn't want to change your cache driver from file or database to array simply to use a function in this package.

Other users recommended publishing torann/geoip's config file and disabling caching and tagging.

# Publish the configuration file
php artisan vendor:publish --provider="Torann\GeoIP\GeoIPServiceProvider" --tag=config
# Copied File [/vendor/torann/geoip/config/geoip.php] To [/config/geoip.php]
# Publishing complete.
Then, in config/geoip.php, disable caching and tagging.

    // ...
    'cache' => 'none', // defaults to 'all'
    // ...
    'cache_tags' => [], // defaults to ['torann-geoip-location']
    // ...

Another option is to conditionally set the tag.

    // ...
    'cache_tags' => env('CACHE_DRIVER') == "array" ? ['torann-geoip-location'] : null,
    // ...

JUNE 30, 2021

I get this error after running docker run --rm -it IMAGE_TAG.

The problem was that the image I was using wasn't really an image to execute in Docker but a set of steps to build a Python wheel package (.whl).

The solution was to build the image, specifying an --output directory to which the resulting wheel file could be copied.

DOCKER_BUILDKIT=1 docker build --output folder_to_save_wheel .

JUNE 19, 2021

To keep ImageMagick from interpolating pixels when you want a sharp resize (equivalent to PIL's Image.NEAREST resampling filter), you can use mogrify and set the -filter to point.

# Assuming we're upscaling an image smaller than 2000x2000 pixels.
# Note the -filter setting comes before the -resize operator.
mogrify -filter point -resize 2000x2000 image.png
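For reference, the Pillow equivalent is nearest-neighbor resampling via Image.NEAREST; here's a minimal sketch using a blank generated image as a stand-in for a real file.

```python
from PIL import Image

img = Image.new('RGB', (10, 10))  # a tiny stand-in for a real image
upscaled = img.resize((2000, 2000), Image.NEAREST)  # no interpolation
print(upscaled.size)  # (2000, 2000)
```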

MAY 17, 2021

dyld: Library not loaded: /usr/local/opt/openldap/lib/libldap-2.4.2.dylib
dyld: Library not loaded: /opt/homebrew/opt/icu4c/lib/libicuio.68.dylib
  Referenced from: /opt/homebrew/bin/php
  Reason: image not found
zsh: abort      composer

Install (or update) the Xcode developer tools.

xcode-select --install

Reinstall icu4c.

brew reinstall icu4c

Make sure no errors prevent Homebrew from installing icu4c properly. For instance, I had to remove a few php folders and re-run the brew reinstall icu4c command.

sudo rm -rf /opt/homebrew/Cellar/php@7.4/7.4.15
sudo rm -rf /opt/homebrew/Cellar/php/8.0.2

MAY 13, 2021

Here's a way to map a given color in a Pillow image (PIL.Image) to another color. This is not the fastest method and it will only replace exact matches.

In this example, we're replacing all black pixels (0,0,0) in the input image with blue (0,0,255).

from PIL import Image
import numpy as np

img = Image.open('/path/to/image.png').convert('RGB')

# PIL images can't be indexed directly; convert to a NumPy array first.
data = np.array(img)
data[np.where((data == [0, 0, 0]).all(axis=2))] = [0, 0, 255]

img = Image.fromarray(data)
img.show()

MAY 13, 2021

from PIL import Image

img = Image.open('/path/to/image')

# The crop box takes coordinates, not margins:
# (left, upper, right, lower), with right > left and bottom > top.
left = 10
top = 20
right = 110
bottom = 220

img = img.crop((left, top, right, bottom))

MAY 13, 2021

import json

with open('my-file.json') as file:
    obj = json.load(file)
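Going the other way, json.dumps serializes an object to a JSON string (and json.dump writes it straight to a file). A quick round-trip sketch with made-up data:

```python
import json

obj = {'name': 'Milk', 'price': 1.75}  # made-up sample data
text = json.dumps(obj, indent=2)       # serialize to a JSON string
print(json.loads(text) == obj)         # True: the object round-trips cleanly
```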

MAY 8, 2021

After CVAT is running and you have access to its login screen, you need to have an admin account to log in and access the admin panel.

You create an admin user from the command-line interface.

docker exec -it cvat bash -ic 'python3 ~/manage.py createsuperuser'

This command will ask you for a username, email, and password.

After you log in, you can hover over your username (in the top-right corner of the screen) and select "Admin page" to access the Django admin panel, where you can manage your CVAT site, users, groups, and more.

MAY 1, 2021

When you use two-factor authentication to sign in to your Gmail account (or to "Sign in with Google") you access your account with your email, password, and a verification code generated by Google Authenticator or other authenticator apps (such as Duo).

You might get an error like the one that follows when trying to sign in to Gmail with your Google password.

Authentication failed. Please check your username/password and Less Secure Apps access for mail@example.com.
Server returned error: "534-5.7.9 Application-specific password required. Learn more at 534 5.7.9 https://support.google.com/mail/?p=InvalidSecondFactor l25sm248619lfe.188 - gsmtp , code: 534"

When the service you're trying to use your Gmail account with doesn't allow you to "Sign in with Google," you need to create an app-specific password, as detailed in the support URL provided by the error message.

Create a Google App Password

Here's how to create an app password.

  • Go to your Google account
  • Security
  • Sign in to Google
  • App passwords
  • Choose the service type — e.g., Mail, Calendar, Contacts, YouTube, or Other (custom)
  • Choose the device type — e.g., iPhone, iPad, Mac, Windows, etc.
  • Generate

You'll get an app-specific password like this one — dbkdwckcplvgaktc — that will let you log in to the authorized service with your email and this password.

In my case, I use this password to be able to "Send as" from Gmail from an email address that has two-factor authentication turned on.

APRIL 28, 2021

import torch
print(torch.__version__)

APRIL 20, 2021

cd /path/to/repo.git
sudo chgrp -R {groupname} .
sudo chmod -R g+rwX .
find . -type d -exec chmod g+s '{}' +

Source

LAST UPDATED MAY 20, 2021

If you're receiving the following error when running composer install.

Your GitHub OAuth token for github.com contains invalid characters

Updating Composer

2021.05.20 · Update

The solution is to update Composer to the latest version, which supports the new token format, as suggested by Jordi Boggiano on this tweet. "Composer 1.10.21 and 2.0.12 (both released April 1st) added support for the new GitHub token format."

As of this writing, the following command will install the latest version of Composer on your machine (i.e., 2.0.13). Note that future Composer updates will break the script as shown here, as the hash check won't pass.

php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php -r "if (hash_file('sha384', 'composer-setup.php') === '756890a4488ce9024fc62c56153228907f1545c228516cbf63f885e036d37e9a59d27d63f46af1d4d07ee0f76181c7d3') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
php composer-setup.php
php -r "unlink('composer-setup.php');"

On macOS, you can use Homebrew to install (or reinstall) composer.

brew install composer
brew reinstall composer

The brute-force fix

2021.04.20

As I mentioned above, both Lukas Kahwe Smith and Jordi Boggiano discouraged tinkering with Composer's auth.json file manually and recommended upgrading Composer to its latest version instead.

Still, here's the brute-force fix that worked for me. Apparently, editing auth.json is the only way to update to the latest Composer programmatically, and you can revert it to its original state if you opt for this option. The alternative, of course, is to upgrade as shown above.

Edit the composer authentication configuration file ~/.composer/auth.json.

nano ~/.composer/auth.json

Then replace the following.

  "github-oauth": {
    "github.com": "ghp_[YOUR-PERSONAL-TOKEN]"
  }

With this (basic auth):

  "http-basic": {
    "github.com": {
      "username": "[YOUR-GITHUB-USERNAME]",
      "password": "ghp_[YOUR-PERSONAL-TOKEN]"
    }
  }

Source

Thanks

To Lukas Kahwe Smith and Jordi Boggiano for pointing this out on Twitter.

APRIL 20, 2021

I found this error while trying to update and install composer packages with composer install.

could not find driver (SQL: select * from information_schema.tables where table_schema = folio_burns and table_name = folio_items and table_type = 'BASE TABLE')

At first, I thought the solution was to edit /etc/php/7.4/cli/php.ini (for PHP-FPM 7.4 in my case) and uncomment the line ;extension=pdo_mysql so it reads extension=pdo_mysql. But I was still getting the error, as the mysql extension itself was missing.

PHP Warning:  PHP Startup: Unable to load dynamic library 'pdo_mysql' (tried: /usr/lib/php/20190902/pdo_mysql (/usr/lib/php/20190902/pdo_mysql: cannot open shared object file: No such file or directory), /usr/lib/php/20190902/pdo_mysql.so (/usr/lib/php/20190902/pdo_mysql.so: cannot open shared object file: No such file or directory)) in Unknown on line 0

The solution ended up being to install the extension, which also adds its own .ini file and activates itself on installation.

sudo apt-get install -y php7.4-mysql

Note that you can run this command with multiple extensions to be installed at once.

sudo apt-get install -y php7.4-{xml,bcmath,gd,mbstring,xsl,zip,curl,mysql}

APRIL 15, 2021

After downloading a website as HTML with cURL or any other workflow, you can convert the HTML code to the Markdown syntax with pandoc.

pandoc -o output.md input.html

APRIL 15, 2021

You can download any website as an HTML file (without the site's assets) with cURL on the command line, passing the -L flag to follow any existing redirects.

curl -L https://nono.ma --output nono-ma.html

The manual alternative is to right-click on a website in your browser of choice (say, Google Chrome or Firefox), select Save As…, and save the site as HTML with some of its assets in a subfolder.

Afterwards, you can convert the downloaded HTML page into a Markdown document with pandoc.

APRIL 14, 2021

Here's how to get the raw String value of a Stringable object. Laravel's Illuminate\Support\Stringable has lots of helper functions, but sometimes you want to get the raw string value. For that, you just need to use the strval PHP built-in function on an object of the Stringable class.

// Define Stringable object
$stringable = Str::of('laravel-stringable-to-string');
get_class($stringable); // returns Illuminate\Support\Stringable
gettype($stringable); // returns object

// Get its raw String value
$string = strval($stringable);
get_class($string); // returns PHP Warning:  get_class() expects parameter 1 to be object, string given in […]
gettype($string); // returns string
