
JULY 21, 2023

Looking for deals on Amazon can be a trap.

Amazon prices are competitive. They often match offers from other vendors automatically. In Spain, for instance, they match MediaMarkt's discounts to the cent.

The issue is that offers and discounts are not always real discounts. Amazon may list a 100-euro product at 60 euros (a 40% discount) when, in reality, everyone is selling the product at that price now, and 100 euros was the price when the product was released. It's not a discount; it's the product's current price.

What I want is to remember the different Amazon prices I saw when visiting a product page or adding the product to the shopping cart.

The idea is to create a Google Chrome extension that tracks the price of an Amazon product page when I visit it, creating a log of actual prices. That way, when I return, I can tell whether that fifty-percent-off offer is an actual sale.

LAST UPDATED JUNE 3, 2020

2020.06.03

I've found that if creating or starting a notebook instance takes longer than 5 minutes, the notebook will fail, and re-creating the conda environment every time you start an existing notebook makes the wait really long. Another solution, which I now prefer, is to use the persistent-conda-ebs scripts (on-create.sh and on-start.sh) provided by Amazon SageMaker as examples. In short, the on-create script downloads Miniconda and creates an environment with whatever Python version you choose. You can customize that environment (say, by installing Python packages with pip or conda inside of it), and it persists across sessions, so future starts only run the on-start script and have your notebook running in 1–2 minutes. Hope that helps! That's the way I'm using lifecycle configurations now.
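For reference, here's a minimal sketch of what that pair of scripts does, assuming Miniconda gets installed on the persistent EBS volume mounted at /home/ec2-user/SageMaker. The directory, environment name, and packages below are placeholders; the official persistent-conda-ebs samples are more thorough.

# --- on-create.sh (runs once, when the notebook instance is created) ---
#!/bin/bash

set -e

sudo -u ec2-user -i <<'EOF'

# Install Miniconda on the persistent EBS volume so it survives stop/start cycles
WORKING_DIR=/home/ec2-user/SageMaker/custom-miniconda
mkdir -p "$WORKING_DIR"
wget -q https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O "$WORKING_DIR/miniconda.sh"
bash "$WORKING_DIR/miniconda.sh" -b -u -p "$WORKING_DIR/miniconda"
rm -f "$WORKING_DIR/miniconda.sh"

# Create a custom environment on the persistent volume and install your packages
source "$WORKING_DIR/miniconda/bin/activate"
conda create --yes --name custom_py36 python=3.6 ipykernel
conda activate custom_py36
pip install --quiet tensorflow-datasets

EOF

# --- on-start.sh (runs on every start and finishes in a minute or two) ---
#!/bin/bash

set -e

sudo -u ec2-user -i <<'EOF'

# Re-register every environment on the persistent volume as a Jupyter kernel
WORKING_DIR=/home/ec2-user/SageMaker/custom-miniconda
source "$WORKING_DIR/miniconda/bin/activate"
for env in "$WORKING_DIR/miniconda/envs"/*; do
    env_name=$(basename "$env")
    conda activate "$env_name"
    python -m ipykernel install --user --name "$env_name" --display-name "Custom ($env_name)"
    conda deactivate
done

EOF

Because the heavy lifting (downloading Miniconda, solving the environment) happens only once in on-create, the on-start script stays well under the 5-minute limit.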


2020.03.24

Here's something I learned about Amazon SageMaker today at work.

You can create notebook instances with different instance types (say, ml.t2.medium or ml.p3.2xlarge) and use a set of kernels that has been set up for you. These are conda (Anaconda) environments exposed as Jupyter notebook kernels that execute the commands you write in the Python notebook.

What I learned today is that you can create your own conda environments and expose them as kernels, so you're not limited to the kernels offered by AWS.

This is the sample environment I set up today. These commands should be run in a terminal window on a SageMaker notebook instance, but they will most likely run in any environment with conda installed.

# Create new conda environment named env_tf210_p36
$ conda create --name env_tf210_p36 python=3.6 tensorflow-gpu=2.1.0 ipykernel tensorflow-datasets matplotlib pillow keras

# Enable conda on bash
$ echo ". /home/ec2-user/anaconda3/etc/profile.d/conda.sh" >> ~/.bashrc

# Enter bash (if you're not already running in bash)
$ bash

# Activate your freshly created environment
$ conda activate env_tf210_p36

# Install GitHub dependencies
$ pip install git+https://github.com/tensorflow/examples.git

# Now your environment is set up. Party!
# ..

# When you're ready to leave
$ conda deactivate

How do we expose our new conda environment as a SageMaker kernel?

# Activate the conda environment (as it has ipykernel installed)
$ conda activate env_tf210_p36

# Expose your conda environment with ipykernel
$ python -m ipykernel install --user --name env_tf210_p36 --display-name "My Env (tf_2.1.0 py_3.6)"

After reloading your notebook instance you should see your custom environment appear in the launcher and in the notebook kernel selector.
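If you want to double-check without reloading, you can list the registered kernels from the same terminal; the output should include the one you just installed (the path below assumes the default --user install location):

$ jupyter kernelspec list

# Expect a line similar to:
#   env_tf210_p36    /home/ec2-user/.local/share/jupyter/kernels/env_tf210_p36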

What if you don't want to repeat this process over and over and over?

You can create a lifecycle configuration on SageMaker that will run this initial environment-creation setup every time you create a new notebook instance. (Create a new lifecycle configuration and paste the following code into the Create notebook tab.)


#!/bin/bash

set -e

# OVERVIEW
# This script creates and configures the env_tf210_p36 environment.

sudo -u ec2-user -i <<EOF

echo ". /home/ec2-user/anaconda3/etc/profile.d/conda.sh" >> ~/.bashrc

# Create custom conda environment
conda create --name env_tf210_p36 python=3.6 tensorflow-gpu=2.1.0 ipykernel tensorflow-datasets matplotlib pillow keras -y

# Activate our freshly created environment
source /home/ec2-user/anaconda3/bin/activate env_tf210_p36

# Install git-repository dependencies
pip install -q git+https://github.com/tensorflow/examples.git

# Expose environment as kernel
python -m ipykernel install --user --name env_tf210_p36 --display-name My_Env_tf_2.1.0_py_3.6

# Deactivate environment
source /home/ec2-user/anaconda3/bin/deactivate

EOF

That way you won't have to set up each new notebook instance you create. You'll just have to pick the lifecycle configuration you just created. Take a look at the Amazon SageMaker notebook instance lifecycle configuration samples.
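If you prefer the AWS CLI over the console, something along these lines should work. The configuration name, file name, notebook name, and role ARN are placeholders, and the lifecycle script is passed as base64-encoded content:

# Create the lifecycle configuration from a local on-create script
$ aws sagemaker create-notebook-instance-lifecycle-config \
    --notebook-instance-lifecycle-config-name create-env-tf210-p36 \
    --on-create Content="$(base64 -w0 on-create.sh)"

# Launch a notebook instance that uses it
$ aws sagemaker create-notebook-instance \
    --notebook-instance-name my-notebook \
    --instance-type ml.t2.medium \
    --role-arn arn:aws:iam::123456789012:role/MySageMakerRole \
    --lifecycle-config-name create-env-tf210-p36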

MARCH 21, 2015


The US Federal Aviation Administration (FAA) has recently approved a plan that gives Amazon a green light to test its Amazon Prime Air delivery service, developed to ship packages [under 2.25 kilograms] using drones.

After numerous complaints and warnings about the plans Amazon presented back in 2013, the FAA established a series of rules and regulations for Amazon to test the viability of its intended service:

Under the provisions of the certificate, all flight operations must be conducted at 400 feet [120 meters] or below during daylight hours in visual meteorological conditions. The UAS must always remain within visual line-of-sight of the pilot and observer. The pilot actually flying the aircraft must have at least a private pilot’s certificate and current medical certification.

The video at the top shows an example of how the delivery service would work, a service that, as Amazon argues, would be extremely useful for shipments to nearby areas and deliveries to hard-to-reach locations.

For those who have doubts about this new technology, Amazon provided a Q&A section on its Amazon Prime Air marketing site:

Q: Is this science fiction or is this real?

A: It looks like science fiction, but it's real. One day, seeing Prime Air vehicles will be as normal as seeing mail trucks on the road.

Q: When will I be able to choose Prime Air as a delivery option?

A: We will deploy when and where we have the regulatory support needed to realize our vision. We’re excited about this technology and one day using it to deliver packages to customers around the world in 30 minutes or less.

Q: How are you going to ensure safety?

A: Safety is our top priority, and our vehicles will be built with multiple redundancies.

Q: What will the Prime Air delivery vehicles look like?

A: It is too soon to tell. We are testing many different vehicle components, designs and configurations.

Q: Where are you building and testing?

A: We have Prime Air development centers in the United States, the United Kingdom and Israel, and we are testing in multiple international locations.

Sources: NYTimes, PopularScience

