Tuesday, June 26, 2018

Next.js (contemporary of Socket.io?)

I just read an article about building a real-time chat app with Next.js.  I have enjoyed using Socket.io for some time now, and it's easy to overlook newcomers when you're happy with what you know, love and use.  Strictly speaking, Next.js is a framework for server-rendered React apps rather than a real-time library in its own right, but I'm curious what it brings to the table and it may be a point of study in the near future.

That being said, it's always fun learning something new.

https://codeburst.io/build-a-chat-app-with-sentiment-analysis-using-next-js-c43ebf3ea643

This leads me to wonder what other real-time libraries might compete with Socket.io.

Monday, June 25, 2018

Custom Properties (Singel pattern)

I'm always trying to find ways to make my code more maintainable.  How many times have you come back to your code 2 months down the road only to ask yourself, "What's going on here?"  Admittedly, I have a short-term memory, so it's definitely a good idea to use techniques that create self-documenting code, re-use common patterns and make it easy to hop in and add features without necessitating a 4-hour audit of the code first.

I recently read an article by Diego Haz suggesting that we favor creating more components over passing in custom properties to determine how a component gets rendered:

"Unless you're creating a well-documented open source library, resist that. Besides introducing the need of documentation, it's not scalable and will lead to unmaintainable code. Always try to create a new single element — such as AvatarRounded — which renders Avatar and modifies it, rather than adding a custom prop."

Diego admits that custom properties are not evil, but says they should only be used after carefully considering their need.

Wednesday, September 28, 2016

Unblock images previously blocked on Firefox

Have you ever accidentally clicked "Block Image" while using Firefox and you didn't mean to?  I'm not sure if this is a bug on Mac OS Firefox or a more broad issue, but the documented solutions don't seem to be working for me.
  1. Click on Page Info > Media tab and make sure that "Block Images" is unchecked. That didn't work for me.
  2. Use the Firefox SQLite extension to browse permissions.sqlite, try to find the blocked file and change its permissions; no luck there either. I even shut down Firefox, renamed permissions.sqlite, and started Firefox back up, which creates a fresh permissions.sqlite db. Still, the image was blocked.
Then I found a solution that does work.
  1. shut down Firefox
  2. navigate to your Firefox profile directory (e.g., '~/Library/Application Support/Firefox/Profiles/nge0hgfo.default')
  3. open './jetpack/image-blocker@erikvold.com/simple-storage/store.json' and remove the offending entries
  4. save file and close; start up Firefox again
The previously blocked images should be showing up again.
 https://gist.github.com/dcvezzani/b8457b3045cc648cd9d0e560a893446e

Wednesday, September 16, 2015

Forcing git diff to compare non-text files

While trying to figure out how to make Git treat my uncompressed PDF documents as text (and not binary), I found the following resources:

http://stackoverflow.com/questions/13486027/how-can-i-treat-a-file-not-a-binary-git
http://stackoverflow.com/questions/14771414/git-force-file-encoding-on-commit

It appears there is a way to customize diff.  I haven’t tried it myself yet, but plan to in the future.

https://git-scm.com/book/en/v2/Customizing-Git-Git-Attributes

Here’s a tool that can be used to `git diff` MS Word files.

http://docx2txt.sourceforge.net/
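Per the git attributes documentation, the setup should look something like this (an untested sketch on my part; the `pdf2txt` and `docx2txt` wrapper scripts are assumptions — git passes the file path to the textconv program and diffs whatever it prints to stdout):

```shell
# route PDFs and Word docs through text converters when diffing
cat > .gitattributes <<'EOF'
*.pdf  diff=pdf
*.docx diff=word
EOF

# register the converters for this repo; each textconv program receives
# a file path and must print a text rendition to stdout
git config diff.pdf.textconv  pdf2txt    # hypothetical wrapper around: pdftotext "$1" -
git config diff.word.textconv docx2txt   # wrapper around: docx2txt "$1" -
```

After that, a plain `git diff` on a tracked PDF should show text differences instead of "Binary files differ".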

Thursday, May 21, 2015

Change Mac Calendar all-day event reminder time to something other than 9am

Kudos for the many who have provided those undocumented or not-well-known features for the Mac Calendar.  There are many.

Scenario: I'm working productively when suddenly, every day, I get a dump of 15 event notifications that block the view of what I'm working on.  My colleagues work on EDT while I'm on PDT, so by 9am I'm already in the thick of my day, and I would rather the all-day events not interrupt me then.  Five Awesome Tips And Tricks To Master OS X Calendar [Feature] provides some guidance on how to change this, since the Calendar preferences are limited; they assume everyone wants to be notified at 9am.

As mentioned in the article, you can override the "trigger" time for all-day events by editing the relevant configuration files:

# find all files under ~/Library/Calendars/
# whose path includes LocalDefaultAlarms
# whose name does not include EventTimed
# open in favorite editor

grep -rl TRIGGER ~/Library/Calendars/ | grep LocalDefaultAlarms | grep -v EventTimed | xargs mvim -p
Simply edit the configuration files to specify the desired trigger time. Hours can be negative or positive and represent the offset from midnight of the day of the all-day event. For me, instead of being reminded at 9am the day before (the default PT15H, i.e., 15 hours before midnight), I would rather be reminded at 7am, which turns out to be 17 hours before midnight.

# if you vim

:%s/PT15H/PT17H/g
If you want to get fancy, you could come up with a bash script that uses sed so the files don't even need to be opened in a text editor.
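For instance, something like this should do it in one shot (my own untested sketch; same file discovery as the grep pipeline above, so back up ~/Library/Calendars first):

```shell
# swap the default all-day trigger (PT15H, 9am the day before)
# for PT17H (7am the day before) in every LocalDefaultAlarms file
grep -rl TRIGGER ~/Library/Calendars/ | grep LocalDefaultAlarms | grep -v EventTimed |
while read -r f; do
  sed 's/PT15H/PT17H/g' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```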

Friday, March 13, 2015

Launching a shell script as an OSX application

Go through this helpful tutorial to get started on understanding how Mac's Automator can make life easier. Look for the section labeled, "Using Variables from your Workflow in AppleScript".

To tidy things up, close the terminal window when we're done.

E.g., I have a script that opens up a file in a text editor and then runs a Ruby script using that same file as input.

[~/scripts/create-reminders.sh]
  
  #!/bin/bash

  home='/Users/dvezzani/Documents/journal/02-feb-2015'
  ruby ${home}/create-ics-reminders.rb > out-reminders.ics
  open -a Reminders out-reminders.ics
[~/Library/Services/create-reminders.workflow]

  on run {input, parameters}

    tell application "Terminal"
      activate
      do script "sh /Users/dvezzani/scripts/create-reminders.sh; exit"
    end tell

    return input
  end run

Then export the workflow as an Application and place in your Applications directory so that you can pull it up with Spotlight.


Monday, February 23, 2015

Searching for text in files whose filenames have spaces

Thank you, StackOverflow!

I have been dealing with spaces in filenames and it's been a thorn in my side.

Finally, some relief:


find idmexchange/ -name "*.xml" | while read file
do
  grep -l "XPRESS" "${file}"
done

Switching from word-based iteration to line-based iteration makes all the difference.
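A null-delimited variant should be equally safe (it even survives newlines in filenames), provided your find and xargs support it:

```shell
# -print0 / -0 use NUL as the filename delimiter instead of whitespace
find idmexchange/ -name "*.xml" -print0 | xargs -0 grep -l "XPRESS"
```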

Saturday, January 10, 2015

Upgrading PHP, PhpPgAdmin after Mac Yosemite upgrade

Prepare Homebrew

  
brew update
brew doctor

Resolve pending issues, upgrade to latest version of XCode, etc.

Install PostgreSql

It doesn't come any easier than this.

http://postgresapp.com/

Update PHP

First, update PHP using Homebrew. I had a typo the first time and the PostgreSql extension didn't get generated. Make sure to read the caveats at the end of the process!

  
brew tap --repair
brew tap homebrew/homebrew-php
brew install php56 --with-apache --with-mysql --with-postgresql --with-intl --without-snmp

Point php.ini to the right version.

  
sudo mv /etc/php.ini /etc/php.ini.pre-update
sudo ln -s /usr/local/etc/php/5.6/php.ini /etc/php.ini

Without any typos, the PostgreSql PHP extension should get generated. If it doesn't, these manual instructions are great. If you installed PostgreSql as mentioned earlier in this document, you will need to specify its installation directory when you call ./configure. It should look something like this:

  
./configure --with-pgsql=/Applications/Postgres.app/Contents/Versions/9.4

http://stackoverflow.com/questions/6588174/enabling-postgresql-support-in-php-on-mac-os-x

Once you know where the pgsql.so extension resides, make sure it is included in php.ini. E.g.,

  
; Mac Extensions
extension=/usr/local/Cellar/php56/5.6.4/lib/php/extensions/no-debug-non-zts-20131226/pgsql.so

Edit Apache's httpd.conf and make sure that the php and rewrite modules are enabled. Also, you will likely need to update the path to the correct php module. You should use the new libphp5.so that got generated when brew installed php.

  
LoadModule rewrite_module libexec/apache2/mod_rewrite.so
LoadModule php5_module /usr/local/opt/php56/libexec/apache2/libphp5.so

Test PHP

Before going too far, take a moment to verify that PHP is working well with Apache.

Keep an eye on Apache's error log file during this process. Run this in a separate terminal window.

  
tail -f /private/var/log/apache2/error_log

Create a test index.php file. Note: recent versions of PHP require "<?php" and not simply the short form, "<?".

  
echo "<?php phpinfo(); ?>" > /Library/WebServer/Documents/index.php
open http://localhost/index.php

Troubleshooting PHP

If you are having trouble simply installing php using Homebrew, this may be useful:

https://github.com/Homebrew/homebrew-php#common-upgrade-issues

The first bullet might be satisfied with

  
xcode-select --install

Thanks, joshlopes; https://github.com/Homebrew/homebrew-php/issues/997

Build problems with phar.php?

/bin/sh: line 1: 92714 Segmentation fault... 
make: *** [ext/phar/phar.php] Error 139

SNMP (the Simple Network Management Protocol, used for monitoring network devices) clearly gives some general grief when installing PHP. I resolved the problem by including --without-snmp in the call to brew install. I have updated this command in the previous instructions.

If you don't see an html page with a dump of PHP configuration information, it's time to have some fun debugging. What follows are some things that I needed to do in order to stop getting a white screen.

Go through this set of steps.

http://stackoverflow.com/questions/5121495/php-code-is-not-being-executed-i-can-see-it-on-source-code-of-page

If php still doesn't seem to be running with Apache, I also found this useful to get brew install php56 to succeed. Again, this seems to be fallout from installing PostgreSql the way I did.

  
sudo mv /usr/lib/libpq.5.dylib /usr/lib/libpq.5.dylib.pre.php56.install
  sudo ln -s /Applications/Postgres.app/Contents/Versions/9.4/lib/libpq.5.dylib /usr/lib/libpq.5.dylib
  sudo apachectl restart
  sudo apachectl configtest

https://github.com/Homebrew/homebrew-php/issues/1489

Once you are able to view the php information in the index.php page and you have included the pgsql.so extension in php.ini, installing phppgadmin should be easy.

Install phppgadmin

  
brew install phppgadmin

Follow the instructions printed when this command finishes.

Other

To set the timezone in php.ini, see the php manual for a list of supported values.

  
[Date]
  ; Defines the default timezone used by the date functions
  ; http://php.net/date.timezone
  date.timezone = "America/Los_Angeles"

The Apache log file may complain about ServerName missing from httpd.conf. Follow the instructions to stop it from complaining.

  
ServerName localhost:80

Add mime-type for php in the mime-types file instead of as a command in httpd.conf.

  
sudo mvim /etc/apache2/mime.types

  application/x-httpd-php            php

Update PATH to support pear and PostgreSql. Edit .bash_profile.

  
# add pear package manager (php)
  export PATH="/usr/local/pear/bin:$PATH"

  # add postgresql app
  export PATH="/Applications/Postgres.app/Contents/Versions/9.4/bin:$PATH"

Location of DocumentRoot

  
cd /Library/WebServer/Documents

Location of Apache

  
cd /etc/apache2

Edit Apache configuration file

  
sudo mvim /private/etc/apache2/httpd.conf

Edit the php configuration file

  
sudo mvim /etc/php.ini

View the Apache logs

  
cd /private/var/log/apache2
  tail -f /private/var/log/apache2/error_log

Thursday, January 8, 2015

Getting to know Atom

A new application can be as exciting as Christmas morning. I am a dedicated Vim user (specifically MacVim). I am grateful for those who have developed plugins to bring a Vim-like experience to the tools I use.

Eclipse has Vrapper and "Vrapper - Surround". I never quite made it onto the Sublime bandwagon. Nic Raboy recently filled me in on this great tool. It does (almost) everything that Sublime does, but faster. And because it was developed by the folks at GitHub, it provides some great integration with the well-known source control giant. The name? The Atom editor.

My addiction to Vim may backfire on me one day. Apparently, that day hasn't come yet; Atom has a couple of plugins that satisfy my Vim addiction.

  • open-vim
  • vim-mode

The interface is slick and tasty. And I love the support for creating your own plugins using JavaScript and/or CoffeeScript.

I'll be spending the next little bit getting familiar with the documentation. The IDE also has a set of very useful developer tools that help with debugging the package scripts that are included in plugins. That's one thing Vim doesn't have!

  • View > Developer > Toggle Developer Tools

Friday, December 19, 2014

Java: Https clients, headers and cookies

Sample HTTPS client

Processing the input stream is more reliable than using the #getContent() method.

http://alvinalexander.com/blog/post/java/simple-https-example

  package foo;
   
  import java.net.URL;
  import java.io.*;
  import javax.net.ssl.HttpsURLConnection;
   
  public class JavaHttpsExample
  {
    public static void main(String[] args)
    throws Exception
    {
      String httpsURL = "https://your.https.url.here/";
      URL myurl = new URL(httpsURL);
      HttpsURLConnection con = (HttpsURLConnection)myurl.openConnection();
      InputStream ins = con.getInputStream();
      InputStreamReader isr = new InputStreamReader(ins);
      BufferedReader in = new BufferedReader(isr);
   
      String inputLine;
   
      while ((inputLine = in.readLine()) != null)
      {
        System.out.println(inputLine);
      }
   
      in.close();
    }
  }

Url parameters (GET and POST)

Depending on the method for the request, there are a couple of different ways that url parameters can be configured. GET requests include the parameter names and values in the url itself. POST requests include them in the request body.

GET


  String http_url = "http://127.0.0.1:3771/greeting/test";

  String charset = "UTF-8";  // Or in Java 7 and later, use the constant: java.nio.charset.StandardCharsets.UTF_8.name()
  String param1 = "value1";
  String param2 = "value2";

  URL url;
  try {

    String query = String.format("param1=%s&param2=%s",
           URLEncoder.encode(param1, charset),
           URLEncoder.encode(param2, charset));

    url = new URL(http_url + "?" + query);
    HttpURLConnection con = (HttpURLConnection)url.openConnection();
    con.setRequestProperty("Accept-Charset", charset);
    ...

  } catch (Exception e) {
		e.printStackTrace();
  }

POST


  http_url = "http://127.0.0.1:3771/greeting/shake.json";

  String charset = "UTF-8";  // Or in Java 7 and later, use the constant: java.nio.charset.StandardCharsets.UTF_8.name()
  String param1 = "value1";
  String param2 = "value2";

  URL url;
  try {
    url = new URL(http_url);

    con = (HttpURLConnection)url.openConnection();
    con.setRequestMethod("POST");
    con.setRequestProperty("Content-Type", "application/x-www-form-urlencoded;charset=" + charset);
    con.setDoOutput(true);

    String query = String.format("param1=%s&param2=%s",
           URLEncoder.encode(param1, charset),
           URLEncoder.encode(param2, charset));

    try (OutputStream output = con.getOutputStream()) {
        output.write(query.getBytes(charset));
    }
    ...
    
  } catch (Exception e) {
		e.printStackTrace();
  }


The nature of cookies

Cookies are only set by the server; any modifications to the cookies on the client side are ignored (at least when using java.net's CookieManager class). Cookies are stored on the client side and the client may read them.

http://curl.haxx.se/rfc/cookie_spec.html

  HttpCookie cookie = new HttpCookie("blah", "bleh");

  url = new URL(http_url); 
  cookieJar.add(url.toURI(), cookie);
  HttpURLConnection con = (HttpURLConnection)url.openConnection();

Accessing Cookies, both retrieval and assignment.

HTTP response code: 411

When sending a POST request with the java.net package, a non-null body must be provided, even if it is an empty string; otherwise an exception will be thrown. The behavior of other libraries may differ.


  java.io.IOException: Server returned HTTP response code: 411 for URL: http://127.0.0.1:3771/greeting/shake.json

You will need to write to the output stream to overcome this problem. Doing so automatically creates the "Content-Length" request header.


  con.setRequestMethod("POST");

  con.setDoOutput(true);
  BufferedWriter bf = new BufferedWriter(new OutputStreamWriter(con.getOutputStream()));
  bf.write("");
  bf.flush();

Setting request headers and their respective values

http://stackoverflow.com/questions/6469540/setting-custom-http-request-headers-in-an-url-object-doesnt-work

  URL url = new URL("http://myipcam/snapshot.jpg");
  URLConnection uc = url.openConnection();
  uc.setRequestProperty("Authorization", 
    "Basic " + new String(Base64.encode("user:pass".getBytes())));

  // outputs "null"
  System.out.println(uc.getRequestProperty("Authorization"));
http://stackoverflow.com/questions/12731211/pass-cookies-from-httpurlconnection-java-net-cookiemanager-to-webview-android

  webCookieManager.setAcceptCookie(true);

Print out all header keys and their respective values.

http://www.mkyong.com/java/how-to-get-http-response-header-in-java/

	//get all headers
	Map<String, List<String>> map = conn.getHeaderFields();
	for (Map.Entry<String, List<String>> entry : map.entrySet()) {
		System.out.println("Key : " + entry.getKey() + 
                 " ,Value : " + entry.getValue());
	}
 
 
	//get header by 'key'
	String server = conn.getHeaderField("Server");

Ruby/Rails: access cookies

http://api.rubyonrails.org/classes/ActionDispatch/Cookies.html

  cookies[:user_name] = "david"
  cookies.signed[:user_id] = current_user.id

Ruby/Rails: access headers

http://stackoverflow.com/questions/19972313/accessing-custom-header-variables-in-ruby-on-rails

  request.headers['custom_header']
Custom header variables are always prepended with HTTP_ ... except for CGI variables. (Harsh Gupta)

Thursday, October 9, 2014

The Death of a Hard Drive


Ever had that moment when you walk into the office and see a white/gray screen with a folder and a question mark on your iMac? Or better, how about a blank white/gray screen? Oh, yes. Your hard drive isn't feeling too good.

At this point you grab your laptop (that is working) and you start doing searches to try and see how to rescue your computer. In my case, I had a pretty good hunch it wasn't the processor chip.

  1. the screen came on
If the screen doesn't light up, especially if the BIOS screen doesn't come up (for Windows), you might be looking at a bad CPU. Sure, there are several possibilities at this point, but the next logical step I would take is to check the health of the hard drive.

Check the health of the disk

  1. get an installation disk or thumb drive that you can boot off of; if using the installation disk, insert the disk while the computer is on
  2. turn off the computer by holding the power button until it powers off
  3. turn on the computer, wait for the initial chime and hold the Option key down until you are prompted to select which device to boot from (there are other key combinations as well); boot from the installation disk or appropriate usb device
Once you are booted external to the potentially fouled up hard drive, you want to use Disk Utility to check on the internal hard drive's health. With the installation disk, you may feel startled by the new installation screens. Don't worry, there are instructions on the second screen you can follow to load Disk Utility and not install the OS.
Once Disk Utility has loaded, if you can see that the internal hard drive did indeed mount (and is not grayed out), you can run verify to see if there are problems with the hard drive.
If there are, and they are not simply permission issues, you have some choices to make.

OK, it's broken; what do I do now?

  1. Use the Repair option in the Disk Utility tool (I do not recommend this; I'll explain why later)
  2. plug in an external USB drive, close Disk Utility and open Terminal, using command line calls to copy data from the damaged hard drive to the external USB drive
  3. use a data recovery tool (like Data Rescue) to first clone the dying hard drive and then pull data from the clone
I would strongly recommend option 3 as you really don't know how much longer the hard drive will last.

My observations (for what they are worth)

Here are some things I have discovered in my last two data recovery sessions (the hard drives on a 10-year-old MacBook and an iMac recently died).
  • the hard drive will need to be replaced; sure, you might be able to put a band-aid on the hard drive in question, but it is compromised and you will be left rescuing data from a dying hard drive again and soon.
  • repairing the drive involves changes to the hard drive that may end up leaving the stability of the system worse than it was before; if your computer is responding abnormally slow, this is another indication that your hard drive may be failing
  • performing a clean or defrag (even on a Windows VM hosted on a Mac) could have the same results as using Disk Utility's Repair
  • even with the data recovery tool, each time the disk was scanned, there were more reports of slow reads. This is why I strongly suggest doing a clone before doing anything else if you even suspect the drive might be dying. Then you use the data recovery to pull files from the cloned drive.
In my particular case, the installation disk method worked until I attempted to repair the disk. The results of doing so claimed that all was well, but when I defragged my Windows VM and rebooted the computer, I got the white screen again. Loading Disk Utility via the installation disk, this time the internal hard drive would not mount. And 5 more attempts did not bring forth any fruit.
At this point, I decided to take the hard drive out and use a USB hard drive connection tool meant for data rescue efforts. There seem to be plenty of YouTube videos that provide instruction on how to open up an iMac to remove the hard drive, so I checked them out. I ended up using a toilet plunger (it was clean) to suck the screen cover off since I didn't have the fancy tool. If you are opening up your iMac, you will also need the correct size star (Torx) driver (the small, precision kind).
Once the hard drive is out, you will need a USB SATA/IDE converter. Most of these converters will have a power source as well, which you will need. I plugged my hard drive into the USB converter, plugged in the USB cable and fired up my data recovery software (e.g., Data Rescue seems to be a popular one), and performed a quick scan. It didn't take me long to realize I needed to do a clone first (which requires a dedicated external drive) as the reported list of slow reads got longer and longer. After performing a clone of the dying hard drive (which took 5 hours during the night and 365 slow read detections), I was finally able to rescue my data.

Prevention is the best medicine

Of course, the best course of action is to back up your important data on a frequent basis, and there are many tools that can help. Hard drives will eventually fail -- it's just a fact. Rather than wait until a drive is failing, consider
  • using a tool like Carbon Copy Cloner and an external hard drive for each computer hard drive
  • using a wireless device (like Time Capsule) that all your computers can connect to for regular backups
  • using an online service like BackBlaze and your WiFi to back up your computer drives
I'm seriously considering using the online service option.
One popular option is BackBlaze, which costs about $40 per computer a year. When you factor in the cost of additional external hard drives, the purchase of data rescue software and the long hours agonizing over a dying hard drive, it might very well be worth it.

Thursday, September 25, 2014

using grouper to synchronize LDAP with Active Directory

My current assignment is to use Grouper to synchronize group memberships between an LDAP directory and Active Directory (AD).  Even though AD is very much like an LDAP directory, it is not one.  What's more, we don't have control over making changes to AD because we are using the Microsoft Cloud.

We want to be able to use a single tool to handle group management and make those groups available via LDAP and AD.  AD does not support dynamic groups to the degree that we need, so we plan on including DN values explicitly for each group.  Our LDAP does support dynamic groups, which we are currently using.

Some applications connect to LDAP while others must connect to AD. We need a solution that handles the following:

  1. AD groups are provisioned with explicit lists of DN values
  2. LDAP DN values differ slightly from AD DN values and will require a transformation from “uid=dvezzani,...” to “cn=dvezzani,...”
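The transformation in item 2 is mechanical; here's a sketch using sed and a made-up base DN (our real suffixes are elided above):

```shell
# rewrite the RDN attribute from uid to cn, leaving the rest of the DN alone
echo "uid=dvezzani,ou=people,dc=example,dc=org" | sed 's/^uid=/cn=/'
# prints: cn=dvezzani,ou=people,dc=example,dc=org
```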

In order to achieve this goal, we plan on primarily using the grouper-loader to pull in DN values from LDAP and psp to provision groups with the transformed DN values.

LDAP and AD subjects are being populated by separate means, but they both contain the same logical set of subjects. Is this the right approach to accomplish our goals?

Wednesday, August 20, 2014

JMeter on my Mac, Permission denied

Here's why JMeter didn't start up for me on my Mac: there are differences between the zip and tgz archive versions. In regards to the zip version...

That is intended for Windows systems, as the zip format does not allow one to set the executable bit for files.

http://jmeter.512774.n5.nabble.com/Jmeter-2-3-2-not-working-in-Linux-td535153.html - sebb-2-2

I downloaded the tgz file and JMeter works like a charm now.
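If you're stuck with the zip download, I suspect restoring the executable bits by hand would also work (an untested guess on my part; the tgz route is what I actually used):

```shell
# the zip archive drops the Unix executable bits; put them back on the launchers
chmod +x apache-jmeter-*/bin/jmeter apache-jmeter-*/bin/*.sh
```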

Solar Power - hype or help?

So what's all the hype with solar power? You may hear things like
  • you won't ever have to pay an electricity bill again
  • PG&E will cut you monthly checks for your excess power
  • you will be a hero because you'll be green

"Solar panels in Ogiinuur" by Chinneeb - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons.

I'm certainly not going to try and fill you in on everything that we've been learning. We just spent 4 hours with a wonderful solar power representative. Here are some things to watch out for.
  • small print indicating the company can simply walk away from the agreement at anytime, un-obligated, leaving you to manage the array of solar panels on your roof on your own
  • who will be monitoring the equipment, handling the maintenance of warrantied panels, replacing them at the end of 30 years, replacing the power inverter? You? Why not? You’ve got nothing better to do with your time than try to chase down licensed/bonded contractors that you trust to service your solar panels, right?
  • will the company still be around through the next 20 years? If not, will they leave you on your own to deal with the solar panels on your roof?
  • if you move, do you want to have the option of transferring your solar panels to your new house?
This list is far from exhaustive, but hopefully it gets you thinking about some important questions to ask if/when you meet with sales representatives. This is a big investment and, if you're like us, you don't have money to throw around. Make sure you do your homework and shop around.

There are some other flags that I find useful.
  • is the sales representative salary paid or strictly by commission? Knowing that the sales rep gets paid whether I commit or not takes some of the pressure off the meeting and allows more open and candid conversation
  • how many backers does the solar company have? Do you recognize the company names? What kind of capital investment does it have? A solar power company with $2 billion behind it stands more chance of being around when your 20-year contract is up.
One thing that caused my wife and me to cringe as we considered a contract with a solar power company was the bottom line of how much it would cost. But when you compare it to the best-scenario cost of our current electricity expenditures, we end up saving $20 thousand. Perspective is everything. Basically, the way the business model works is
  • you stop paying PG&E and start paying your monthly electricity to the solar power company.
  • the solar power company should review through your power usage over the past year and determine the average kilowatt hours you consume each month.
  • then a solar panel solution is determined that would provide the needed kilowatt hours
  • solar power is produced and a power inverter moves the power to PG&E's power grid and you continue powering your appliances from PG&E's power grid. It's not a perfect comparison, but think of the power grid as a giant battery that's being recharged, in part, by the solar panels on your roof.
  • if you use more kilowatt hours than your solar panels produce, you will simply pay PG&E for it at the end of the year.
  • if you use fewer kilowatt hours than your solar panels produce, that will adjust the bottom line of what you owe to PG&E (or the check that PG&E will write out to you!) at the end of the year.
There are many other details to consider, but you have the basic idea.

Don't get too excited about the prospect of simply putting extra panels on your roof and starting your own power-making business.
  • the exchange rate is abysmal; for every excess 15 kilowatt hours, PG&E may pay you for 3 kilowatt hours
  • there are probably some laws in place which discourage it
Still, just getting a check instead of a bill from PG&E should make anybody's day.
At the end of the day, most of us simply want to find ways to save a little money. So far, I am discovering that solar power would be a great way for our family to cut our current bill by more than half. I don't even need to pay anything initially for the solar panels, their installation or maintenance.

For those who have $150 thousand lying around, you will save even more! How does 20 years without an electricity bill sound? Yeah; I thought that too! Perhaps you can purchase solar panels for less. Make sure you understand what you get for your price. Does that purchase include service for the panels through the entire 20 years, even if the company goes out of business? Make sure you do your homework.

So what's all the hype with solar power? It's still relatively new, but the kinks are getting worked out and we can all start enjoying a more consistent and lower power bill thanks to solar technology for zero down. The devil is in the details and not all solar companies are the same. Meet with several companies and ask questions, like the ones in this post; compare and find which company works best for you and your situation.

We can all use a little more money in our budget. How about... now?

Thursday, June 5, 2014

'or'-ing terms with bash and grep

I needed to search a group of files for lines containing either of the supplied terms.

grep -e foo -e bar *.txt
With a more complete context:

  for file in aaa.txt bbb.txt
  do
    echo ""
    echo "$file"
    echo "======================="
    grep -e "^foo" -e "^bar" "$file"
  done
http://unix.stackexchange.com/questions/37313/how-do-i-grep-for-multiple-patterns

removing app from launchpad

Accidentally dragged an app onto Launchpad instead of the Applications directory.  Needed to remove it and found this:

http://osxdaily.com/2012/01/05/remove-apps-from-launchpad/

... but I had problems running the snippet as it was.  I ended up running something like this:


sqlite3 "/Users/john/Library/Application Support/Dock/1426ED5E-E2B5-43CF-9616-698ED687577D.db" "DELETE from apps WHERE title='SomeApp';" && killall Dock

Apparently, I had to provide the full path to the sqlite3 database. Listing the Dock's Application Support directory pulled up not one but two .db entries; since sqlite3 only takes a single database file, I had to pick the right one.
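
The DELETE-and-restart pattern itself is easy to sanity-check against a throwaway database first. The table layout below is a stand-in for the Dock's schema, built just for the demo:

```shell
# build a scratch database with a stand-in apps table
DB=$(mktemp /tmp/dock-demo-XXXXXX.db)
sqlite3 "$DB" "CREATE TABLE apps (title TEXT);"
sqlite3 "$DB" "INSERT INTO apps (title) VALUES ('SomeApp'), ('KeepMe');"

# the same DELETE shape used against the real Dock database
sqlite3 "$DB" "DELETE FROM apps WHERE title='SomeApp';"

# confirm only the unrelated row survives
REMAINING=$(sqlite3 "$DB" "SELECT title FROM apps;")
echo "$REMAINING"
```

Once you trust the statement, point it at the real database path and append the `&& killall Dock` so Launchpad rebuilds its view.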

Friday, May 2, 2014

quick and dirty SSL client using Ruby and Savon

I needed to test my Apache CXF web service using a particular SOAP request stored in a plain xml file. Since I am familiar with Ruby (and I'm always looking for excuses to use it), I used Savon as a tool to start from. I added the ability to provide a file whose contents are the desired SOAP request. I also needed to send my request via HTTPS, and Savon was already able to accommodate that (thanks to its own use of HTTPI).

To use my forked version of savon:


gem 'savon', :git => 'https://github.com/dcvezzani/savon.git'

Once a Savon client has been created and initialized with the necessary WSDL and SSL configuration, the call is simple:


call_options = {body_content: body_content}
response = client.call(:soap_operation, call_options)

See the sample client for details.

Need to extract private and public keys out of a Java Keystore (JKS)? See my other post.

extracting private, public and trusted keys from a Java Keystore (JKS)

Provide your own particular values.

  • STORE_PASS: the keystore password
  • STORE_LOCATION: the absolute path for your keystore location
  • TARGET_ALIAS: the alias name for the targeted key in the keystore
  • WSDL_URI: the URI (or URL) for the WSDL associated with the service that will be called
  • TRUSTED_ALIASES: list of alias values to be extracted out of JKS file and put into PEM formatted truststore file

  export STORE_PASS=
  export STORE_LOCATION=/Users/joe/...
  export TARGET_ALIAS=
  export WSDL_URI=https://...
  TRUSTED_ALIASES=("one" "two" "three")

Verify your environment variables


keytool -list -v -storepass ${STORE_PASS} -keystore ${STORE_LOCATION} | less

filename=$(basename "$STORE_LOCATION")
suffix="${filename##*.}"
prefix="${filename%.*}"
alias=${TARGET_ALIAS}

Export the public key


keytool -export -rfc -keystore ${STORE_LOCATION} -storepass ${STORE_PASS} -alias ${alias} -file ${prefix}.pub

Export the private key. This is necessary because some tools don't know how to use a Java keystore.


keytool -importkeystore -srckeystore ${STORE_LOCATION} -destkeystore ${prefix}.p12 -srcstoretype JKS -deststoretype PKCS12 -srcstorepass ${STORE_PASS} -deststorepass ${STORE_PASS} -srcalias ${alias} -destalias ${alias} -srckeypass ${STORE_PASS} -destkeypass ${STORE_PASS} -noprompt

openssl pkcs12 -in ${prefix}.p12 -out ${prefix}.key -passin pass:${STORE_PASS} -passout pass:${STORE_PASS}

openssl x509 -in ${prefix}.pub -text -noout
openssl rsa -in ${prefix}.key -check

Verify that private/public key pair were extracted correctly


export WGET_CONFIG="--secure-protocol TLSv1 --no-check-certificate --certificate ${prefix}.pub --certificate-type PEM --private-key ${prefix}.key --private-key-type PEM"

wget ${WGET_CONFIG} ${WSDL_URI}
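
A network-free way to confirm that the certificate and private key actually belong together is to compare their RSA modulus digests. The snippet demonstrates against a throwaway self-signed pair; in practice substitute your extracted files in place of demo.pub and demo.key:

```shell
# generate a throwaway key and matching self-signed cert for the demo
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" -days 1 \
  -keyout demo.key -out demo.pub 2>/dev/null

# a certificate and its private key share the same RSA modulus
CERT_MOD=$(openssl x509 -noout -modulus -in demo.pub | openssl md5)
KEY_MOD=$(openssl rsa -noout -modulus -in demo.key | openssl md5)

if [ "$CERT_MOD" = "$KEY_MOD" ]; then echo "key pair matches"; fi
```

If the digests differ, the wrong alias was exported somewhere along the way, and no amount of TLS debugging will fix it.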

Remove password on the private key (dangerous!). Make sure the file permissions, at least, are locked down.


openssl pkcs12 -in ${prefix}.p12 -out ${prefix}.pem

openssl rsa -in ${prefix}.key -out ${prefix}-no-password.key
chmod 400 ${prefix}-no-password.key

Create a truststore file in PEM format.


: > ${prefix}-truststore.pem
for trusted_alias in "${TRUSTED_ALIASES[@]}"
do
  keytool -export -alias ${trusted_alias} -file trust-${trusted_alias}.crt -keystore ${STORE_LOCATION} -storepass ${STORE_PASS}
  openssl x509 -inform der -in trust-${trusted_alias}.crt -out trust-${trusted_alias}.pem
  cat trust-${trusted_alias}.pem >> ${prefix}-truststore.pem
done

Friday, January 17, 2014

Does the formatting of a tnsnames.ora file really make a difference?

I recently set up Oracle on my Mac so I could develop an application that uses Oracle for the database tier. I use the option of running Oracle server on a VM and configuring it to be available to my Mac using port-forwarding.

I pulled together some notes as I knew I would be going through the procedure again. Going through that procedure the second time, I encountered the following common error:

TNS:could not resolve the connect identifier specified

I checked to make sure I had set up port forwarding correctly on VirtualBox

I verified that I could telnet to the VM on port 1521 (from my Mac)

telnet 127.0.0.1 1521

I verified that I could successfully log in through sqlplus from the VM

sqlplus my_admin@orcl

I verified that the listener was up (from the VM)

lsnrctl status

I verified that my Mac had a tnsnames.ora file and that it was located at $ORACLE_HOME/network/admin/tnsnames.ora and that read permissions were open.

ORCL =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 127.0.0.1)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = orcl)
    )
  )

I verified my environment was configured to properly use the sqlplus client on my Mac.

export NLS_LANG="AMERICAN_AMERICA.UTF8"
export ORACLE_HOME=/opt/oracle/instantclient
export DYLD_LIBRARY_PATH=$ORACLE_HOME

Everything seemed to be correct, but I was still getting errors. As I compared the tnsnames.ora file from the Mac that was working correctly with the one from the Mac that wasn't, I noticed that the failing file had the entire configuration block indented. I had indented the code in my notes, and the indent had survived the copy-paste. It couldn't really be that simple, could it?!

It could.

As soon as I removed the indentation so that the service declaration started flush with the left edge, without any leading spaces, I could successfully connect.
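
If you suspect the same copy-paste problem, a scratch file makes the fix easy to see. The file path, alias, and sed pattern below are just for illustration; the key point is that the alias line must start in column 1 while continuation lines may stay indented:

```shell
# reproduce the failure mode: the whole entry pasted with an indent
cat > /tmp/tnsnames_demo.ora <<'EOF'
  ORCL =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 127.0.0.1)(PORT = 1521))
      (CONNECT_DATA = (SERVICE_NAME = orcl))
    )
EOF

# strip leading whitespace from the alias line only
FIXED=$(sed 's/^[[:space:]]*\(ORCL[[:space:]]*=\)/\1/' /tmp/tnsnames_demo.ora)
echo "$FIXED" | head -n 1
```

Redirect the sed output back to a new tnsnames.ora (never in place over the original) and retest with sqlplus.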

Wow.