We made it back from Yosemite late last night after a tough backpacking trip. I'll post more information along with pictures and video later in the week. I'm glad to be back home. :-)
Forrest is back from Indonesia and this coming weekend we are getting a big group (18 or so) together to do a backpacking trip in Yosemite. We are going to do a 3 day, 33 mile Grand Canyon of the Tuolumne trip. You can read about it in detail here.
Not too long ago I got an Apple TV. I thought I already had an HDMI cable at home so I didn't order one with the Apple TV itself. When it arrived we realized we didn't have one, so we went to the Apple Retail store at the Oakridge Mall (not far from where we live) to get one. It is a mini Apple Store so they didn't have the HDMI to HDMI cable displayed anywhere, but when I asked, an employee said they had it and got one from the back. When he returned we did the transaction on a handheld device, and instead of printing out the receipt they emailed it to me. Sure enough it was waiting in my inbox when we got home.

After we got home Timber did the setup and then we started playing around with the Apple TV. My first impression was that setup was easy: all we had to do was get it on our wireless network and then it started syncing with iTunes on my MacBook. We played around with streaming video from the MacBook to the Apple TV and were happy with the results. There were no slowdowns and the video quality was pretty good on our TV (a 50 inch 720p HDTV). We also played around with displaying the photos Timber took this weekend and checked out the music. Overall it is a nice device, and anything I have in iTunes can now be shown on our TV. Here's a list of things I like and things I don't like.
Things I like:
Setup -> very easy to set up and start playing around with.
Streaming -> I have a 54 Mbps wireless network and streaming a movie from my MacBook to the Apple TV worked flawlessly.
Video Quality -> I tried movies (more on this later), Google Videos (tech talks), TV shows, and video podcasts, and they all looked at least as good as non-high-def TV, which really surprised me. I thought the Google Videos would look like crap but they came out sharp. They weren't widescreen, but they were easily good enough.
Aperture Availability -> When I read the specs on the Apple TV it said you could show your photos from iPhoto, but it didn't say anything about Aperture. It turns out it can show your Aperture photos just fine, which I definitely like.
Things I don't like:
Syncing -> The syncing was OK except that it seems to go to sleep or time out after a while. I tried twice to have it sync everything overnight, and when I checked in the morning it had only synced the first 100 songs or so. I may have to change some settings, but I'm pretty sure I'm doing everything right.
No Caching -> I downloaded a movie trailer from iTunes directly to the Apple TV and then accidentally hit the Menu button before I could play it. When I went back to watch it, I had to download the entire trailer again. It should cache that stuff.
Overall I'm pretty happy with the Apple TV thus far. In the future I would like to see it do a better job of syncing, pre-load some of the top trailers and songs from iTunes, allow the user to delete podcasts and other data directly from the Apple TV, and possibly add a web browser or RSS reader. I think progress will be made over time to make this an even better device. On a side note, my strategy for movies is to rent them from Netflix instead of buying them from iTunes: rip them with HandBrake, import them into iTunes, watch them on the Apple TV, and then delete them when I'm done. That system works fairly well thus far. The manual effort is pretty low, the video quality is very good, and I don't have to buy any movies.
Thanks to Adrian for this link. If you have time check out the article. It is a good rant against all the things that get in the way of shipping simple software that works. What fluff can you cut out of your software development processes?
Yeah, I'll be working remotely for a few days, but my robot will be in the office. That would be cool. These robots which are starting to be used at hospitals are a way for specialists to extend their reach. So just think about this next time you work from home...
You know you're getting old when a band that you liked when you were in high school announces that they are getting back together, with only half the band, and is going to do some concerts. In this case the band is Smashing Pumpkins and I doubt they will ever get as big as they were back in the day. That's alright, time marches on after all. At any rate it doesn't look like they will be playing at Indian casinos any time soon. When they start doing that they are past the washed up stage and have been put in the dryer.
Hot on the heels of last week's JavaOne conference is this week's IBM Academy of Technology Agile @ IBM conference. A long time ago in a galaxy far far away I submitted an abstract for a poster at this conference and it got accepted. I don't know whether or not it is IBM confidential so I won't post the slides here. But the poster is titled "Agile by any Other Name" and is about the 1 week iterations we created for WebFountain. The abstract basically says that back when we did it we did not know that our iterative development cycle was indeed agile (by today's definition), and it lists lessons we learned along the way. So for all the WebFountain alumni out there, thanks for being agile before agile was cool. The conference goes from today through Thursday, and the cool thing about it is that all the slides are available online ahead of time and the sessions I want to see are video streamed live and recorded. So I was at Almaden for the morning half but I'm back in my office for the afternoon half, which lets me get at least some work done while the conference is going on. That got me wondering whether there have been any conferences held in Second Life yet. I bet once you can do more video, slide shows, and audio it would be a great medium for that. But one of the best things about conferences is the total immersion in the subject, and to get that in Second Life you'd have to leave it up to the attendees. I could definitely see it happening in the near future if it hasn't happened already.
On my last day at the conference I missed the morning general session where James Gosling talked about all the cool toys out there that were written in Java. I am definitely going to download the video and watch it this weekend. You can find it here.
Comparing the Developer Experience of Java EE 5.0, Ruby on Rails, and Grails: Lessons Learned from Developing One Application::
This was a pretty good talk in that they compared the same application written in Ruby on Rails, Grails, and Java EE 5. They highlighted the frameworks to help developers decide where to invest their time. They spent a little bit of time going over the background of Ruby (this was a Java conference after all). They covered:
1. jRuby
2. Tools
3. Rails
4. What Sun is doing with Ruby/jRuby
5. How the popularity has grown over time.
Under rails they talked about:
1. Controllers
2. Models
3. Views
4. Active Records
5. WEBrick
Under grails they talked about:
1. Based on Groovy (a dynamic programming language for the JVM that integrates with Java).
2. Grails integrates with Java EE 5.
Why do people like using these things? Because they are simple and fast. For their demo they built the Rails app using NetBeans 6 with the Ruby extensions. They also used Faban to do their performance testing.
Finally, when they showed us the performance numbers we saw this:
Rails -> Max of 2k transactions per second
Grails -> Max of 3250 transactions per second
JEE5 -> Max of 11k transactions per second
They said this was with no tuning and that doing performance benchmarking is more artistic than scientific.
Other observations while going through their project were:
1. Native Ruby outperforms jRuby.
2. The NetBeans IDE was the best for developing their apps (they were able to use it to deploy all three projects).
3. They had to do some tweaking to get the NetBeans IDE project to run on native Ruby.
As far as development went: with Ruby it was easy to get something built and finished quickly, but don't touch the database once the app is created, because it is very difficult to change. They had to recreate the DB every time they changed the app. For Grails there was a short learning curve, useful templates, simple examples to follow, and integration with Java EE 5. The not so cool parts of Grails were that it pulls in technology from everywhere, it is light on documentation, and there is very little tool support. Java EE 5 was cool because of familiarity, JSF CRUD generation, scalability, transaction support, tools, and standards. The not so cool part about Java EE 5 was that it took a lot of work and scaffolding to build a simple project. Here are some of the links they referenced at the end of their presentation:
1. Damian Cooke's blog
2. Ruby on Rails
3. Opensource Community
4. Benchmarking tool
Beyond Blogging: Feeds in Action::
This was a pretty good talk about the history of feeds and how the standards have progressed to where they are now. After going through the history, the speaker, Dave Johnson, went through the REST API and ROME. He demonstrated how to use ROME to grab feeds and do things with them, like keeping track of defects or building other applications. He also mentioned a site named feedvalidator.org. Overall a worthwhile session.
Minimalist Testing Techniques for Enterprise Java Technology-Based Applications::
Chris Richardson gave this talk about various testing techniques for Java-based apps. He authored the book POJOs in Action. He started by going over the state of testing, which is basically that not enough testing is done and developers need to do more to ensure quality. Some of the obstacles standing in the way are:
1. Cultural (testing is seen as extra work, many developers believe their code always works, having things go smoothly isn't generally rewarded, and it is one more thing on a long list of new things to learn).
2. Technical obstacles (spaghetti code, bad frameworks, and developers not always making their code testable).
Projects that don't do any unit testing can expect lots of manual testing, more bugs, long nights, and ample delays. Projects that do have test automation will have fewer bugs, be able to write new code more easily, and have the tests as a safety net. POJOs make testing easier. One reason for slow tests is that the code under test runs in a separate JVM from the tests; you can minimize test time by having fast-running unit tests and using continuous integration. One problem, though, is that as you add more and more tests your test suite takes longer and longer to execute. If you are developing an application that must talk to other systems, one way to speed up testing is to use mock objects. There are several ways to create mocks:
1. Write your own.
2. Use a mock object framework (jMock or EasyMock).
You can make the testing with mocks part of your JUnit test runs. But there are downsides to using mock objects for testing. They are:
1. It is essentially white box testing which means that you can cheat.
2. The tests can be brittle.
3. You have to mock selectively.
Other techniques discussed included testing database connectivity using an in-memory DB (HSQLDB). It is way faster and easier than setting up a separate DB instance, but it might not work if you have hand-coded SQL, and there may be some incompatibilities between databases. Another technique when working with databases is to roll back transactions at the end of each test. That way the tests run faster and the DB is left unchanged. The drawbacks of the extra speed are that commit-time constraints are not checked and code running in a different JVM can't see the changes. He showed a few demos and then preached the benefits of Selenium (I knew this guy knew what he was talking about :-) ). He also mentioned Cargo, a cool tool that will install, start, and stop applications; it also comes as a plugin for Ant and Maven. Using Cargo and Selenium you can start up your app, run automated web tests, and then shut the app down. On my project we use Ant, Jameleon, and Selenium to do the same thing. Lastly he mentioned ORM Unit and did some demos.
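Since "write your own" was listed as one way to create mocks, here's a quick sketch of what that looks like in plain Java. This isn't code from the talk, and all the names (MailService, OrderService, and so on) are my own inventions for illustration; the point is just that the mock records calls instead of touching a real server, so the test stays fast and self-contained.

```java
import java.util.ArrayList;
import java.util.List;

// The dependency we want to avoid hitting for real in a unit test.
interface MailService {
    void send(String to, String body);
}

// The class under test; it only knows about the interface.
class OrderService {
    private final MailService mail;
    OrderService(MailService mail) { this.mail = mail; }

    void placeOrder(String customer) {
        // ...imagine real order logic here...
        mail.send(customer, "Your order has been placed");
    }
}

// The hand-rolled mock: records recipients instead of sending mail.
class MockMailService implements MailService {
    final List<String> recipients = new ArrayList<>();
    public void send(String to, String body) { recipients.add(to); }
}

public class MockDemo {
    // Returns true if placing an order sends exactly one confirmation.
    public static boolean orderSendsConfirmation() {
        MockMailService mock = new MockMailService();
        new OrderService(mock).placeOrder("alice@example.com");
        return mock.recipients.equals(List.of("alice@example.com"));
    }

    public static void main(String[] args) {
        System.out.println(orderSendsConfirmation()); // prints true
    }
}
```

Frameworks like jMock and EasyMock automate exactly this pattern, plus expectations and verification, which is why he suggested them once hand-rolled mocks start multiplying.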
Java Puzzlers, Episode VI: The Phantom-Reference Menace/Attack of the Clone/Revenge of the Shift::
This was a pretty entertaining talk with Joshua Bloch (Lead Java Architect at Google) and William Pugh (creator of FindBugs). They presented 8 puzzles in Java code along with multiple choices for what each snippet printed out. The audience got to pick the answer we thought was right, and then they talked about what was wrong with the code and what we could learn from it. I only got 3 of the 8 right, which tells me I still have some work to do. But some of the people around me got fewer right than I did, so I don't feel too bad. The puzzles were good and the session was entertaining, so I went ahead and bought Bloch's book, which is also pretty good so far.
Day 3 was a busy day with a beer bash and BattleBots at the end of it.
Fun and Profit with the Google Checkout API in Java Technology::
This talk was a few guys from Google and one outsider talking about how Google Checkout works, what they are trying to accomplish, and how we as developers can integrate commerce sites with Google Checkout. Specifically they covered:
1. Checkout Service
2. Checkout API
3. Sample Java Code
4. Posting a Cart
5. Processing Notifications.
6. Adding shipping information for product tracking.
For the Google APIs they pointed us to the 32 APIs found here. They also talked through why they came out with Google Checkout and showed us stats about online sales, how they are projected to grow, and where they thought Google Checkout could add value. I can definitely agree with their conclusions based on the data shown. They then went through the benefits to buyers and merchants and showed detailed diagrams of the buying and shipping notification processes and which parts are synchronous versus asynchronous.

As it turns out there are two integration options with Google Checkout. Level 1 is simply adding a button to your page that takes the buyer to Google Checkout to finish the transaction. Level 2 integration involves using the Google Checkout XML API, which makes it look like the buyer can finish the transaction from within your site. As a buyer I would prefer to stay on the same site to finish my transaction whether I'm using Google Checkout, PayPal, or a credit card. Then they walked us through implementing the ubiquitous Java Pet Store using the Google Checkout API in Java, and talked about some of the mistakes they have made along the way.

Toward the end a guy from outside of Google talked about using a test product called Mendoza to test the whole transaction process. It is a simple jar file that sits as a proxy between the client system and the Google Checkout sandbox. It verifies the syntax going back and forth through it and uses Selenium to place orders. I use Selenium on my current project and highly recommend it. He then went on to talk about future improvements such as:
1. Improved UI
2. Unit Testing Tools
3. Test Suites out of the box.
Deploying and Scaling Massive Digital Archive Repositories::
This talk was given by a Sun guy and it basically covered the problem of how you store massive amounts of data (petabytes) in an economical way while still letting users access it within a decent amount of time. Some examples he used of web sites that add data at this scale were YouTube and Flickr. He said that data is basically made up of fixed data (data that never gets modified) and data that does get modified. For fixed data think music, videos, pictures, the programs you run on your computer, etc. For data that gets modified think Word documents, web pages, code that you write, etc. He said that roughly 80% of the data out there is fixed, will never be modified, and is growing year after year at an alarming rate. Then he walked us through the different stages of building a massive storage system. It was kind of like playing SimCity in that you start with something small and simple, and as it evolves it becomes more and more complex until you finally throw in the towel. He also talked through a number of the different technologies in this space.
He then went on to talk about new models of storing massive amounts of data. He covered the idea of storage objects. Each object has:
1. Data
2. Searchable Metadata
3. Code that can be executed against the data.
He said that if you are going to build your own massive storage system you should build it from an existing framework and focus your efforts on your workflow. Then he went on to pitch a case study using Fedora and the Sun StorageTek 5800. I got to check out this machine at the Sun booth and it was pretty sweet.
Testing Concurrent Software::
The Testing Concurrent Software talk covered the difficulties involved in testing multithreaded applications and the ways developers should approach it. It was no big surprise when the speakers said the testing is difficult but not impossible. They said concurrent software testing is like sequential software testing except there are more failure modes, including deadlock, livelock, and missed signals. Some bugs require the stars to align just right and are very difficult to reproduce. I know this well from experience :-). There is nothing quite as fun as running a test, seeing a problem, spending the time documenting it and digging through log files, and then not being able to reproduce it for the developer. They said that for concurrent software you need more intensive tests that run longer and look for rare probabilistic failures. They also recommended separating your concurrent logic from all your other logic as much as possible. And they made the assertion, which I agree with, that testing will never find all the bugs; instead it increases the confidence you have that your application will work. There are several areas of testing:
1. Manual Code Review (expensive because it requires subject experts to be effective).
2. Static Analysis
3. Unit Test
4. Unit Test under constant stress
5. Performance Testing
6. Stress Testing
Other testing items they covered included TestNG (concurrency support), the Marmoset Project, FindBugs, testing for race conditions, testing bounded buffers, ConTest, the JMX API, documenting what you need the product to do, and system integration testing. You can also find a list of the speaker's publications here.
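The bounded buffer testing they mentioned is worth a sketch. This is not their code, just my own take on the classic put/take checksum approach: hammer a small bounded buffer with producer and consumer threads, total up what went in and what came out, and verify nothing was lost or duplicated. It only catches certain failure modes, and like they said, a passing run raises confidence rather than proving correctness.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicLong;

public class BufferStressTest {
    // Runs nThreads producers and nThreads consumers against a small
    // bounded buffer and checks that the sum of items put equals the
    // sum of items taken (nothing lost, nothing duplicated).
    public static boolean run(int nThreads, int itemsPerThread)
            throws InterruptedException {
        BlockingQueue<Integer> buffer = new ArrayBlockingQueue<>(10);
        AtomicLong putSum = new AtomicLong();
        AtomicLong takeSum = new AtomicLong();
        Thread[] threads = new Thread[nThreads * 2];

        for (int t = 0; t < nThreads; t++) {
            final int seed = t;
            // Producer: puts distinct items and tallies them.
            threads[t] = new Thread(() -> {
                for (int i = 0; i < itemsPerThread; i++) {
                    int item = seed * itemsPerThread + i;
                    try { buffer.put(item); }
                    catch (InterruptedException e) { return; }
                    putSum.addAndGet(item);
                }
            });
            // Consumer: takes the same number of items and tallies them.
            threads[nThreads + t] = new Thread(() -> {
                for (int i = 0; i < itemsPerThread; i++) {
                    try { takeSum.addAndGet(buffer.take()); }
                    catch (InterruptedException e) { return; }
                }
            });
        }
        for (Thread th : threads) th.start();
        for (Thread th : threads) th.join();
        return putSum.get() == takeSum.get() && buffer.isEmpty();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(4, 1000));
    }
}
```

Runs like this are also the kind of thing they suggested repeating for a long time under load, since rare interleavings may only show up after many iterations.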
RubyTooling: State of the Art::
This talk focused on different IDEs, editors, and debuggers for Ruby. Overall I didn't find it that interesting because it focused on the lack of good, robust tools for Ruby development. Some links they mentioned were rubyforge.org and datanoise.com.
Write a 3-D Game in the Java Programming Language in Less Than 50 Minutes::
This was a pretty fun talk focused on writing a simple 3-D game in Java in less than an hour. In reality you could "write the code" for the game in under an hour, but it would take way more than an hour to get all the tools in place to get your game going. It was fun and entertaining though, and the demo game was Duke (the Java mascot) running around shooting coffee beans. :-)
After all the sessions were over they had an after dark bash complete with BattleBots, a midget band playing Kiss songs (I'm not kidding), and lots of food, drinking, and games. Overall it was pretty entertaining and the BattleBots fights were pretty cool. Some of them were even driven by Sun executives.
My second day at the JavaOne Conference was a pretty good one. I took a later train (more sleep) and got up there with some of my colleagues just as the breakfast area was closing down. We got a quick bite to eat before heading off to our technical sessions.
Creating Amazing Web Interfaces with Ajax::
Two guys from Ajaxian.com gave a talk about creating cool web apps using Ajax. They went through lessons learned for several Ajax projects. They were:
1. Google Maps
2. Housing Maps
3. Google Suggest
4. Ta-da Lists
While talking about the examples and lessons learned they also mentioned several Ajax frameworks (OpenLaszlo, Ext JS, and Dojo, just to name a few).
They also went through a few case studies, some with sample code. They were:
1. Moxie (offline support)
2. Y! Pipes
3. Enhancing Ajaxian.com
Moxie (not public yet) was pretty cool because it offers offline support, which they say will become a big thing going into 2008. They also briefly mentioned Adobe Apollo. They mentioned that Y! Pipes is pure DHTML and uses the canvas feature. For Ajaxian.com they walked through examples of improving community participation by dynamically updating an article page when a person submits a comment, adding more fonts for better typography, and making registration much more user friendly. They also mentioned using Firebug for debugging and Scriptaculous for lots of cool stuff. Lastly they mentioned future stuff:
1. Apollo
2. Offline support for Ajax apps.
3. Abundant custom rendering.
4. Microformats.
5. Fast javascript
6. "wow" factors
7. html5
jMaki: Web 2.0 App Building Made Easy::
jMaki is basically a JavaScript wrapper. Since it is a wrapper, a fair question is: why use it? The answer was:
1. Convention over configuration.
2. generic component libraries.
3. Tooling Support.
4. Standardized the data model.
5. Normalized JavaScript toolkit APIs.
6. Server/client integration (standardizes Java code/PHP to JavaScript on the client system).
It also has Phobos (server-side JavaScript) support. There is a jMaki plugin for NetBeans if you want to use it. They demonstrated building a page with several widgets pretty quickly; some of the widgets they showed were tabbed views, trees, accordions, menus, and datagrids/tables. I think the best part of using jMaki within NetBeans to create Ajax applications is that you can easily use just about any Ajax toolkit out there together in one place. So you can use a Dojo widget along with widgets from any other toolkit on the same page quickly and easily.
Fast, Beautiful, Easy: Pick Three--Building Web User Interfaces in the Java Programming Language with Google Web Toolkit::
You can find the GWT here. The guys from Google talked about GWT and how it can be used to really speed up the development and test cycle when writing Ajax apps. I thought it was particularly well placed at the JavaOne conference because you write the code in Java and the toolkit transforms it into JavaScript. They also did the Kitchen Sink demo and spent some time talking about the principles behind Google web development (speed, ease of use, etc.). Overall it was an alright talk, and it highlighted a difference between Google and others in the Ajax community over how web apps should behave. Google and some others think the user should remember that they are on the web (be able to use the back button, etc.) but that the application should be easy to use and fast. Others lean more toward trying to make the user forget they are on the web and making Ajax apps look and feel like native desktop apps. I tend to lean toward the latter, but who knows, I may feel differently by this time next year.
Stress Your Web App Before It Stresses You: Tools and Techniques for Extreme Web Testing::
This talk was given by a couple of test guys who went into great detail about what stress testing means and why you should do it. It was a pretty good talk and they also did a demo (with lots of technical difficulties) using JMeter. JMeter is pretty cool and I'm definitely going to put it on a couple of boxes in the office and use it to pound on OYE to see if it does a better job than the other performance tool we've been using. Some important tips they gave for stress testing were:
1. Avoid testing the most used scenarios.
2. Pay special attention to user definition (who is using the system, how are they using it, what are they likely to do to the system, etc...).
3. Use real input data from real users wherever possible instead of using synthetic data.
Good advice and a good talk.
Effective Concurrency for the Java Platform::
This talk was about how to walk the fine line when writing Java code between race conditions, deadlocks, and scalability bottlenecks. The speaker was Brian Goetz. He basically said that when you are trying to write thread-safe code you want to get back to basics (which some of us sometimes forget). It starts with documenting (using @ThreadSafe, @NotThreadSafe) whether or not something is thread safe at the class level. He said you should also document how you made it thread safe (because we forget these things later). He also went through drawing diagrams to figure out the synchronization policy. Then he talked about encapsulation and how if you try to do synchronization at too low a level in the code you get race conditions, while if you do it too high the code becomes fragile. He also talked about why he prefers immutable objects: the state cannot change after construction, all fields are final, they are automatically thread safe, and they are simpler, safer, and more scalable. Finally he mentioned Amdahl's law while talking about how to take advantage of multi-core machines when trying to speed up and scale out your Java code. Overall not a bad talk.
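To make the immutable-object point concrete, here's a tiny example along the lines he described (my own made-up class, not from the talk): all fields final, no setters, and "mutation" hands back a new object, so instances can be shared between threads with no locking at all.

```java
// A minimal immutable value class: state is fixed at construction,
// all fields are final, so it is automatically thread safe.
public final class ImmutablePoint {
    private final int x;
    private final int y;

    public ImmutablePoint(int x, int y) {
        this.x = x;
        this.y = y;
    }

    public int x() { return x; }
    public int y() { return y; }

    // "Mutation" returns a new object instead of changing this one,
    // so no thread can ever observe a half-updated point.
    public ImmutablePoint translate(int dx, int dy) {
        return new ImmutablePoint(x + dx, y + dy);
    }
}
```

The class is final so subclasses can't sneak mutable state in, which is part of why he said immutable objects end up simpler and safer.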
As it was the first day of the conference I had to get there, get registered, and try to figure out where the hell I was supposed to go. This all happened today. :-) I made it into the general session a little after 8:30 and got to hear Rich Green, executive vice president of software for Sun Microsystems, give a pretty good talk with a few demos. He first brought up someone from Ericsson who talked about their work with Sun to open source the Sun Java System Communications Application Server, an application server complete with an IP Multimedia Subsystem (IMS). They even showed a video about a family (mother, father, bratty teen) all using their cell phones and mobile devices to essentially ignore each other. I really liked the part where the mom was writing a text message while driving. Talk about encouraging safe driving habits. Another good part was where a kid was trying to watch a rated R movie on a laptop and the laptop automatically text messaged the mom and dad asking if it was OK for the kid to watch that movie. They said no, and the kid looked bummed out and closed the laptop lid. Yeah right. Keep dreaming...

They also announced JavaFX, which is essentially a new scripting language that can do most of what you can do with Ajax or Flash. It has other benefits that you can read about here. I'll play around with it, but I've got a few other scripting languages on my list to learn before I try to do any serious work with it. My favorite part of the opening keynote was having Scott McNealy talk about a new educational website that would eventually offer all the curriculum from K-12 for free. Not surprisingly it is www.curriki.org. I guess I just have a soft spot for using technology to educate kids and give them more opportunities to learn. As for the regular sessions, here are the sessions along with some of my notes:
Packaging JavaApps For Ubuntu::
Nothing too earth shattering here, it just went over how you get your java application packaged into Ubuntu's repository so people can install it very easily (sudo apt-get install myapplication). They used GlassFish as an example along with lessons learned while getting it ready.
The basic rules of thumb are:
1. Decide the number of packages.
2. Choose your license.
3. Identify component to deliver to.
4. Identify dependencies (both build time and runtime).
Some of the lessons learned by the GlassFish team were:
1. Break software into discrete components.
2. Figure out licensing.
3. Package all build dependencies.
4. Don't rely on graphical setup tools.
5. Build package for default using Ant.
Overall I thought it was useful because at some point I would like to package IBM OmniFind Yahoo! Edition so it can be easily installed and updated within Ubuntu. I'll probably just have to do all the work and ask for permission to deploy it instead of asking if I can work on it. Such is life.
Web Algorithms::
Overall a cool and entertaining speech covering 5 handy dandy web algorithms. They are:
1. XOR Swap
2. Credit Card Validation
3. Public Key Cryptography
4. Two's Complement
5. Google Map Reduce
The speaker went through examples of each along with walking through code snippets in java and other programming languages. All in all it was a fun geekfest.
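For fun, here's my own quick take on two of those algorithms in Java (my code, not the speaker's): the XOR swap trick and a Luhn-style checksum, which is the math behind credit card number validation.

```java
public class WebAlgorithms {
    // XOR swap: exchanges two ints without a temporary variable.
    // (A plain temp variable is clearer and usually just as fast;
    // this one is more party trick than best practice.)
    public static int[] xorSwap(int a, int b) {
        a ^= b;
        b ^= a;
        a ^= b;
        return new int[] { a, b };
    }

    // Luhn check: double every second digit from the right, subtract
    // 9 from any doubled result over 9, sum everything, and the total
    // must be divisible by 10 for the number to be valid.
    public static boolean luhnValid(String number) {
        int sum = 0;
        boolean doubleIt = false;
        for (int i = number.length() - 1; i >= 0; i--) {
            int d = number.charAt(i) - '0';
            if (doubleIt) {
                d *= 2;
                if (d > 9) d -= 9;
            }
            sum += d;
            doubleIt = !doubleIt;
        }
        return sum % 10 == 0;
    }
}
```

The Luhn check only catches typos (single-digit errors and most adjacent transpositions); it says nothing about whether the card account actually exists.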
A Step Along the Way: Using AJAX, Portals, and Services for Better Network Management::
This one was probably the most boring even though it didn't need to be. It was a talk about how Nortel is using AJAX, portals, and SOA infrastructure to create applications that allow them to more easily manage their network infrastructure. Cool from a functionality perspective but not super exciting. The speaker didn't help much since he spoke softly and in monotone the whole time.
The JavaOne Conference starts in SF today. I get to go to it along with a few other colleagues from my group starting tomorrow and going through Friday. The good part is that it is chock full of talks and events that sound like they are going to be fantastic. The bad news is that I have to commute to SF and back each day. I'll take notes and post them each day (if time allows) and I'll also post them internally for my team. If you are going to the conference and want to meet up and geek out about all things java drop me a line.
I was reading this tutorial the other day by the Stuck in Customs guy and decided to try doing some HDR photography myself. I won't go through all the details because Trey does a good job of explaining it all in his tutorial, but here are the basics:
1. HDR stands for High Dynamic Range which means that you have higher variation between lights and darks than in normal photos.
2. In general the best way to do this is to use a camera (on a tripod) to take several shots of the same scene at varying exposures. Some of the shots should be over-exposed while others should be under-exposed.
3. Use software (in this case Photomatix) to slap all the shots together and make your end product look really cool.
So I gave it a try when we were in Yosemite and here are the resulting shots. I think the technique is pretty cool and I'm pretty happy with the results, but I don't see myself making this my signature technique so I probably won't fork over the $99 for the software to do this with.
I forget where I read this idea, so I can't cite it, but it is out there somewhere. Someone wrote an article about how to get a good photo of a popular monument or location without all the people showing up in it. You've probably been there yourself: you get to a great spot and want to take a great picture of, say, the steps of Rome, but people keep getting in the way. It seems like they are ants crawling all over the place, and you're mad because you can't get a decent picture of the steps and fountain themselves. That happened to us when we hiked to Delicate Arch in Utah a few years ago. Anyway, the article said that if you use a tripod (the best thing since sliced bread) you can set it up, take several shots of the same scene (use your discretion as to how many), and then when you get home use Photoshop layers to blend the photos into one photo with no people in it. This works because people move around between shots, so you can erase them by going through the layers of your different shots. It is a simple yet brilliant idea.

I decided to do the opposite experiment when we were in Yosemite. Timber and I wanted to get as high up and as close to Glacier Point as we could so we could keep Wookie cool, but the road from Highway 41 up to Glacier Point was closed. So we parked my truck and walked up Glacier Point Road a little more than a mile. Along the way I got the idea to set up my tripod in the middle of the road and take several pictures of Timber and Wookie walking up it. When we got home I used the layering technique, except this time to make it look like the road was filled with several Timbers and Wookies. The photo attached to this blog posting is the final result. I only spent about 15 minutes on it because it was an experiment, but I think the technique worked. Please kids, try this at home.
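For the curious, the erase-the-people trick can also be automated instead of done by hand with Photoshop layers: take the per-pixel median across your tripod shots, and anything that only occupies a given pixel in a minority of frames (like a passing tourist) drops out, leaving the background. Here's a little Java sketch of the idea; it's my own code, works on grayscale values only, and assumes the frames are already aligned, which the tripod takes care of.

```java
import java.util.Arrays;

public class MedianStack {
    // Per-pixel median across several aligned frames (grayscale values
    // 0-255, one flat array per frame). A person walking through the
    // scene only covers a given pixel in a few frames, so the median
    // recovers the background at that pixel.
    public static int[] medianBlend(int[][] frames) {
        int n = frames.length;
        int len = frames[0].length;
        int[] out = new int[len];
        int[] column = new int[n];
        for (int p = 0; p < len; p++) {
            // Gather this pixel's value from every frame, then sort.
            for (int f = 0; f < n; f++) {
                column[f] = frames[f][p];
            }
            Arrays.sort(column);
            out[p] = column[n / 2];
        }
        return out;
    }
}
```

For color photos you'd run the same thing per channel, and the more shots you take, the fewer frames any one person can occupy at a given pixel.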
I finally posted the pictures we took last weekend here. The ones I find the most interesting are the night shots from within YNP. The sky was clear and there was nearly a full moon which really helped us get some decent shots of the granite and waterfalls. The photos were done using a tripod and long exposure, usually leaving the shutter open for between 20 and 30 seconds. I'll try to do more of that on my next night trip to the great outdoors.
On Sunday, our last day, we checked out the Mariposa Courthouse (erected in 1854) on our way out of town. We headed for Oakhurst and then north from there on Highway 41 so that we could do one last hike. We went roughly 6 miles out of town and then turned onto a side road to do the 'Shadow of the Giants' trail in Sierra National Forest. We chose that one because dogs are allowed on the trail, and we figured (rightly) that it wouldn't be as busy as the big tree hikes inside Yosemite. It was a nice trail and we got a few nice pictures. Wookie appreciated being able to pee on some of the largest living things on earth. :-) After the hike we headed home on a fairly uneventful drive. I'll make a map of our trip later and post it.
On Day 2 of our trip (Saturday) we first did an out-of-the-way hike that a local told us about. It was north out of Mariposa, but not quite as far as Hites Cove. On this hike we were much closer to the Merced River and there were some nice wildflowers. We also let Wookie off his leash so he could run around and be silly. When it started to warm up he began running from shady spot to shady spot and waiting for us to catch up. When we caught up to him he would get up, run to the next shady spot, and promptly lie down again. After our hike we got cleaned up and then headed into Yosemite. We drove around the valley area and took a few pictures before heading over to the road up to Glacier Point. When we got there the entire road was closed, which was a real bummer. So we parked and then hiked up the road a ways with Wookie. After that hike we went down to Curry Village and had some pizza and beer. After dinner, on our way out of Yosemite, we stopped in a few spots to take nighttime photos of the granite and waterfalls. The moon was nearly full and the sky was clear, but it was difficult to get clear shots of the landmarks because of all the smoke rising from campfires. I'm not exaggerating when I say it was like a fog. Unbelievable. At any rate I think we got a few good shots, which I will post later and talk about in more detail.
This past weekend Timber, Wookie, and I did a little trip to Yosemite and the surrounding area, using Mariposa as our basecamp. Since Timber didn't feel like camping we stayed at the Comfort Inn, which was a decent hotel, was dog-friendly, and had free wifi (bonus). On Friday we hiked the Hites Cove trail, which is fairly well known as a great wildflower trail. It is roughly 22 miles north on Highway 140 heading towards Yosemite National Park. We took lots of photos, and I wish we had had a macro lens for that hike because the flowers were really nice. Lots of varieties to be seen. After the hike we returned to town to get cleaned up and have dinner. While we were in Mariposa we ate at the Pizza Factory, which had decent pizza and is a hangout for the locals. We also ate at the Happy Diner, which claims the largest menu selection in the Sierras. They are true to their word, at least from what I've seen thus far. They had good food and the prices were decent. They also had a spot designated for people to charge their portable electronics, which I thought was innovative of them. Check them out if you're in Mariposa and hungry.
I was happy to see the Sharks get their act together in the second half of last night's game and grind out a dubya against the Red Wings. The Wings started out strong, taking a 1-0 lead and really getting the Sharks back on their heels. Nabby kept us alive by making unbelievable saves throughout the game, particularly in the beginning. A great goaltender is worth his weight in gold in any sport, but particularly in hockey. After a while Ron Wilson shuffled his lines and got some things going. We took some dumb penalties, but overall we finished our checks and didn't make too many dumb mistakes. I'd say things are looking up for the Sharks, but I still think it is going to be a 7-game series.