Wednesday, December 26, 2007

Sponge-Worthy: Tips for obtaining Free PDUs for Continuing PMP Certification

Lo and Behold... People... I'm PMI re-certified (or almost there at least).

I completed my PMP certification in April 2004 and was expected to collect 60 PDUs over the next 3 years to remain PMI certified. Happy to share that I managed to acquire, report and obtain approval for 64 PDUs for the 2004 - 2007 period, and I get to carry over 4 PDUs to the next cycle. Only just, I should say... a razor-thin margin, and just in time for my annual appraisal.

The next part is the most important, and one a freeloader like me really enjoys: all the PDUs were obtained without spending any money other than time and fuel to travel. Honestly, I did learn quite a few things along the way that I would not have if earning the PDUs were not a goal. So, thank you, PMI, for motivating me (and of course, my employer - Software Paradigms International).

Also, getting PDUs is not very hard. It's a matter of tracking them properly and then remembering to report them correctly.

In the next few paragraphs, I'll talk about how I went about it, and then in the following section I will offer more avenues I plan to explore for the next cycle (and you can too!).

Being what you are: 15 PDUs

Good news... if you are a project manager or play a role in managing projects, one-quarter of your work is already done.

As a practicing PMP-certified project manager, I earned 5 PDUs every year, so for 2004 - 2007 that came to 15 PDUs.

Report that under category - 2H (Practitioner of Professional Project Management Services)


Sharing your knowledge: 30 PDUs

Become a trainer, a mentor and/or a speaker on project management and PMP certification. You get high marks for this. Sharing and spreading the skills and knowledge of project management is an important part of being a PMP.

I earned those as:

(a) A guest speaker at a Project Management Professional training class
I spoke about how I prepared for the PMP exam and covered the topic of project scheduling and critical path calculation with the participants. (Special credit to my colleague Jitendra Ram for inviting me to be a guest speaker.) - 10 points

(b) Trainer for Microsoft Project 2003
I use Microsoft Project extensively for planning and tracking project resources and schedules, in addition to my employer's own enterprise project management tool - spiProject. But I was surprised to learn that not many project managers use MS Project effectively, if at all.

So, using some real-life examples from my own projects and armed with some advanced Microsoft Project techniques, I trained my peers in the use of MS Project. - 10 points

(c) A presenter on a project management topic at the company strategy meeting
My employer is in strategic partnerships with several channel partners, which places a special level of responsibility on customer-facing resources. My topic was the common pitfalls I faced during my assignment, how I overcame them, and specifically how being a PMP and applying specific project management techniques helped. - 10 points


Reading and sharing - Self Learning: 15 PDUs
It helps if you are an avid reader and a good note taker.

Over the past couple of years, I read up on project management and business. Notable among them being:
- The World Is Flat: A Brief History of the Twenty-first Century by Thomas L. Friedman
- Re-imagine!: Business Excellence in a Disruptive Age by Tom Peters

Both are excellent reads and deservedly best sellers. It is possible to relate these books to several key areas of project management, especially communication, resource management and risk.


Free Seminars and Local PMI Chapter Dinner Meetings: 4 PDUs

Both these events are excellent sources of earning PDUs.

I must admit that with the PMI Atlanta dinner, I did get lucky. My employer was the gold sponsor for the PMI monthly dinner, and I got the benefit of that to attend one of the most interesting seminars (How to Be a Better Consultant, by Carl Pritchard) and show off my sales acumen. - 1 PDU

The second seminar I attended was sponsored by Dekker Ltd. It was a half-day course on Earned Value Management and how to apply MS Project and Dekker's own software tools to it. It was a long drive to the south side of Atlanta for me, but it was worth it. Check out their website for more half-day seminars.

The ones I missed

These are events I missed reporting because I did not keep good notes on them.

I guess that is a lesson learned for next time.

Other Avenues to try
I plan to explore these avenues this cycle, in addition to the tried and tested ones.

  • Free webinars: Nothing beats learning while sitting at your desk. Several vendors offer free webinars to entice new customers for their products or services.
  • Volunteering opportunities: Find some at your local PMI chapter.

Of course, as always... START EARLY!!!

That always helps to cut down on the last-minute rush. Of course, you would know the benefit of planning; after all, you are a PMP... :)

Hope this blog entry shared something useful and gets you started on the PDU hunt...

GOOD LUCK!


Sunday, December 02, 2007

WAMPing now: My First Pure Open Source project

I have been working with Microsoft .NET technologies for a while (6 years now), so it is easy for people to assume that I live and breathe... in the CLR (Common Language Runtime) world... :D and probably despise the JVM (Java Virtual Machine) planets and all that revolves around the OSS (Open Source Software) planetary systems.

Well, the truth is, I do... do lots of my day-to-day work using .NET, because those are the assignments I have been put on to win my "daily" bread. I use lots of open source .NET technologies like NAnt, NUnit and NHibernate, to name a few, which have their origins in the Java/OSS world. Add to that the fact that I did lots of J2ME and J2EE work before moving over to Microsoft's .NET platform.

Besides, it always pays to keep an eye out for the "dishes on the other table".

The dishes on the other table being Java and J2EE as the obvious choices, but there is a lot of open source technology to watch out for, including dynamic languages and their associated frameworks like Ruby, PHP, Perl, etc.

So, late last week, I delivered my first production-ready xAMP product, which I hope will go live in the next couple of weeks.

xAMP stands for:

x - OS of choice.
A - Apache Web Server
M - MySQL database
P - PHP language

The two best-known flavors of xAMP are LAMP (Linux-based AMP) and WAMP (Windows-based AMP). Honestly, about two weeks prior to completing the project (delivery date: Nov 30, 2007), I did not know much about MySQL and had limited knowledge of PHP. But being a Linux enthusiast, I have played around with Apache technologies since college (1997-98).

Anyway, the point being that I was amazed at how mature the tool set around the xAMP platform is and how easy it is to pick up.

The development environment I used was Microsoft Vista Home Premium Edition. I feel Home Premium is the more suitable of the Vista editions for this, as it did not have the IIS server installed, so I got to install the "A" of the xAMP platform - the Apache web server. Of course, it is not as crippled as Home Basic, and that is just my humble opinion. Add to that the fact that I did not have much choice: all the other computers at my disposal were short on HDD space or just not powerful enough to support development tools.

The Vista platform comes with a caveat, though, and that is its famous User Account Control (UAC) feature, which is incorrigibly mistrusting and highly irritating. After many bruises and cuts fighting UAC, I was forced to turn it OFF.

So, note to self: real developers don't need UAC unless they are developing for the Vista box.

That was probably the only glitch I faced in setting up the development tools. (Touch wood!)

The development tool of choice: Eclipse Europa (v3.2 plus goodies, as of September/October 2007), downloaded from a link on the IBM website. I additionally downloaded the PHP Development Tools (PDT) add-in.

The next step was to download PHP and configure it with the Apache server. Thankfully, this step too is well documented on the web, with tons of sites offering advice and troubleshooting guidance. Fortunately, I needed none of it, and nothing in the Vista OS conflicted either.
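If you want a quick sanity check that Apache is actually handing requests off to PHP, a one-line test page does the trick. This is just a minimal sketch; the file name and location are my assumptions, not anything specific to my setup:

    <?php
    // info.php - place under Apache's document root (e.g. the htdocs folder)
    // and browse to http://localhost/info.php. Seeing the full PHP
    // configuration report means the Apache/PHP wiring is working.
    phpinfo();
    ?>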

Now for the missing "M". No James Bond story here (read: "M" as in 007's secretive boss). The MySQL website was the place to go. I recommend downloading the Community Edition as well as the GUI tools for it.

Also required is the glue between MySQL and PHP - the MySQL PHP connector. That is also available on the MySQL website.

With my limited knowledge of MySQL, I went with the default settings. But by the time I got around to building the third table (the first two being lookup-value tables), I realized that the default MyISAM engine would not be the right choice for the tables, as it does not support foreign key (FK) constraints.

Coming from a SQL Server and Oracle background, I had taken FK constraints for granted in all databases. Now, MySQL does support FKs, but only in certain engines.

Also, the idea of being able to use different engines to store and access tables was new to me. Which engine fits best really depends on the type of application being planned, but it is another level of optimization available to MySQL users. I still have to research whether tables stored with different engines can interact. (Any answers from readers are welcome.) But once I shifted the tables to the InnoDB engine, I was back in a familiar world.
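For illustration, here is roughly what that looks like from PHP using the mysqli extension. This is a hypothetical sketch, not code from my project; the host, credentials, database and table names are all made up:

    <?php
    // Connect to MySQL (placeholder credentials and database name).
    $db = new mysqli('localhost', 'app_user', 'secret', 'app_db');

    // Parent lookup table, created with the InnoDB engine.
    $db->query("CREATE TABLE status_lookup (
                    id INT NOT NULL PRIMARY KEY,
                    label VARCHAR(50) NOT NULL
                ) ENGINE=InnoDB");

    // Child table: the FOREIGN KEY is enforced because both tables
    // use InnoDB; with MyISAM it would be parsed but silently ignored.
    $db->query("CREATE TABLE work_item (
                    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
                    status_id INT NOT NULL,
                    FOREIGN KEY (status_id) REFERENCES status_lookup (id)
                ) ENGINE=InnoDB");
    ?>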

So off I went, developing the application in about 5-6 days. Honestly, I had made it hard on myself, as I had been procrastinating and sidetracked on the project for over 6 weeks and started working on it very close to the deadline.

All said and done, the project was up and running, and the user acceptance testing and the demo to the stakeholders went really well.

The application has been delivered with installation instructions from the ground up. Now I am just waiting for it to go live in the next couple of weeks and to get the moolah... :D

Just to keep the record straight, knowledge of legacy ASP (Active Server Pages) and JSP (Java Server Pages) did help me learn PHP faster, as it follows a similar syntax and embedded-in-HTML layout.

I would recommend the xAMP platform to anybody transitioning from a non-web background to web-based development. The tools are free and easy to obtain, and the wealth of readily available Internet resources is a major plus.

Please feel free to drop a line if you have any questions/comments/suggestions/brickbats.

Till next time...hopefully with a story on Ruby...





Wednesday, November 21, 2007

Two technology sparks - Android and Kindle

The two technology buzzwords catching my attention are Android - Google's mobile computing platform/API, or the much media-hyped "G-Phone" concoction - and Kindle, the e-book device from Amazon.com.

On Android...

What excites me, and what has been much written and spoken about, is that Android is truly a mobile platform from the ground up.

Think about it: mobile tools, and even platforms, mostly bring what you can do on your desktop/laptop PC to a device that fits in your hand. Android takes a fresh approach; it offers access to specific mobile device capabilities - SMS messaging, placing a call, etc. - and challenges developers to build applications on top of them.

Of course, the motivation from Google is worth a cool USD 10 million, to be shared among the application developers who qualify in Google's challenge to build the most compelling applications using Android. Boy... I am definitely going to try out for that one... Now, if only the idea for a killer application would just KINDLE... :)

Yep, KINDLE it is... that's the name of Amazon's new e-book device. Priced at USD 399.00, it is an expensive Christmas gift to ask my wife for, but it would be a truly geeky gift.

After 3 years of research and development (R&D), here is a device that seamlessly integrates book reading with wireless Internet connectivity. Hold a library of books in the palm of your hand - but of course, make sure it is charged up. Order books right from the KINDLE. And it features e-paper... the truly revolutionary concept where the ink on the page is rearranged in the blink of an eye.

What you save on is the cost of the books. While a hardcover version of a New York Times best seller costs anywhere from $30 to $40, the e-book version is typically available for $10 or less.

Sunday, October 28, 2007

Bot Nets and Application Security

I have been an avid reader of science fiction; Isaac Asimov and the Foundation series have long been my favorite author and series. Now, how is that related to this blog entry? Because what I read in eWeek today is literally out of science fiction...

The story is titled "Storm Worm Botnet Lobotomizing Anti-Virus Programs" (click here to read the article on eWeek).

The part that was scary enough was what the botnet called "Storm" is able to do to antivirus applications and other background processes like P2P programs, but the scarier part was its ability to change its own signature every 30 minutes, to detect probing by researchers, and to launch attacks against them!!!

If you ever imagined a virtual mafia... this is it. The trouble is that most owners of the machines participating in the botnet are innocent victims who fell prey to promises of free software or other incentives.

From the eWeek article, I judge (hopefully correctly) that the botnet machines are primarily Microsoft Windows-based, but all machines are vulnerable to the Distributed Denial of Service (DDoS) attacks the botnet unleashes.

There are a few documented cases of e-extortion against popular websites, and the eWeek article talks about the takedown of an Israel-based security company.

Now, this has been an eye-opener as to what an application can face once it is put out in the wild. Thankfully, most ISPs today are equipped to handle DDoS attacks to some degree.

Application security is an afterthought to most software designers today. With composite applications and mash-ups becoming the cornerstone of Web 2.0 applications in the next few months, I believe application security deserves another look and should become a primary consideration, alongside performance and usability, when architecting any new application.

Sunday, September 30, 2007

Challenging the Status Quo and Execution...

It is always nice to hear words like "challenging the status quo" in a mission statement, but not many organizations actually take on the feat. Understandably so: there is always risk in venturing into uncharted waters, and success in one market does not always guarantee success in another.


The difference between a grand vision and a failed vision is really the EXECUTION.



Let's turn to some successes in a field I am not really familiar with, but one that intrigues me because its business model depends on the goodness of human nature. Hard to come by, huh...?



The industry is MICRO CREDIT, or MICRO FINANCE. I started reading about its success in possibly one of the poorest countries on earth - BANGLADESH. You would expect that, with such scarcity of financial resources, the obvious human instinct would be to grab and run.



Well, that's what everybody thought, till Prof. Muhammad Yunus and Grameen Bank came along. Grameen Bank serves the poorest of the poor in rural Bangladesh, using an unheard-of strategy: lending without collateral. Well, I can safely assume that since the individual loans are not very large, each one is less risky, and chances are the borrower would not have anything to put up as collateral anyway. But add up several thousand of those loans... and the risk becomes enormous.



There are other strategies Professor Yunus applied that were very different from conventional banks, like focusing on women as borrowers rather than men... [Read more...]



So that was Bangladesh, but would the same hold true in another market? I recently read in The Atlanta Journal-Constitution about the success of micro-lending in Guatemala.



OK... OK... now where is the catch? You know what, I will extend this example to the IT field.


The points I carry over from micro credit are: trusting the goodness of human nature, and applying success from one market to another.

Offshoring/near-shoring has proved beyond doubt that it is possible to perform software development work anywhere and anytime.



But it is also a sad truth in the same industry that a lot of people have to work long hours. Some of it is to gain experience, earn customer confidence and business, and explore new technologies, but some of it is because of, for lack of a better term, "projects that could have been better managed". Having said that, it is the nature of software development. Progressive elaboration is the inherent gene: the more people see, the more they want to make it better. So it is not always in the control of the stakeholders that software developers directly interact with.


The industry has worked hard to find solutions for this issue as software systems get more and more complicated. Instead of a monolithic application under the control of a single organization, we are now looking at the age of mash-ups, landing pages and composite applications. But that is a topic for a different day.


So, back to the mantra of "work-life balance". One of the enablers of work-life balance is telecommuting, and there is no work in the world better suited to it than software development.


Now, back to my argument about the goodness of human nature: the belief that, more often than not, people will try to do the right thing.


The counter-argument: even if it works in western countries, it MAY not WORK in an offshore environment. My answer is to believe in the GEEKy nature of software development. A geek is a geek no matter where s/he is let loose. A geek takes pleasure in getting things accomplished and doing them well.



Software development is a field of knowledge workers. It is in the interest of any software development organization that wants to grow to attract and retain the best of the lot, and for the right reasons.


Now that we have a VISION of a globally diverse and dispersed network of technology gurus, let's look at a possible execution model. Strangely enough, it is nothing new, and it is not alien to companies with ISO 9001 and CMMi compliance.



It is the use of METRICS. Establish performance metrics and clear guidelines for telecommuting employees. For example:



  • The employee must have a proven and established record of on-time delivery and accurate reporting.

  • The employee must already have high-speed Internet access at home that s/he is willing to use for business purposes.

  • The employee must possess a computer capable of remotely accessing his/her desktop at work; the actual work will be done on the work desktop.

  • If a developer is running late on a task in any given week, s/he is barred from telecommuting.

  • Allow a limited number of telecommuting days per week as a starter.
So on and so forth...


On the flip side, there is an investment on the organization's side in terms of raising the level of infrastructure. But what it gains in return is more productive employees.



From the employee's end, it is the ability to save on a long commute and to address family needs at times. The caveat is that the individual has to see this as an incentive and not as a way to make the employee work longer hours. Also, the individual has to drop any sense of entitlement, like expecting a company-provided Internet connection just for telecommuting purposes.

All deals are based on give and take, as long as both sides understand the benefits.

Has it been done before? Yes, absolutely.

IBM touts a global workforce built on flex hours and telecommuting. Read about the success at BestBuy.com, the online division of the electronics retailer in the US. (Read more in "Smashing the Clock"; cover story; BusinessWeek - Dec 11, 2006.)



If it is not true anywhere else, it is true in India: most women tend to take a break from work during life-changing situations like marriage or having a child.

The whole premise of "work-life balance" is giving priority to the family. But what if the organization is able to bring that talent back on board, no matter where in the world she is? It is a win-win situation for the employee and the organization.

Finally, a statistic on the concentration of IT/software development in India, recently published on rediff.com in the article "IT in India: Big successes, bigger gaps":

  • Seven cities accounted for a whopping 95 per cent -- Bangalore (33 per cent), the National Capital Region (15 per cent), Chennai (14 per cent), Hyderabad (13 per cent), Pune (10 per cent), Navi Mumbai (8 per cent) and Kolkata (2 per cent).


The state of infrastructure in some of these cities may not be something to write home about, but I would bet that organizations based in these cities have already established telecommuting options for their workforce.

So, for organizations that haven't, I would say wake up before the heat gets turned on and this benefit becomes so... 2007.

Thursday, September 27, 2007

Silverlight and Google Gears and AIR

Once again, it seems to be a battle of the giants. Two fronts: Google vs. Microsoft. Competing technologies: Google Gears vs. Silverlight.

Both are browser-based plug-ins meant to enable a rich and compelling cross-platform client experience. The grand vision is to blend the web and the desktop so seamlessly that the boundaries get blurred. But looking closely, that's where the commonality ends.

Microsoft Silverlight(tm) seems to be more targeted at delivering the rich client interface and easy programming paradigm offered by Windows Presentation Foundation (WPF) to the browser. Currently at version 1.0, with 1.1 in pre-alpha, the full vision is expected to be delivered in early 2008. Having said that, the developer tools to enable Silverlight come with Microsoft Visual Studio 2008 or as a separately licensed tool, Microsoft Expression. Of course, this is as I understand it now; things may change as we get closer to release.

Google Gears, on the other hand, is a shot at the basic limitation of a web-hosted/delivered application: web applications DO NOT work offline. Google Gears provides components such as LocalServer, Database and WorkerPool to make the experience of being online or offline indistinguishable.

There is a third front too... from Adobe. While Microsoft Silverlight is intended to launch attacks on Adobe Flash's (or its latest avatar, Flex's) niche of compelling graphic content on the web, Adobe has its eyes set on helping its loyal user base capture the desktop with AIR (Adobe Integrated Runtime).


All of these technologies use AJAX in some form or another and are founded on delivering updated content from the web.

It remains to be seen whether any one of them is destined to dominate the web, or whether the developer community will find a way to leverage all of these technologies to benefit the user. I can very well imagine an application that uses XAML to deliver content to AIR and Google Gears to store local information that syncs with the back end when connected. But can the competitors?

Tuesday, May 08, 2007

Windows Vista and Slow Internet Connections

On April 24, 2007, I bought a new laptop (Acer Aspire 5100) pre-loaded with Microsoft Vista Home Premium edition from my local Circuit City store for $669.00 + 6% tax. There was a $100.00 mail-in rebate to make the deal sweeter.

As a previous owner of Acer products (an Acer Aspire 3000 laptop in 2005 and a 19" LCD monitor this February), I have found them to perform well.

I must admit, I seem to have the incredible (bad) luck of being stuck with a lemon the first time around when it comes to buying Acer laptops. Fortunately for me, Circuit City readily exchanged them after some inspection, and the replacements have worked very well. (Let me know if there is anybody out there who shares the same experience.)

The first time, it was a hard disk crash within 24 hours of purchase. This time around, it was the wireless Ethernet card, which kept dropping the connection several times within the span of a minute or so. Finally, after a week of fighting the beast by replacing drivers and tweaking settings, I gave up and got a replacement from Circuit City this weekend.

The wireless connection on the new laptop stayed strong, but the Internet connection was dragging its feet like a snail bearing a 2-ton iron core. Some websites just did not come up at times - like google.com or msn.com. Both IE and Firefox exhibited the same behavior.

Adding to the frustration, the machines with Windows XP and Linux ran just fine. After much research on the web, the blame game over the cause continued. Possible suspects: the preloaded Norton AntiVirus and Internet Security (because of phishing filters and double firewalls) and IPv6 configuration conflicts. Turning them off, including Vista's own security features, did not help.

Though the problem looked like a typical DNS issue, I was not willing to accept that reasoning at first: two different machines with two different OSes on the same network were working just fine.

After two long nights, I decided to take the route of configuring the DNS server entries manually. I looked at the auto-configured DNS provided by the router, and it contained just the IP address of the gateway (i.e., the router itself).

I manually configured the DNS entries provided by my ISP on the laptop itself and voila!!! It started working like a breeze.

So in summary -

Symptoms:
Slow Internet connection on Microsoft Vista (in my case Home Premium Edition).

Possible Cause in my case:
My wireless router was overloaded with DNS requests and could not service them all.

Additional causes:
Multiple phishing filters and firewalls.

Solution:
Run the ipconfig /all command at a Vista command prompt and check the DNS server entries. If you only see a single entry pointing at the wireless router, it is time to manually configure the DNS entries for the wireless card. You can do that by editing the adapter's TCP/IPv4 settings.
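If you prefer the command line over the network properties dialog, something along these lines should also work. Treat it as a rough, hypothetical sketch: the connection name and DNS addresses below are placeholders, so substitute the adapter name shown by ipconfig /all and the DNS servers your ISP gave you, and run it from an elevated command prompt:

    rem Point the wireless adapter at the ISP's DNS servers instead of the router.
    rem "Wireless Network Connection" and the 203.0.113.x addresses are placeholders.
    netsh interface ip set dns "Wireless Network Connection" static 203.0.113.1
    netsh interface ip add dns "Wireless Network Connection" 203.0.113.2 index=2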

I have additionally disabled IPv6 support on the wireless card and turned the Norton phishing filter off. They did not make any difference without the DNS entries.

This worked for me and hopefully, it should work for you as well.

Tuesday, April 03, 2007

War of the third kind - Reflection - 4 years later

I don't consider myself much of a technology prophet, but I found this document, which I shared with the top management of my employer about 4 years back...

Dated: July 14, 2003

Wars of the third kind

The signs

Though it may not be so evident today, maybe not even in the coming weeks, it is on the horizon. All signs point to another war of computer architectures. The common factor, as it has been most of the time in recent memory: Microsoft (and a few others) vs. the rest.

This time, it will be in the area of “on-demand” computing.


One Microsoft way, server virtualization

Microsoft (with its recently acquired products from Connectix Corp.) and VMware are pushing the concept of server virtualization. The concept: one powerful machine (usually a multi-processor system); many servers.

Bring up servers in the blink of an eye. Need a mail server? Boom! It's there. Need a web application server? Boom! It's there. To the naked eye, one machine. To the network monitor, two different servers, each with its own network stack and its own resources - memory and hard disk.

The reason: the powerful machine, if used as a single server, is under-utilized. Why not just put more applications on the one server? Well, what happens when you want the reliability of Linux for your web server and the ease of Microsoft Active Directory services to manage users? What about some legacy application, critical to the organization, that runs only on Windows NT?

Think! Grid computing

The idea, from IBM, PolyServe and others, is "supercomputing power at the cost of peanuts". The method: several machines networked to give the appearance of a single, all-powerful mega-server. The concept has been around for a while and implemented with varying success, but selling MIPS as an on-demand commodity is a slightly newer one. With increased support for more operating systems and protocols, this is now looking like a viable option.

The conclusion

War as it is... the front lines are yet to be drawn, the pitches yet to be made. But one size does not fit all, as many would have us believe. Soon, we as users will make choices, and it may well be a combination of both. How about a grid computing network supporting server virtualization? Now that's a triple sundae with a cherry on top.

Thursday, March 08, 2007

Ahh..Power of Dual Monitors

It was a long, hard-fought battle, but I persisted and finally managed to buy a 19" LCD monitor. I had to guilt my significant other into it after buying her a set of diamond earrings for Valentine's Day.

Consider it a Valentine's gift. Besides, the price for the monitor - an Acer AL1916 - was pretty affordable: $159.99 (+ 7% sales tax = $171.18) from Staples (on Feb 18, 2007). Most online sites that day were selling the same monitor for $215+.

Add to that the fact that most online reviews gave this monitor two thumbs up, and I must say I am pretty pleased.

Oh... for those who know me (and those who will sooner or later), I am a laptop user (an IBM ThinkPad T41 loaned to me by my employer, for the record). So how do I enjoy the power of dual monitors?

Fortunately Windows XP makes it pretty easy. It was literally plug and play.

As I type this, I use the external monitor as the primary display at 1280 x 1024 resolution and the laptop's own screen as an extension at 1024 x 768.

I must admit I sometimes have to tweak the configuration to make the external monitor the primary one, but it is worth the struggle.

Has it made me more productive? Only time will tell.

Has it increased my coolness factor... sure, tenfold... :)

... Enjoy people !!!