
Hmm, So Apparently TFS 2012 Power Tools Require VS Pro or Better

by Angela 19. November 2012 15:37

So I had gotten used to installing a VS 2010 Shell on my TFS app tier for doing basic administration type activities that required a Team Explorer. One of my most common tasks was editing the TFS process template using the TFS Power Tools. So when I upgraded TFS to 2012, I immediately downloaded the TFS 2012 Team Explorer and Power Tools and installed them so I could get to work.

Today I discovered that this is no longer a supported scenario once you have upgraded to TFS 2012; not that the error message is AT ALL helpful for figuring this out, shocking. I loaded up the VS Shell and opened Tools | Process Editor | Work Item Types | Open WIT from Server like I always do

[screenshot]

and got a strange error I hadn’t seen before. I tried a few other options, projects, and work item types, and kept getting errors. I was able to export work item type definitions, just not open them. ::sad trombone::  So this is an error you might end up encountering after upgrading if you haven’t heard about the change I am talking about.

[screenshot: error dialog]

Cannot load ‘C:\Users\37653\AppData\Roaming\Microsoft Corporation\Microsoft® Visual Studio® 2012\11.0.50727.1\usnbka366p_Str_Enterprise_User Story.wit’: Could not load file or assembly ‘Microsoft.VisualStudio.XmlEditor, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a’ or one of its dependencies. The system cannot find the file specified.

 

When I dug around, I discovered a few MSDN posts referring to a licensing change for VS 2012.  I suppose if I still worked at Microsoft I wouldn’t have missed that valuable little nugget. So no longer can you get away with a free VS Shell and the Power Tools for simple administrative tasks on your server; you must install at LEAST VS Professional.  Lame.

If you are lucky, like me, your boss bought you a copy of VS Ultimate and it’s not an issue since, with MSDN benefits, you can install it on pretty much any server YOU are going to use. Just be sure, if it is a shared server, that everyone is properly licensed for whatever you install there. And alas, this is at my client, so now I need to work with their server folks to get that installed and make sure they are licensed properly for it ::sad face::

Tags:

ALM | Application Lifecycle Management | MSDN | Power Tools | SDLC | TFS 2010 | TFS 2012 | TFS Administration | TFS Power Tools | Team Foundation Server


Multi-Tenant TFS Data Tiers? Yes You Can!

by Angela 6. November 2012 08:27

Multi-what TFS? In other words, hosting multiple Team Foundation Server instances, and all of their associated databases, on the same data tier.

So we ran into quite the conundrum here, wherein we had just one physical server available to act as a TFS Data Tier, but needed to host at least two TFS instances on it to try some stuff out in relation to a coming upgrade. I needed to upgrade a number of our project collections to TFS 2012, while leaving some still on TFS 2010 until we could do further validation on some customizations. It seemed risky, maybe even impossible, but mostly because I had never tried.  I certainly never saw that as an option in the installation docs or on MSDN.  It wasn’t until I sat down with a DBA who looked at it purely from a database perspective that I thought to just give it a try and see what happened.

Obviously this is a development environment and NOT their production TFS Smile  You certainly COULD do this in production, but it would make me nervous when it came to things like DR, so I’m not going to even entertain that notion.  In my situation, I already had a dual-tier TFS 2010 environment set up in DEV, and I had a second AT server to use as a test bed for the upgrade to TFS 2012. My main issue was how I could take collections from a single TFS instance and upgrade only half of them to 2012 while the others were still available on 2010. I wondered, “can I upgrade the new app tier to 2012 while leaving the other app tier, hitting the same data tier, on TFS 2010?” The answer is, “sure you can!”

[image: We Can Do It! poster]

Now if you look at TFS merely from the front-end perspective this might seem odd, or risky, but like I said, I had a DBA who knew nothing about TFS but knew databases really well helping me noodle through it.  I knew just enough about SQL Server to be dangerous, so together we made quite the team when it came to “let’s just try it and see what happens, it’s only DEV after all!”.  What I came to understand, and maybe I should have realized this sooner, is that when you upgrade TFS, or do any operations on it from the App Tier, it only affects the databases that are referenced by its configuration database.  So, 3 separate App Tiers have 3 separate Configuration databases, and 3 separate sets of databases (collections, warehouse, etc.) that can coexist on a single data tier. Upgrading an AT from TFS 2010 to TFS 2012 only updates the schemas of the databases specified in the Configuration database associated with that AT.  The main requirement here is that you are running a version of SQL Server that can support both products, so SQL Server 2008 R2 plus current Service Packs.
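If you want to see the separation for yourself, a quick query against the shared data tier shows both sets of databases sitting side by side. This is only a sketch: the server and database names below are made up, and yours will depend on what you chose at install time.

    REM Run from a command prompt on a machine with the SQL client tools installed (hypothetical server name).
    sqlcmd -S SharedDataTier -E -Q "SELECT name FROM sys.databases ORDER BY name"

    REM You would expect to see two complete, independent sets, something like:
    REM   Tfs_Configuration, Tfs_Warehouse, Tfs_DefaultCollection          <- the TFS 2010 instance
    REM   Tfs2012_Configuration, Tfs2012_Warehouse, Tfs2012_Collection1    <- the TFS 2012 instance

Each application tier only ever reads and writes the set referenced by its own configuration database, which is why the two instances can live together without stepping on each other.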

So here is what I am running today:

[screenshot]

Looking back, knowing what I now know, it makes sense too. Now, once again, I spent many many hours researching this on-line and could not find any documentation to confirm or deny that this was even possible. It took a few emails to some folks in North Carolina, you know – the dudes who WROTE the software – to confirm that yes indeed, you can host multiple instances of TFS on a single Data Tier. Turns out, they do it too! So I was pretty stoked to discover that I could in fact host 2 different TFS instances on a single Data Tier machine AND that it was a supported (although completely undocumented) scenario.

Rad huh? When you dig into the SQL Server instance it can become a confusing mess of config and collection databases to manage, but it can also be a useful thing to know for upgrade and testing scenarios where you simply cannot get additional hardware for the DT.  Now yes, this absolutely can make things tricky for the DBA too if you are not using the TFS Backup and Restore Tools for backing up data. I certainly recommend using the built-in TFS Backup tools if it is an option. But that is a discussion for another day… and another blog post.

I will happily accept dark chocolate in tribute Smile

Tags:

ALM | Agile | SDLC | Power Tools | TFS 2010 | TFS 2012 | TFS Administration | TFS Power Tools | Team Foundation Server


Why Isn’t TFSService In My Service Account Dropdown List?

by Angela 5. November 2012 09:45

Ever been migrating a TFS 2010 server, gotten to the step in the Application-Tier Only wizard where you have to specify a service account, and POOF, your TFSService account did NOT appear as a possible option? Ruh-roh!  This is a known issue in TFS 2010; thankfully you won’t encounter it in 2012. If it happens to you, hopefully this also works for your implementation!

[screenshot]

Now you certainly don’t want to be specifying a user account for this, but what on earth is a TFS admin to do? I got into this situation and fear not, there is NOTHING documented on-line to help you ::maniacal laughter:: Maniacal mostly because I beat my head on my desk for at least half a day trying to figure this out.  Nothing I could find on MSDN, the MSDN forums or any other searchable resource shed any light on the issue. I found the solution by calling in a favor with a couple of folks I know on the TFS product team.  I might seriously send them a cookie basket for being so awesome.  Seemed silly not to share my good fortune because this is a DOOZY if you ever run into it yourself.

Turns out, the values that go into this dropdown get collected by polling all of the TFS-related SQL databases (configuration, warehouse, collections) referred to by the configuration file selected in the previous step. Obviously you need to select an account that can access all of those databases.  The account should a) not be dbo, b) not be in the db_owner role, and c) be a valid database user in the TFSADMINROLE and TFSEXECROLE roles. In my case, some folks had been having issues creating new Team Project Collections (because their TFS Admin accounts did not have proper permissions on the Data Tier), and so they logged into the AT as TFSService to create the collections ::head explodes::  Doing that makes TFSService the dbo/db_owner, and therefore pulls its name out of the running as a candidate service account going forward.

So how do you fix it? a) Make sure your TFS Admins have the appropriate rights on all of the servers they need to get their jobs done going forward, and DO NOT take no for an answer.  Trust me, it’s brutal otherwise; b) take TFSService OUT of the administrators group on the local server so no one can log in as that user in the first place; c) go fix the TFSService account in the TFS-related databases in SQL Server. This may seem scary, but I don’t know of another way.  Ask your DBA if you need to, it’s possibly their fault you got in this situation anyway Winking smile 

So what do you need to do in SSMS to fix it? (If you prefer a script, a rough sketch of the same steps follows the screenshots.)

1) Iterate through all of the TFS databases and change the Owner to something OTHER than TFSService; this will also reset the login associated with the dbo user. Keep in mind that if TFSService is already listed under Users for that database, it will need to be deleted from there first.

[screenshot]

2) Add TFSService back as a database user (Database | Security | Users –> New User…)

3) Assign it the following roles: TFSADMINROLE and TFSEXECROLE.

[screenshot]
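If clicking through SSMS for every database isn’t your thing, the same three steps can be scripted. This is only a rough sketch: the server, database, and account names are placeholders, you would repeat it for the configuration, warehouse, and each collection database, and you should clear it with your DBA before running anything like it against a real data tier.

    REM Step 1: change the database owner to something OTHER than TFSService (placeholder names throughout).
    sqlcmd -S DataTierServer -E -d Tfs_DefaultCollection -Q "ALTER AUTHORIZATION ON DATABASE::Tfs_DefaultCollection TO [CONTOSO\TfsDbOwner]"

    REM Step 2: add TFSService back as a plain database user.
    sqlcmd -S DataTierServer -E -d Tfs_DefaultCollection -Q "CREATE USER [CONTOSO\TFSService] FOR LOGIN [CONTOSO\TFSService]"

    REM Step 3: put the account in the two roles the wizard is looking for.
    sqlcmd -S DataTierServer -E -d Tfs_DefaultCollection -Q "EXEC sp_addrolemember 'TFSEXECROLE', 'CONTOSO\TFSService'; EXEC sp_addrolemember 'TFSADMINROLE', 'CONTOSO\TFSService'"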

 

And after you’ve given yourself carpal tunnel with the billion mouse clicks necessary to do this, you can restart the Application Tier Only wizard and you will find that now TFSService appears in your list. HUZZAH! ::throws confetti::

[screenshot]

Now ideally you will never get into this situation in the first place, but if you do, it’s not really documented other than this blog post – at least not that I know of. BIG THANKS to Brian MacFarlane and Ed Holloway on the TFS Product Team for helping me noodle through this issue.

Tags:

ALM | Application Lifecycle Management | MSDN | TFS | TFS 2010 | TFS 2012 | TFS Administration | Visual Studio


So I ran into this issue today while creating a TFS 2010 Backup Plan

by Angela 31. October 2012 13:30

So as you would expect, I as a consultant do not have god-like access to things in production like I do in the dev and test environments.  So occasionally I get tripped up on access rights, and when it comes to TFS, well, they could do a much better job of listing out all the places where you do and do not need Admin rights, sysadmin rights, farm admin rights… Well, it’s all out there between the Ranger Guidance, best practices documents, install docs and MSDN documentation but you have to do a LOT of cross referencing to get it all.  And sure, ideally anyone who is a TFS admin would be able to just ask nice and smile and get all those rights, but this is the real world and many large companies are PARANOID about handing out access like that to production.  I had to fight to get the minimal rights documented in the TFS guidance, let alone anything extra.

While upgrading TFS 2010 to 2012 at this current client, I am stopped dead in my tracks at least a few times a week, sometimes a few times a day, by “Access Denied”. My most recent one was extra tricky because it involved a Power Tool and, as you know, those are often not documented very well. So, on to my story…  I was setting up a Backup Plan on TFS 2010 using the nifty Power Tools feature (see screen below) from the Admin console.  I log in to the TFS application tier with my account, a TFS Admin user.  I assumed my account had sysadmin rights on SQL because I am a TFS Admin, and when it came time to provide the account to run the backup plan under, I provided the TFSService account, which I know has Administrator and sysadmin rights on the data tier server:

[screenshot]

So between those two accounts I would think everything would be OK. I don’t know for sure, but if the Backup Plan is running as the TFSService account the way it is set up here, well, that account is king of the world so everything should “just work”. And yet:

[screenshot]

So to hopefully make this something that comes up when someone else does a search on this message, here is what I saw - “Error    [ Backup Plan Verifications ] The current username failed to retrieve MSSQL Server service account. Please make sure you have permissions to retrieve this information.” 

WTH?! And when I opened up the error log the first error I encountered was:

TFS upgrade xp_regread() returned error 5, 'Access is denied.' xp_regread() returned error 5, 'Access is denied.' 

Again, WTH?!

So the DBA goes off and starts researching what xp_regread() does, and tried to figure out why this isn’t an issue in our dev and test environments given that everything was setup the same, and I start digging through forums.  Finally I find one sad and lonely little post on the MSDN forums related to the issue that recommends 1) logging in as a TFS Admin user (OK, I’m with you) and 2) “ensure that the user who perform this Backup Plan have required permission in SQL Server”.  Wait, what?  Be more specific please. What *ARE* the required permissions??  This happens all the time. Don’t tell me to “make sure you have appropriate permissions” without clarifying what those are. Otherwise, well, duh! I *think* I have the right permissions but clearly I am mistaken.

I dig through the Ranger Guidance, which as far as I can tell is the only place this tool is documented.  It doesn’t say the person CREATING the backup plan has to be an admin on SQL, and it IMPLIES the account specified to run the job has to be an ADMINISTRATOR, but only because the example specifies an Administrator account. Here, right from the guidance:

[screenshot: excerpt from the guidance]

But even that doesn’t necessarily imply a SQL admin, and nowhere in the doc does it say what rights either account (logged-in user or “Account”) should have. I just went back and read it AGAIN; it does not say anything with regard to the rights of either of those users in the Guidance. I suppose if you knew what it was doing behind the scenes you could infer the rights needed from the MSDN docs (I found this later). I made an educated guess that, because in dev and test I am a server Administrator on the DT and the Backup worked just fine there, me being a SQL Server Admin must be a requirement.  So I logged back into my production TFS AT with another account that I knew was admin on every server in the TFS implementation (I know, I know), and the backup plan was created just fine.
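If your DBA agrees that whoever creates backup plans needs it, the grant itself is a one-liner. A sketch with placeholder server and account names:

    REM Grants the TFS admin's domain account the sysadmin server role on the data tier (placeholder names).
    sqlcmd -S DataTierServer -E -Q "EXEC sp_addsrvrolemember 'CONTOSO\TfsAdmin', 'sysadmin'"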

Our DBA does NOT like making TFS admin accounts SQL Administrators, but if I can show him explicit rules that say YOU CANNOT DO YOUR JOB AS A TFS ADMIN WITHOUT IT, he will do it.  So please Microsoft, don’t make it so darn difficult to divine what rights all of the accounts need for the various tasks the user will do. Particularly the Power Tools which make people nervous anyway.

Tags:

ALM | Application Lifecycle Management | MSDN | Team Foundation Server | TFS | TFS 2010 | TFS 2012 | TFS Administration | TFS Power Tools | TFS Rangers


So You Were Forced to Use the dreaded TFS Collection /Recover Command, Now What?

by Angela 11. October 2012 08:23

Since we have used Recover on a production database and lived to tell the tale, I thought I would share our experiences. If you read this post you will know that one of my clients got themselves into a world of hurt where we needed to restore a nightly backup that was not detached.  I know, I know, detached backups are the way to go.  Well, now THEY know that too Winking smile  Nonetheless, sometimes you may find yourself needing to recover a TFS Team Project Collection (TPC) database, and if you’ve read the MSDN documentation you’ll know this is not an ideal situation. The Recover command is very lossy, BUT you get your data back. And in our case it was worth the risk.

So here is the backstory…  Someone deleted a Test Plan with a month’s worth of data in it, and if you know MTM you know there is no “undelete”. Restoring a backup was our only hope. BUT our nightly backups are SQL backups of the entire SQL Server instance, so undetached (we are addressing this NOW). Plucking one TPC out of there and attaching it is just not an option. And we did not have hardware to restore the entire thing and detach it properly.  So here is what we did:

  1. Restored the backed-up TPC from the nightly backup into our dev TFS environment
  2. Ran the TFSConfig Recover command to get the TPC back into a proper, attachable state, then TFSConfig Attach to get it attached in dev (see the command sketch just after this list)
  3. Detached the hosed TPC from production
  4. Restored that detached version of the TPC to production
  5. Attached the backup to production (we actually hit an interesting bug in TFS 2010 at this point, so the attach was quite harrowing and involved an emergency hotfix to our TFS sprocs; I may blog about that later.)
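For reference, here is roughly what step 2 looked like from the command line. The server, database, and collection names are made up, and you should double-check the switches against TFSConfig’s built-in help for your exact TFS version before running anything:

    REM Run from the Tools folder of the TFS 2010 install on the dev application tier (placeholder names).
    REM Make the restored collection database consistent and attachable again - this is the lossy part.
    TFSConfig Recover /ConfigurationDB:DevDataTier;Tfs_Configuration /CollectionName:RestoredCollection /CollectionDB:DevDataTier;Tfs_RestoredCollection

    REM Attach the recovered collection to the dev instance.
    TFSConfig Collection /attach /collectionDB:"DevDataTier;Tfs_RestoredCollection"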

Now, I would love to say everything was perfect but the recover command did blow away some things that we had to get back into place before people could use the TPC again.  What we lost:

  1. All of the security settings, ever!
    • Collection level groups and permissions
    • Team Project (TP) level groups and permissions in every TP in the TPC
    • Permissions around Areas and Iterations in every TP in the TPC
    • Permissions around Source Control in every TP in the TPC
  2. SharePoint settings (in every TP in the TPC). Settings on the SharePoint server itself will be fine, of course, but you will probably see a “TF262600: This SharePoint site was created using a site definition…” error when you try to open the portal site that was once attached to those TPs. You will need to fix this in 2 places.
    • Go to TFS Admin Console, select the TPC you just restored and make sure the SharePoint Site settings for the TPC are correct. It will probably be set to “not configured” now.
    • Open team explorer (as an Admin user), and for each TP go to “Team Project Settings | Portal Settings” and verify everything there is correct. Ours were just plain gone so we had to enable the team project portal and reconfigure the URL.
  3. SSRS Settings – this will probably be fine if you restored the database as-is but we also renamed it as part of the restore, and so had to update the Default Folder Location through the Admin Console for the TPC in order for this to work again.

So word to the wise, make sure you understand what the settings above are for all of the TPs in your TPC BEFORE you perform a Recover command because chances are you will have to manually set them all back up.

Tags:

ALM | Application Lifecycle Management | MSDN | MTM | Microsoft Test Manager | Microsoft Test Professional | TFS | TFS 2010 | Team Foundation Server | VS 2010 | Visual Studio | TFS Administration


So you accidentally deleted your MTM Test Plan, Now What?

by Angela 10. October 2012 04:14

So this week, we had a little bit of fun, by which I mean a day that started with panic and scrambling when someone accidentally deleted a Test Plan (yes, a whole test plan) in MTM in production. A well established test plan with dozens of test suites and over a hundred test cases with a month’s worth of result data no less... Some important things of note:

  • test plans are not work items; they are just a “shell” and so are a bit easier to delete than they should be (in my opinion)
  • there is no super-secret, command-line-only undelete like there is for some artifacts in TFS, so recreating from scratch or TPC recovery are your only options to get it back
  • when you delete a test plan, you lose every test suite you had created.  Thankfully, not the test cases themselves; those are safe in this situation.  Worst case, the plan can be recreated, although it is tedious and can be time consuming.
  • when you delete a test plan, test results associated with that test plan will be deleted. Let that sink in – ALL OF THE TEST RESULTS FOR THAT TEST PLAN, EVER, WILL ALSO BE DELETED.  ::this is why there were flailing arms and sweaty brows when it happened::

So at this point, you may be thinking it’s time to update your resume and change your phone number, but hold up. You may have some options to recover that data, so buy some donuts for your TFS admin (I like cinnamon sugar, BTW). I should mention, there may be a lot of other options, but these are the three I was weighing, and due to some things beyond my control we had to go with #2.

1) Best Case Scenario: restore your DETACHED (this is required) team project collection database from a backup, cause you’re totally taking nightly backups and using the TFS Power Tool right? You lose a little data depending on how old that backup is, but it may be more important to get back your test runs than have to redo a few hours of work.

2) Second Best Case Scenario: If you cannot lose other data, and are willing to sacrifice some test run data, then restore the TFS instance from a traditional SQL backup to a separate TFS instance (so, NOT your production instance), open up your test plan in that secondary environment, and recreate your test plan in production.  Not ideal, but if you didn’t have a ton of test runs this may be faster and you don’t sacrifice anything in SCM or WIT that was changed since the backup was taken.

3) Worst Case Scenario: if your backups were not detached when you did your last backup, cry a little, then use the Recover command to get them back into a usable state. The gist is to use the TFSConfig Recover command on the collection to make it “attachable” again, then attach it to your TFS instance. I have written a separate post on this because it can be complicated…

Once you are back up and running, make sure the right to manage test plans is locked down!  It might not be obvious that you can even do this, or where to find it, since it is an “Areas and Iterations” level permission. But do it, do it now!  This permission controls the ability to create and delete Test Plans, so be aware of that. For the most part, the only people creating Test Plans should be those with the authority and knowledge to be trusted deleting them, considering what they contain.  If everyone needs the ability to create/delete these willy-nilly, then you are doing it wrong, in my opinion anyway.

I am still in the midst of getting this back up and running so will update once we’re finished. There is an MSDN forum post out there regarding one bug I seem to have uncovered, if anyone wants to look at it and maybe fix my world by answering it Smile I am sure I’ll be able to add some more tips and tricks by then.


An interesting Quest (pun intended)…into Agile testing!

by Angela 9. May 2012 08:57

So there is a fantastic little conference gaining steam in the Midwest called Quest, which is all about Quality Engineered Software.  If you’ve never heard of it, you should seriously check it out next year regardless of your role.  As I have always said, quality is NOT the sole responsibility of the testers, and this conference has something for everyone.  I was fortunate enough to be introduced to the local QAI chair who runs the conference during its first year (2008), which, lucky for me, also happened to be in my back yard.  I was with Microsoft at the time, and we had opted in as the biggest conference sponsor, cause let’s be real - who on earth in QA ever thought “Yeah, Microsoft has some awesome testing tools”.  ::crickets::  Right.

At the time VSTS (remember THAT brand? Smile with tongue out) was still new-ish, and the testing tools were focused almost entirely on automated testing. Yeah, I know, TECHNICALLY there was that one manual test type but let’s not even go there.  I know a few, like literally 3, customers used the .MHT files to manage manual tests in TFS, but it wasn’t enough. The automated tools were pretty awesome, but what we found was that MOST customers were NOT doing a lot of automation yet. Most everyone was still primarily doing manual testing, and with Word and Excel, maybe SharePoint. We had a great time at Quest talking to testers and learning about what they REALLY need to be happy and productive, we got the word out on VSTS and TFS, and started planning for the next year.  I was able to be part of Quest as a Microsoftie in early 2009 as well, when the 2010 tools (and a REAL manual test tool) were just starting to take shape, and then the conference spent a couple of years in other cities.  Fast-forward to 2012 when Quest returned once again to Chicago.

I was no longer a Microsoftie, but if you’ve ever met me you know that working a booth and talking to as many people as possible about something I am passionate about is something I rock at, and enjoy! So I attended Quest 2012 again this year, this time as a guest of Microsoft.  I worked the Microsoft booth doing demos and answering questions about both the 2010 tools and the next generation of tools, and WOW did we get some great responses to them.  Particularly the exploratory testing tools.  I am pretty sure the reverse engineering of test cases from ad-hoc exploratory tests, and 1-click rich bug generation that sent ALL THE DATA EVER to developers gave a few spectators the chills. I certainly got a lot of jaws dropping and comments like “THIS is a Microsoft tool?!” and “I wish I had this right now!”. It was pretty great.

I was also fortunate enough to also get to attend a few pre-conference workshops, keynotes and a session or two.  I have to say, WOW, the conference is really expanding, and I was very impressed with the quality of the speakers and breadth of content.  As a born again agilista, I was so pleasantly surprised to see an entire TRACK on Agile with some great topics.  I was able to attend “Transition to Agile Testing” and “Test Assessments: Practical Steps to Assessing the Maturity of your Organization“ and learned quite a bit in both sessions.  One disappointment, there is even more FUD out there in the QA world than what I see in the developer world when it comes to Agile, what it actually means and how it SHOULD be practiced.  I’m not about being a hard core “to the letter” Scrummer or anything, but I also am not about doing it wrong, calling it Agile, and blaming the failure on some fundamental problem with Agile.  There are lots of Agile practices that can be adopted to improve how you build, test and deliver software, without going “all in”, and that was something I kept trying to convey whenever I spoke up.

I heard “Agile is all about documenting as little as possible”, “Agile lacks discipline”, “Agile is about building software faster”, and all of the usual suspects you would expect to hear.  No, it’s about “documenting only as much as is necessary”; there is a difference!  Agile requires MORE discipline actually.  People on Agile teams don’t work faster, they just deliver value to the business SOONER than in traditional waterfall models, which sure, can be argued is “faster” in terms of time to market.  The only thing that will make me work faster would be a better laptop and typing lessons.  I still look at the keyboard, I know :: sigh::   I am seriously considering doing a session next year on Mythbusting Agile and Scrum, to help people understand both the law and the spirit of Agile practices.  Overall it was great to see that the QA community is also embracing Agile and attempting to collaborate better with the development side of the house. We just need the development side to do the same Winking smile  I also met at least a dozen certified Scrum Masters in my workshops, which was great to see! 

One of my favorite parts of the conference was of course getting to catch up and talk tech with Brian Harry.  He was the first keynote presenter of the conference, and spoke on how Microsoft “does Agile”, the failures and successes along the way, and even spent some time talking about his personal experiences as a manager learning to work in an Agile environment. I.LOVED.THIS. Yeah, I’m a bit of a Brian Harry fan-girl, but it really was a fantastic talk, and I had many people approach me in the booth later to comment on how much they enjoyed it. My favorite part was Brian admitting that at first, even HE was uncomfortable with the changes. It FELT like he was losing control of the team, but he eventually saw that he had BETTER visibility and MORE control over the process, and consequently the software teams.  It was brilliant.  So many managers FEAR Agile and Scrum for just those reasons. It’s uncomfortable letting teams self organize, trusting them to deliver value more often without constant and overwhelming oversight by project managers, and living without a 2 year detailed project plan - that in all actuality is outdated and invalid as little as a week into the project.  Wait, WHY is that scary? Sorry, couldn’t let that get by.

And so off I go again, into the software world, inspired to keep trying to get through to the Agile doubters and nay-sayers, and to help teams to adopt Agile practices and tooling to deliver better software, sooner.

Tags:

Agile | ALM | Application Lifecycle Management | TFS 2010 | SDLC | Team Foundation Server | Testing | Test Case Management | User Acceptance Testing | VS 11 Beta | VS 2010 | Visual Studio | development


Upgrading Team Projects from Agile 4.2 to Agile 5.0 on TFS 2010–Part 3, Swapping in the QoS Requirement

by Angela 28. March 2012 07:33

So if you’re reading this you are probably finishing up my 3-part story about updating a process template from Agile 4.2 to Agile 5.0 on a TFS 2010 server.  This is the last installment, where I embarrass myself further by sharing one more stumbling block that I encountered along the way.  So now we have all of our awesome tools installed, we downloaded Hakan’s script, we got our work item definitions imported and updated, and finally added our trusty old Quality of Service Requirement to the new Requirement Category in the process template.  Everything was working beautifully until I went and tried to link a QoS Requirement to a Test Case. Cue Sad Trombone again…

image

This was certainly not handled in any script, and I couldn’t find any documentation of it on MSDN, so hey, maybe this is something actually NEW in terms of guidance Smile  As soon as I saw this I knew what was happening.  I was pretty sure that somewhere there was some XML specifying what work item types were allowable in that dropdown, and my guess was QoS Requirement was not one of them.  I would have thought it was covered in the updated TestCase.xml used by Hakan’s script, or that the dropdown was driven by the “Requirement Category” from Categories.xml, which would have covered QoS Requirement since I had added it there.  I double checked, and it was not.  Here is the XML included with the script; note that only “User Story” is allowed here:

[screenshot]

I went ahead and made a little tweak so that QoS Requirements were allowable for the “Tested User Story” functionality and re-imported the TestCase work item definition using the Power Tools.  Essentially all I had to do was add my work item type to the TypeFilter node in the XML:

[screenshot]
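If you would rather skip the Power Tools GUI for the re-import, the edited definition can also be pushed back from a Visual Studio command prompt. The collection URL and project name here are made up:

    REM Import the edited Test Case definition back into the team project (placeholder collection URL and project).
    witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:TestCase.xml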

And now when I click “New” or “Link To” from a TestCase work item, I have access to my Quality of Service Requirements, HUZZAH!

[screenshot]

Now, I am sure this is intentional. I assume in most cases you really only want “User Story” type work items to be linked in this particular tab, but for our purposes this is what we are looking to do.  I was a little curious as to why Hakan’s update script did not include the User Story work item type definition…  But hey, at the end of the day I demystified some more of the “magic” going on behind the scenes in TFS.  I am currently digging in a bit more to figure out if it makes sense to add User Stories to these upgraded Team Projects as well, since there are some very different fields and metadata being collected on them.  As I mentioned before, these are mostly inactive projects I am “experimenting on” so I’d love to hear any feedback or opinions on what you have done with your own projects.

OK, one last pro tip before I go. How often do you get an error dialog from TFS or VS, and you want to Google or Bing it, but now you have to type in all of the text by hand and hope you don’t miss a letter or number?  For me, daily!  *Sometimes* you can copy and paste the text, sometimes there is even a tool or link to let you copy it, but often you are on your own. I totally ran into this by accident the other day.  OneNote has a great screen capture tool that will work in any app, even on the desktop.  Make sure you have opened the OneNote app at least once, and seriously if you haven’t you’re crazy cause it’s the only tool I use for taking and sharing notes.  Hit the Windows key+S, drag the cross hair around what you want to capture and let go. Copy the image to your clipboard and paste anywhere.  Cool huh? It gets better.  I noticed if you right-click an image, you get the option to “Copy text from picture”. I saw that and thought, “no way that works!”, and lo and behold it does.

[screenshot]

You’re welcome Smile

That’s it for now, hope you learned something in reading about my adventures in process template upgrading.

 

Part 1 – Process and Tools

Part 2 – Field Mismatches

Tags:

ALM | Agile | Application Lifecycle Management | MSDN | Power Tools | SDLC | TFS | TFS 2008 | Team Foundation Server | TFS 2010 | Test Case Management | VS 2010 | Visual Studio | Work Item Tracking


Upgrading Team Projects from Agile 4.2 to Agile 5.0 on TFS 2010– Part 2, Field Mismatches

by Angela 28. March 2012 07:05

So hopefully you’ve already scanned through this other post where I cover the overall process I used for doing my updates. It also has some great tips and tricks for making the whole job easier using a few free tools, as well as a few links to helpful blogs and MSDN resources to save your sanity! That being said, here are some of the issues I encountered during my upgrade, and how I was able to work around or fix them.  Again, if you are using Hakan’s script and just running it as-is, you might not see some of these errors.  I just figure you learn more by screwing up, and I was working with some test projects and so had the luxury of being able to try out several different strategies without affecting anything critical, so I did a lot of things by hand first.

The first stumbling block I encountered during the upgrade was a weird issue with inconsistent “friendly names” for some of the fields.  Essentially, I had some naming collisions when I tried to import some of the new artifacts like SharedStep.xml and TestCase.xml.  You might at some point encounter an error message similar to “TF26177: The field XxxXxx cannot be renamed from ‘XxxXxx’ to ‘Xxx Xxx’.” In other words, “Area ID” vs. “AreaID”, “Iteration ID” vs. “IterationID”, and a few others.

[screenshot]

The ones I was importing had field names that didn’t match EXACTLY. Now I started thinking, “But I am just re-uploading the work item type definitions that TFS was ALREADY using. They should be exactly the same right?”.  I opened up the work item type definitions (thank you TFS Power Tools) and found that indeed, some of the field names did NOT match the ones on the server. You’ll note in the screen shot below that in just a handful of cases, a blank character was missing from the field name, so the import process sees this as a rename attempt. You are looking at a new Agile 5.0 Team Project work item definition on the right, and the standard Agile 5.0 Team Project work item definition used to create that new project on the left.

[screenshot]

In essence, what I ended up having to do to rectify this was to go in and modify the work item type definitions for the appropriate work items to ensure that the field names being imported matched the field names on the server, before attempting to import them again. For me, it was an issue in both the SharedStep and TestCase work item type definitions, but it certainly didn’t take long to fix.  Once that was done, I had success! 

[screenshot]
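By the way, a quick way to see which friendly name the server already has for a field, before you start editing definition files, is witadmin’s listfields. The collection URL below is made up; System.AreaId is one of the fields whose friendly name tripped me up:

    REM Show the server-side definition (including the friendly name) of a single field.
    witadmin listfields /collection:http://tfsserver:8080/tfs/DefaultCollection /n:System.AreaId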

UPDATE: Turns out some of the fields being used were of course already defined on the server from the previous implementation of TFS 2008, and when TFS 2010 was released a few of the names had been altered slightly. After struggling with this for an hour or two and somehow not running across the documentation stating that this was a known issue, I eventually figured out the fix on my own.  Today, I was kindly pointed to a couple of places where this was documented, including a post by Gregg Borr that was pretty much written specifically to address this very situation ::facepalm::

The last thing we need to do is update the Categories.xml. Silly me tried just importing the Categories.xml from the Agile 5.0 template, which will of course NOT work because 4.2 requirements were named a bit differently than 5.0 ones.  You’ll see something starting with “TF237059: The import of the category definition failed” because the new Categories.xml will refer to “User Story” and what you have is a “Quality of Service Requirement”.  I opened up the XML provided with Hakan’s script because I wanted to verify what was happening, and what I was doing wrong, and was not shocked to see that it was essentially updating the “Requirement Category” to support the new world order of work item types. RTFM Angela, RTFM.  Here is what you will see in Hakan’s updated Categories.xml file:

[screenshot]
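If you are doing this step by hand rather than through Hakan’s batch file, the category definitions round-trip with witadmin too. Collection URL and project name are placeholders:

    REM Export the current category definitions, edit the file, then import it back.
    witadmin exportcategories /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:Categories.xml
    witadmin importcategories /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:Categories.xml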

So now my categories were imported correctly and I was feeling good, but I had to do some testing, as I was SURE I would encounter some additional problems once I dug into Visual Studio and Microsoft Test Manager and started creating work items in the new and improved Agile 5.0-ish Team Project.  It was definitely a trip seeing the co-mingling of the 2 versions of the Agile template in the Team Explorer:

[screenshot]

For the most part this all “just worked”. I created work items, linked them together, created hierarchies, opened them in Project and Excel, made changes and published. Life was good.  And then I tried to link a TestCase to a “requirement” in my new world. Wah, wah, wah, waaaaaaaah.  Check out my third post for details on how I managed to fix this.  Again, it could very well have been something I did wrong, but it was a lesson learned…

 

Part 3 – Swapping in QoS Requirement for User Story

Tags:

ALM | Application Lifecycle Management | Agile | MSDN | Power Tools | SDLC | TFS | TFS 2008 | TFS 2010 | Team Foundation Server | VS 2010 | Visual Studio | Work Item Tracking


Upgrading Team Projects from Agile 4.2 to Agile 5.0 on TFS 2010–Part 1, The Process & Tools

by Angela 28. March 2012 06:51

So, I am NOT calling myself the absolute guru of Team Foundation Server work item tracking or process template upgrades just yet, but I did learn a ton during the process and wanted to share my experience in case you can glean some wisdom from it.  Now I leveraged a LOT of content written by other VERY smart people from the TFS product teams so I try to be sure to give credit where credit is due. I experienced some bumps and bruises along the way because I was following at least 5 different sources on the internet for the upgrade and they did not all contain the exact same info.  Since the same things might trip you up, give the post a quick run-through all the way to the end.  I know it’s a lot to sort through, but you’ve probably played Mine Craft for hours at a time so I am sure you can handle this. Also, I broke it into 3 parts since there is so much to cover and it WAS getting a bit ridiculous, even for me Smile 

We (Polaris Solutions) have a TFS 2010 instance that has a lot of legacy TFS 2008/Agile 4.2 Team Projects on it. I have been working on getting all of those projects upgraded to be more in line with the newest Agile 5.0 templates so that we can take advantage of some of the great features of TFS 2010 like Test Case management, hierarchical work items, new relationship types, and more. Now my life was made easier by the fact that this particular instance was not running SSAS/SSRS or SharePoint, and the original Agile 4.2 templates were unaltered, so really I was only concerned with making basic updates to the process templates and didn’t have to deal with reporting, merging customizations, or enabling the agile planning workbooks. But I still ran into some snags…

So, first things first: dump that chump command line and get a more functional one. You’ll spend some time at the command line, so you might as well not have it be painful.  I seriously owe Scott a big box of sugar-free brownies or something for this. Rocked my world!  If you’re not a fan of the command line, download the TFS 2010 Power Tools, which will give you a lovely GUI for doing things like downloading your process templates and exporting/importing work item definitions and global types.  I actually switch between them depending on the task.  Also, the Power Tools just got updated in December of 2011, and some of the new features will bring a tear to your eye.

Next, download Hakan’s update script. Make note that you must be upgrading from a virgin Agile 4.2  process template to a 5.0 one to be able to just run it and be more or less done. If you have some customizations the script will still come in handy but you may need to tweak it, or reverse engineer your customizations back into the resulting template. Hope you documented all those changes, hehe. But don’t panic, the first thing his script does is back up your original template before changing it. Like I said, he’s a smart dude.  Here is what you will see when you grab the zip file and unpack it:

[screenshot]

It includes a batch script to execute all of the necessary backup and import operations, a new Categories definition, 2 new link types (TestedBy and SharedStep), and new (SharedStep, TestCase) and updated (Bug, Scenarios, Task) work item definitions. Even if you decide not to use his script, these files could come in handy for manual updates as well, so I suggest grabbing it nonetheless.

Now honestly, I’m not one for blindly running ANY script, even one written by someone as awesome as Hakan, so I dug through his script line by line, looked at the artifacts it used, and compared it to the documentation on MSDN for updating old process templates to leverage new functionality.  I realized that the first couple of steps could be skipped altogether since I had some TFS 2010 projects in that collection already, and so things like categories and link types already existed.  You will see in a later post that this bit me in the butt Winking smile Uploading any of the template artifacts using Hakan’s script won’t hurt anything so long as you haven’t modified the ones on the server, import = overwrite in this case.  I did a lot of it by hand at first, using my spiffy new command line tool and the power tools just so I would understand better what was happening.  Once I was comfortable with that, I updated another project using Hakan’s script. Much faster process and way fewer errors, shocking right? Hehe.  I did run into a couple of errors that the script could not handle; one that I address in my next post that has to do with some “friendly name” value changes between 2008 and 2010, and another having to do with something that may or may not be a standard practice which I address in my third post.   For now my process is to start with Hakan’s script to get the imports done quickly, then make manual tweaks to bring the template the rest of the way in line to what we need. 
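For the “by hand” pass, most of that round-tripping boils down to exporting a definition, poking at it, and importing it (or the updated version from Hakan’s package) back. Something along these lines, with a made-up collection URL and project name:

    REM Pull down an existing definition to see exactly what the server has today.
    witadmin exportwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyLegacyProject /n:Task /f:Task-current.xml

    REM Import a new or updated definition, e.g. the Test Case type from the update package.
    witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyLegacyProject /f:TestCase.xml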

Now if, unlike me, you also have SSRS/SSAS and SharePoint installed and need to turn on some other features, check out Aaron’s post and Allen Clark’s post on how to finish up the upgrade. Like I said, mine was a little less involved, but those articles are widely held as THE ones to use as a guide if you find yourself in this situation as well. Hopefully I sprinkled in some tips and tricks to help you out too Smile

One last thing you absolutely should have any time you need to work with process templates is a good compare tool.  I ran across 2 tools for visualizing process template artifacts that saved my butt!  The first is the “Team Project Manager Tool” on CodePlex. It gives you a quick and easy way to look at the XML in your template and do quick comparisons, but it also includes tools to help you visualize/manage build templates, security, and even source control. I love this tool SO MUCH!

[screenshot]

And while I did not need it as much in this instance, the TFS Rangers Integration Platform Mapping Tool is a neat tool and came in pretty handy too. When I wanted to quickly compare two WIT definitions and see how the fields mapped and where there were differences, it had some great visualization capabilities:

[screenshot]

Well that’s it for my process and tools bit.  I have 2 more posts that cover issues I ran into with field mismatches, and making changes to allow for QoS requirement work items to take the place of User Story work items. They will be posted shortly!

 

Part 2 – Field Mismatches

Part 3 – Swapping in QoS Requirement for User Story

Tags:

Visual Studio | Team Foundation Server | TFS | MSDN | Application Lifecycle Management | Agile | ALM | Work Item Tracking | Requirements Management | Test Case Management | SDLC | Power Tools | TFS 2010 | TFS 2008 | VS 2010
