Monday, August 16, 2010
The 3rd solution in this article did it for me: deleting the AssociationSet tags and removing the relationship lines from the canvas...
http://adventuresinsoftware.com/blog/?p=306
Thursday, February 11, 2010
Debugging WSS 3.0/SharePoint 2007 workflows in a 64-bit environment
You may have found that you are unable to attach to a process to debug a workflow project in a 64-bit environment, getting an access denied error. I hated having to switch environments from x64 to x86 just to debug.
Well, I stumbled upon a way to do it on x64, and it is pretty easy. I use VS2008 on Windows Server 2008 x64.
I don't know why, but the basic point is that you can't attach to a process with the Workflow code type in x64. If a workflow has already run in the current worker process, the Workflow code type will be pre-selected when you attach, giving the access denied error, so debugging won't work. But if you attach to the w3wp process BEFORE that happens (after restarting IIS and before running any workflows), the debugger will work once you run the workflow.
So basically,
- Restart IIS
- Warm up a page, but DON'T run a workflow
- Attach to the w3wp process(es) (a quick helper for finding their PIDs is sketched after this list)
- Run the workflow, and your breakpoints should be hit.
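Since several w3wp.exe worker processes can be running, it helps to know which PIDs to pick in Visual Studio's Attach to Process dialog. A minimal console helper (my own sketch, not part of SharePoint or the original post) would be:

    using System;
    using System.Diagnostics;

    // Prints the PID of every running w3wp.exe worker process so you can
    // pick the right one(s) in Visual Studio's Attach to Process dialog.
    class ListW3wpProcesses
    {
        static void Main()
        {
            foreach (Process p in Process.GetProcessesByName("w3wp"))
            {
                Console.WriteLine("w3wp PID: {0}", p.Id);
            }
        }
    }

Run it after the IIS restart and page warm-up; if nothing shows up, the app pool hasn't spun up yet.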
Labels: 64-bit, Debugging, sharepoint 2007, visual studio, workflows, WSS 3.0, x64
Friday, October 30, 2009
Mashing up data from many, many site collections in SharePoint
One large architectural hurdle our team faced recently was displaying real-time data from across thousands of site collections. A normal query across that many site collections (or even sites) is unacceptably slow (it can easily take over 30 seconds).
This is useful in many scenarios, such as when data is needed for dynamic and summary displays on a user's homepage (dashboard).
Part of the solution for us was to query the search index for the data. The data had already been 'queried' and indexed as part of the search engine's regular crawl operations, so it was there and available via the object model or a web service. This won't, however, give you a real-time view of the data, only the data as of the last crawl (which could have been 5 minutes ago, or much longer).
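Querying the index through the object model (MOSS 2007's FullTextSqlQuery, which needs MOSS rather than plain WSS) looks roughly like this. The site URL, scope, and managed properties below are assumptions for illustration:

    using System;
    using System.Data;
    using Microsoft.SharePoint;
    using Microsoft.Office.Server.Search.Query;

    class SearchIndexQuery
    {
        static void Main()
        {
            // Assumed site URL; "Write" is the last-modified managed property.
            using (SPSite site = new SPSite("http://portal"))
            {
                FullTextSqlQuery query = new FullTextSqlQuery(site);
                query.QueryText =
                    "SELECT Title, Path, Write FROM Scope() " +
                    "WHERE \"scope\" = 'All Sites' AND ContentType = 'Task'";
                query.ResultTypes = ResultType.RelevantResults;
                query.RowLimit = 500;

                ResultTableCollection results = query.Execute();
                ResultTable relevant = results[ResultType.RelevantResults];

                // ResultTable implements IDataReader, so it loads straight
                // into a DataTable.
                DataTable table = new DataTable();
                table.Load(relevant, LoadOption.OverwriteChanges);
                Console.WriteLine("{0} rows from the search index", table.Rows.Count);
            }
        }
    }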
So the second part of our solution involved creating a cache of the lists from across the site collections. Each list got an event receiver that updated the cache whenever an item was added, changed, or deleted.
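A sketch of the event receiver side (the cache site URL, list name, and SourceUrl column are made-up names, not from the actual implementation):

    using System;
    using Microsoft.SharePoint;

    // Mirrors list items into a master cache list on a top-level site
    // collection whenever an item is added or updated.
    public class CacheSyncReceiver : SPItemEventReceiver
    {
        private const string CacheSiteUrl = "http://portal";  // assumed top-level site
        private const string CacheListName = "Master Cache";  // assumed cache list

        public override void ItemAdded(SPItemEventProperties properties)
        {
            UpdateCache(properties);
        }

        public override void ItemUpdated(SPItemEventProperties properties)
        {
            UpdateCache(properties);
        }

        // ItemDeleted would remove the matching cache row; omitted for brevity.

        private void UpdateCache(SPItemEventProperties properties)
        {
            using (SPSite cacheSite = new SPSite(CacheSiteUrl))
            using (SPWeb cacheWeb = cacheSite.OpenWeb())
            {
                SPList cacheList = cacheWeb.Lists[CacheListName];

                // Simplified: always adds a row. A real implementation would
                // look up and update the existing cache row on ItemUpdated.
                SPListItem cacheItem = cacheList.Items.Add();
                cacheItem["Title"] = properties.ListItem.Title;
                // Record the source so updates/deletes can find the cached copy.
                cacheItem["SourceUrl"] = properties.WebUrl + "/" + properties.ListItem.Url;
                cacheItem.Update();
            }
        }
    }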
There was a master cache list on a top-level site collection; now we can query both that list and the search index (taking whichever result is newer) and get real-time mashups of data from across thousands of site collections.
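The "take whichever is newer" part is just a per-item timestamp comparison. Something like this (the "Modified" column name is an assumption):

    using System;
    using System.Data;

    // Given the cached copy and the search-index copy of the same item,
    // prefer whichever was modified more recently.
    static class ResultMerger
    {
        public static DataRow PickNewer(DataRow cacheRow, DataRow searchRow)
        {
            if (cacheRow == null) return searchRow;
            if (searchRow == null) return cacheRow;

            DateTime cacheModified = (DateTime)cacheRow["Modified"];
            DateTime searchModified = (DateTime)searchRow["Modified"];
            return cacheModified >= searchModified ? cacheRow : searchRow;
        }
    }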