EMR and Device Integration

June 23, 2007

Biomedical/Bedside/ICU Device Integration

In the words of the guru Tim Gee of Medical Connectivity Consulting: “Medical device integration is a critical (and an often overlooked) part of EMR planning. To be successful, any plan must take into account many more considerations beyond getting an HL7 feed into the EMR. Multiple stakeholders including nursing and clinical/biomedical engineering must be engaged. Putting together a successful long term plan requires negotiations across traditional hospital silos, and an in depth understanding of point-of-care workflows, medical device connectivity and device vendor offerings and product strategies.”

The benefits of automatic data collection (heart rate, invasive/non-invasive blood pressure, respiration rate, oxygen saturation, blood glucose, etc.) from acute-care monitoring devices have become so obvious that hospitals now require that their clinical information system (CIS), anesthesia information management system (AIMS), electronic medical record (EMR), electronic patient record (EPR), or other hospital/healthcare information system (HIS) provide interfacing capabilities to biomedical devices. This ensures that key vital signs are stored in the centralized data repository (CDR) and can be used to track patient progress over time.

Patient monitoring systems are among the first to be integrated, because every HIS requires at least basic patient vital-sign collection. Integration with anesthesia devices is a must for any AIMS. Data collection from ventilation systems is required in most cases for ICU systems. Infusion device data integration is increasingly requested where CPOE systems are implemented.

But connecting to bedside medical devices and collecting data in your CIS or EPR is not as simple as it may seem. Device interface development is a specialized task that consumes resources and diverts attention away from core competencies. Competitive issues make obtaining device protocols difficult and sometimes impossible. Incomplete connectivity results in frustration and decreased efficiency of the hospital.

The key questions to consider when integrating devices with an HIS are outlined below:

Categories of Medical Devices for Integration:

Vital Signs or Diagnostics devices
Infusion Pumps
Dialysis devices
Anesthesia machines
EKG and EEG devices
Endoscopy devices
Glucometers
Urimeters
Bedside devices
Oximeters with Patient Monitoring and Alarm Systems
Ventilators
Ultrasound devices
Stress testing devices

Type of Device Connectivity to the HIS

Wireless/Mobile
Fixed

Format of Message feed from Device(s) to the HIS

HL7 format result messages with possible images, etc., across TCP/IP (see the sketch after these lists)
Proprietary format messages across TCP/IP
Binary format data across USB or others

Format of Message feed to Device(s) from the HIS

HL7 format ADT messages across TCP/IP
Proprietary format messages across TCP/IP
Binary format data across USB or others
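
To make the HL7 result-message option concrete, here is a minimal C# sketch that hand-builds a pipe-delimited HL7 v2.3 ORU^R01 message carrying two vital-sign observations. The sending/receiving application names, patient identifiers and reference ranges are illustrative assumptions; the LOINC codes are the standard ones for heart rate and SpO2, but a production interface would use a tested HL7 library and the segment profile agreed with the device and HIS vendors.

using System;
using System.Text;

class Hl7VitalsFeed
{
    static string BuildOru(string patientId, string patientName, int heartRate, int spo2)
    {
        string ts = DateTime.Now.ToString("yyyyMMddHHmmss");
        StringBuilder sb = new StringBuilder();
        // MSH: message header (sending app/facility, receiving app/facility, timestamp, type)
        sb.Append("MSH|^~\\&|DEVICE_GW|ICU|EMR|HOSP|" + ts + "||ORU^R01|MSG0001|P|2.3\r");
        // PID: patient identification
        sb.Append("PID|1||" + patientId + "||" + patientName + "\r");
        // OBR: the observation set these results belong to
        sb.Append("OBR|1|||VITALS^Vital Signs|||" + ts + "\r");
        // OBX: one segment per observed parameter, coded with LOINC
        sb.Append("OBX|1|NM|8867-4^Heart rate^LN||" + heartRate + "|/min|60-100||||F\r");
        sb.Append("OBX|2|NM|59408-5^Oxygen saturation (SpO2)^LN||" + spo2 + "|%|94-100||||F\r");
        return sb.ToString();
    }

    static void Main()
    {
        // Illustrative patient and readings only.
        Console.WriteLine(BuildOru("123456", "DOE^JOHN", 72, 98));
    }
}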

Frequency and Location of Device Data Feed to the HIS

Continuous (Periodic) Real-time – one message per minute or faster
Manual (Aperiodic) or on-demand
Server-based – with storage for real-time data and polling-frequency options (see the polling sketch below)
Location: ICU or PACU
Time synchronization among all the connected systems is important
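
As a sketch of the server-based, periodic option above, here is a minimal C# polling loop that forwards a reading once a minute. ReadHeartRate() and the one-minute interval are illustrative placeholders; a real gateway would speak the device vendor's protocol and make the polling frequency configurable.

using System;
using System.Threading;

class DevicePoller
{
    // Stub standing in for the vendor-specific device protocol.
    static int ReadHeartRate() { return 72; }

    static void Main()
    {
        // Poll once a minute, matching the roughly one-message-per-minute feed above.
        Timer timer = new Timer(delegate
        {
            int hr = ReadHeartRate();
            Console.WriteLine(DateTime.Now.ToString("o") + " HR=" + hr + " -> queued for HIS");
        }, null, TimeSpan.Zero, TimeSpan.FromMinutes(1));

        Console.ReadLine();   // keep the poller alive until Enter is pressed
        timer.Dispose();
    }
}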

Grouping of Device Data in the HIS is based on:

Patient Chart sections
Department Needs and Security Roles
Common Device Parameters
Dependent Device Parameters
Device Monitoring and Asset Tracking
Display and Storage of the data – claims, clinical encounters, drug/pharmacy, lab, images – captured and mapped to a common format, possibly ASTM’s Continuity of Care Record (CCR).

Security Issues:

Caregivers need access to review and validate device data before it is posted to the patient chart
Audit trail and enterprise timestamps on device data (see the sketch after this list)
High speed secure network with firewalls to protect ePHI
FDA guidelines compliance
HIPAA guidelines compliance
JCAHO guidelines compliance
Legal guidelines compliance
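
As a sketch of the audit-trail item above, here is a minimal C# record of who validated which device reading and when, stamped with a synchronized enterprise clock. The field names and values are illustrative assumptions only; a real system would persist entries to a tamper-evident store.

using System;

class DeviceAuditEntry
{
    public DateTime TimestampUtc;  // enterprise timestamp (synchronized, UTC)
    public string DeviceId;        // device/asset identifier
    public string PatientId;       // medical record number
    public string Parameter;       // e.g. "HeartRate"
    public string Value;           // the value as charted
    public string ValidatedBy;     // caregiver who accepted it onto the chart

    public override string ToString()
    {
        return TimestampUtc.ToString("o") + " " + DeviceId + " " + PatientId +
               " " + Parameter + "=" + Value + " validated by " + ValidatedBy;
    }
}

class AuditDemo
{
    static void Main()
    {
        DeviceAuditEntry entry = new DeviceAuditEntry();
        entry.TimestampUtc = DateTime.UtcNow;
        entry.DeviceId = "MON-0042";
        entry.PatientId = "123456";
        entry.Parameter = "HeartRate";
        entry.Value = "72";
        entry.ValidatedBy = "rn.jsmith";
        // In practice this would be appended to a tamper-evident audit store.
        Console.WriteLine(entry);
    }
}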

Examples:

Vital-signs mobile devices feed patient data to the EMR, and a senior RN can review results before they are attached to patients’ charts.
Infusion pump drug/fluid delivery is tracked in the EMR for long-term critical care.
Enabling medical devices, such as infusion pumps, ECG machines and glucometers, to wirelessly send data from the ICU to a patient’s medical record or to a physician
Home care chronic disease monitoring systems that provide patient feedback, patient monitoring and alerts (to both patients and physicians) to the EMR.

Software for Device Integration with the HIS:

Capsule Technologie’s DataCaptor is a generic, third-party software-plus-hardware suite that provides one of the most complete biomedical device connectivity solutions on the market. DataCaptor has a large library of supported devices – more than 250 diverse bedside devices – along with advanced features and easy integration with hospital information systems.

Stinger Medical’s Integriti provides a secure and mobile method of transmitting patient vital signs wirelessly to the EMR.
Current Capsule Technologie DataCaptor OEM partners include (among other HIS vendors of all sizes):

Epic Systems (EpicCare),
Philips Medical Systems (CareVue Chart/IntelliVue through DeviceLink),
Eclipsys Corporation (Sunrise Clinical Manager) and
Surgical Information Systems (anesthesia software and surgical system).

Benefits of Device Integration:

As in many hospitals, the reason for integrating devices is to automate the flow of data and interface it to the HIS application:
• To reduce transcription/documentation errors. Currently, nurses manually transcribe data from scratch pads or from the devices onto the patient report, resulting in problems like indecipherable handwriting, data in the wrong chart, vital signs written on scraps of paper (hands, scrub suits, etc.) that get forgotten, and sometimes a considerable lag between readings and reporting.
• To decrease documentation time. Significant increases in productivity can be gained by an interface that allows the nurse to validate rather than enter the data.
• To support quality data collection (charts, images, vitals) and to provide increased surveillance for critical patients – even when the care-provider is not present at the bedside. This allows for safe collection of data over time, thus providing a more accurate and valid history of patient progress.
• To increase patient safety. Safety is enhanced by decreasing data entry errors, and by allowing the nurse to review data collected when he/she is not present at the bedside. In addition the data can be captured at an increased frequency creating a more accurate depiction of the patient’s condition.
• To enable research and quality control. Data can be collected for future analysis by de-identifying patient demographics.
• To provide better patient care and more physician–patient contact time. A silent factor in a hospital’s revenue is quality of patient care. One of the chief drivers of quality of patient care is the quality of information provided efficiently to the physicians, through which they can make those critical decisions.
• To securely and quickly share assessment, diagnosis, treatment and patient progress data across facilities/RHIO (regions)/states thereby enabling the patient to be provided the best care anywhere.
• To reduce patient, physician and nurse stress and legal issues.
• To provide complete and comprehensive data on patient charts.
• To enable future devices to seamlessly connect to the existing EMR.
• To prevent errors in diagnosis, prescription and medication, by basing decisions on the entire patient history/allergies, the latest medications and the latest technology that are available to the patient and the care provider.
• Clinical (or Diagnostic) Decision Support Systems [CDSS] and best-practice systems are more effective with comprehensive and secure digital files (historical patient charts).
• To increase security and prevent tampering of Patient Records – since all data is digital and secured via layers of Role based security, by HIPAA and by Digital laws – the security is much more comprehensive than a system with voluminous paper records and difficult audit trails.
• Finally, to improve overall hospital throughput, patient visit times and success ratios.

I’ve linked the Capsule Technologie-DataCaptor architecture diagrams below to show the data flow between DataCaptor (the server), Concentrator (the ‘router’ or Terminal box), the bedside devices and the HIS and other systems.

http://capsuletech.com/images/stories/products/ConnectDC_470
http://capsuletech.com/images/stories/products/DC_Overview_520.jpg

Note: This article is based on personal experiences and public information gathered from websites including Medical Connectivity Consulting, Capsule Technologie and other medical device manufacturers’ websites. Thanks to these companies for this public information; this document is intended solely for personal reading and understanding of this technology and is not for any commercial gain.

Since PACS is a type of “Device Integrator”, the following is an addition to the above article:


Radiology RIS, PACS and the EMR Integration

The PACS – Picture Archiving and Communication System – is a filmless method of communicating and storing X-rays, CT/MRI/NM scans and other radiographs, which are verified by radiologists after being acquired by the X-ray, CT/MRI/NM machines and other variants used in the Radiology Department. Images may be acquired from a patient in slices, and with 3D or 4D image reconstruction the patient’s entire full-body scan may be visualized on diagnostic-quality workstations. Key images, radiology reports and low-resolution non-diagnostic images are provided for viewing on any screen, securely across the internet. If bandwidth permits, in certain cases entire diagnostic-quality images may be viewable, securely across the internet.

The RIS – Radiology Information System – enables “Radiology” patient scheduling, reporting/dictation, and image tracking to ensure that the PACS and the Radiology machines are effectively utilized and the patients’ structured reports are immediately available.

The EMR – Electronic Medical Records System or Hospital Information System – provides a “global” view or historical folder of the patient’s visits or encounters with his/her care providers. From a “Radiology” perspective, the EMR sends ADT/orders to a RIS and receives results, including patient images and data, from the PACS (via the RIS), enabling access to that patient’s structured reports in a single uniform location in the EMR. Thus, images can be integrated with the radiology report, and with other patient information systems’ (such as laboratory, pharmacy, cardiology and nursing) reports, providing a comprehensive folder on the patient.

Key Features of a good PACS System are:

  • Modules for comparison study of prior patient images, along with similar cases
  • Modules for Computer Aided Detection using Clinical Decision Support Systems and Key Facets
  • Excellent Data Compression Techniques to ensure effective network utilization and high speed transfers of quality images to workstations and other systems.
  • Excellent EMR Integration based on IHE Integration Profiles, standard HL7, standard DICOM and support for secure, high-speed access to patient images via the internet
  • Standard Security Features along with audit trails and Integration with RIS and EMR security.
  • Modules for 3D and 4D reconstruction of CT slices, Image Enhancement and Quality Printing
  • Immediate availability of Images on network or CD/DVD for quick diagnosis and review by remote Radiologists/experts.
  • Excellent Short Term Storage with very low retrieval time latencies.
  • Excellent Long Term Storage with decent retrieval time latencies and predictable data recovery.
  • Excellent RIS Integration.
  • Extensively tested and successfully working in other hospitals for at least two years.
  • Multiple vendor modality Integration features.
  • Downtime plan with Disaster Recovery Support.
  • Easy Upgrade-ability of hardware/storage to ensure almost infinite storage based on hospital need
  • Support for Patient De-Identification and Reporting off the PACS/RIS for data analysis.

Now that you have selected the PACS and RIS systems, here is the list of questions you should ask regarding integration with the EMR:


EMR and RIS/PACS Integration Issues:

  • RIS/PACS features and limitations
  • Modality support for DMWL (DICOM Modality Worklist – ensuring correct patient scans at the modality)
  • Key data mappings between the RIS, PACS and EMR (e.g. Study Date/Time, Patient ID, Provider, Study Status, Accession Number, etc.) – see the mapping sketch after this list
  • Department Workflow changes (Types of Orders, Downtime Orders, Unsolicited Results, Billing, etc.)
  • What data is displayed in the Modality Worklist, and when does this worklist get updated?
  • Historical data import, cut-off dates, access policies to legacy data, etc
  • Security, User access and integrating the PACS/RIS users with the EMR users to enable secure web access to images.
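
As a concrete illustration of the key-data-mappings item above, here is a small C# sketch listing common RIS/PACS/EMR field correspondences. The HL7 positions follow the usual IHE Scheduled Workflow conventions (e.g. accession number carried in OBR-18), but these are assumptions to be confirmed against each vendor's interface specification.

using System;
using System.Collections.Generic;

class FieldMapping
{
    public string Hl7Field;    // HL7 v2 segment-field position
    public string DicomTag;    // DICOM attribute (group,element)

    public FieldMapping(string hl7, string dicom) { Hl7Field = hl7; DicomTag = dicom; }
}

class KeyFieldMap
{
    static void Main()
    {
        // EMR/RIS concept -> HL7 field and DICOM attribute
        Dictionary<string, FieldMapping> map = new Dictionary<string, FieldMapping>();
        map["Patient ID"] = new FieldMapping("PID-3", "(0010,0020) Patient ID");
        map["Patient Name"] = new FieldMapping("PID-5", "(0010,0010) Patient's Name");
        map["Accession Number"] = new FieldMapping("OBR-18", "(0008,0050) Accession Number");
        map["Study Date/Time"] = new FieldMapping("OBR-7", "(0008,0020)/(0008,0030) Study Date/Time");
        map["Referring Provider"] = new FieldMapping("PV1-8", "(0008,0090) Referring Physician's Name");

        foreach (KeyValuePair<string, FieldMapping> kv in map)
            Console.WriteLine(kv.Key + ": HL7 " + kv.Value.Hl7Field + " <-> DICOM " + kv.Value.DicomTag);
    }
}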


Note:
The above article is based on personal experience and is not for any commercial gain.


Automated Workflow Environments and EMR

October 30, 2006

Well, we work in the next era of software development, not only designing applications, but also developing systems that communicate with each other, thus participating in a workflow.

Automating this workflow through the seamless integration of these apps is a task that challenges many of the industries that we work in.

Automated Workflow Environments are environments in which multiple systems contribute and communicate, enabling a network of these apps to solve complex problems very efficiently with no human interaction. You can call them Digital Ecosystems.

You can construct workflow nets to describe the complex problems that these systems efficiently solve. Workflow nets, a subclass of Petri nets, are known as attractive models for analyzing complex business processes. Because of their good theoretical foundation, Petri nets have been used successfully to model and analyze processes from many domains, such as software and business processes. A Petri net is a directed graph with two kinds of nodes – places and transitions – where arcs connect ‘a place’ to ‘a transition’ or a transition to a place. Each place can contain zero, one or more tokens. The state of a Petri net is determined by the distribution of tokens over places. A transition can fire if each of its input places contains a token. When the transition fires, i.e. executes, it takes one token from each input place and puts one token on each output place.

In a hospital environment, for example, the processes involved show complex and dynamic behavior that is difficult to control. A workflow net that models such a complex process provides good insight into it and, due to its formal representation, offers techniques for improved control.

Workflows are case oriented, which means that each activity executed in the workflow corresponds to a case. In the hospital domain, a case corresponds to a patient and an activity corresponds to a medical activity. The process definition of a workflow assumes that a partial order or sequence exists between activities, which establishes which activities have to be executed in what order. In the Petri net formalism, workflow activities are modeled as transitions, and the causal dependencies between activities are modeled as places and arcs. The routing in a workflow assumes four kinds of routing constructs: sequential, parallel, conditional and iterative routing. These constructs define the route taken by ‘tokens’ in the workflow.
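
To make the firing rule above concrete, here is a minimal C# sketch of a workflow net with sequential routing. The place and transition names (a hospital-flavored register/examine sequence) are illustrative assumptions, not part of any standard.

using System;
using System.Collections.Generic;
using System.Linq;

class PetriNet
{
    // Marking: tokens currently held by each place.
    private readonly Dictionary<string, int> places = new Dictionary<string, int>();

    public void AddPlace(string name, int tokens) { places[name] = tokens; }

    // A transition is enabled when every input place holds at least one token.
    public bool CanFire(string[] inputs)
    {
        return inputs.All(p => places[p] > 0);
    }

    // Firing consumes one token per input place and produces one per output place.
    public bool Fire(string name, string[] inputs, string[] outputs)
    {
        if (!CanFire(inputs)) return false;
        foreach (string p in inputs) places[p]--;
        foreach (string p in outputs) places[p]++;
        Console.WriteLine("fired: " + name);
        return true;
    }
}

class WorkflowDemo
{
    static void Main()
    {
        PetriNet net = new PetriNet();
        net.AddPlace("arrived", 1);     // one token: one patient (the 'case')
        net.AddPlace("registered", 0);
        net.AddPlace("examined", 0);

        // Sequential routing: 'register' must fire before 'examine' is enabled.
        net.Fire("register", new[] { "arrived" }, new[] { "registered" });
        net.Fire("examine", new[] { "registered" }, new[] { "examined" });
    }
}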

Well, enough theory, how does this apply?

Think of this in practical terms using the example of an EMR*, CPR* or HIS* system:
• A patient arrives at a hospital for a consultation or particular set of exams or procedures.
• The patient is registered, if new to the hospital. A visit or encounter record is created in the Patient Chart (EMR) – with vitals, allergies, current meds and insurance details.
• The physician examines the patient and orders labs, diagnostic exams or prescription medications for the patient, possibly using a handheld CPOE* device
• The patient is scheduled for the exams in the RIS – radiology info system or LIS – laboratory info system or HIS (hospital info system)
• The RIS or LIS or HIS sends notifications to the Radiology and/or Cardiology and/or Lab or other Departments in the hospital through HL7 messages for the various workflows.
• The various systems in these departments will then send HL7 or DICOM or proprietary messages to get the devices or modalities, updated with the patient data (prior history, etc.)
• The patient is then taken around by the nurses to the required modalities in the exam/LAB areas to perform the required activities.
• The patient finishes the hospital activities while the diagnosis continues and the entire data gathered is coalesced and stored in rich structured report or multimedia formats in the various repositories – resulting in a summary patient encounter/visit record in the Electronic Patient Record in the EMR database.
• There could also be other workflows triggered – pharmacy, billing, etc.
• The above is just the scenario for an OUTPATIENT; there are other workflows for INPATIENT, ED, ICU and other patients.

The key problems in this ‘Automated Workflow Environment’ are:

• Accurate Patient Identification and portability, to ensure that the patient identity is unique across multiple systems/departments and maybe hospitals. The patient identity key is also essential to integrating patient healthcare across clinics, hospitals, regions (RHIO) and states.
• Support for Barcode/RFID on patient wrist bands, prescriptions/medications, billing (using MRN, Account Number, Order Number, Visit Number), etc., to enable automation and quick and secure processing.
• Quick patient data retrieval and support for parallel transactions
• Audits and logs for tracking access to this system
• Support for PACS, Emergency care, Chronic care (ICU/PACU), Long Term care, Periodic visits, point-of-care charting, meds administration, vital-signs data acquisition, alarm notification, surveillance for patient monitors, smart IV pumps, ventilators and other care areas – treatment by specialists in off-site clinics, etc.
• Support for Care Plans, Order sets and Templates, results’ tracking and related transactions.
• Quick vital-sign results and diagnostic reporting
• Effective display of specialty content – diagnostic/research images, structured “rich” multimedia reports.
• Secure and efficient access to this data from the internet
• Removal of paper documentation and effective transcription
• SSO (Single Sign-On), security roles and ease of use for the various stakeholders – here, the patient, the RN, physician, specialist, IT support, etc.
• Seamless integration with current workflows and support for updates to hospital procedures
• Modular deployment of new systems and processes – a long-term roadmap and strategies to prevent costly upgrades or vendor changes.
• HIPAA, JCAHO and legal compliance – each of which has an entire set of guidelines, privacy and security being the chief ones.
• Efficient standardized communication between the different systems, either via “standard” HL7, DICOM or CCOW, or proprietary.
• Support for a high-speed fiber network for high-resolution image-processing systems like MRI, X-ray, CT scan, etc.
• A high-speed independent network for real-time patient monitoring systems and devices
• Guaranteed timely data storage and recovery, with at least 99.9999% visible uptime
• Original patient data available for at least 7 years, and compliance with FDA rules.
• Disaster recovery compliance and responsive performance under peak conditions.
• Optimized data storage ensuring low hardware costs
• Plug ‘n’ play of new systems and medical devices into the network, wireless communication among vital-signs devices and servers, etc.
• Location tracking of patients and devices (RFID based) and bed tracking in the hospital
• Centralized viewing of the entire set of patient data – either by a patient or his/her physician
• Multi-lingual user interface possibilities (in future?)
• Correction of erroneous data and merging of patient records.
• Restructuring existing hospital workflows and processes so that this entire automated workflow environment works with a definite ROI and within a definite time period!
• Integration with billing, insurance and other financial systems related to the care charges.
• Future-proofing and support for new technologies like Clinical Decision Support (CDSS) – again, a long-term roadmap is essential.

ROI: How does a hospital get returns on this IT investment?

  1. Minimization of errors – medication or surgical – and the associated risks
  2. Electronic trail of patient case history available to patient, insurance and physicians
  3. Reduced documentation and improvement in overall efficiency and throughput
  4. Patient referrals from satellite clinics, which can use the EMR’s external web links to document on patients – thus providing a continuous electronic record
  5. Possible pay-per-use by external clinics – to use EMR charting facilities
  6. Remote specialist consultation
  7. Efficient Charges, Billing and quicker settlements
  8. Better Clinical Decision Support – due to an electronic database of past treatments
  9. In the long term, efficiency means cheaper insurance which translates to volume income
  10. Better compliance of standards – HIPAA, privacy requirements, security
  11. Reduced workload due to Process Improvement across departments – ED, Obstetrics/Gynecology, Oncology/Radiology, Orthopedic, Cardiovascular, Pediatrics, Internal Medicine, Urology, General Surgery, Ophthalmology, General/family practice, Dermatology, Psychiatry
  12. Improved Healthcare with Proactive Patient Care due to CDSS
  13. Quality of Patient Care: A silent factor in a hospital’s revenue is quality of patient care. One of the chief drivers of quality of patient care is the quality of information provided efficiently to the physicians, through which they can make those critical decisions

Now, the big picture becomes clear.

Doesn’t the above set of requirements apply to any domain? This analysis need not apply only to the hospital domain; the same is true for the biotech domain (where orders are received, data is processed and analyzed, and the processed data is presented or packaged), and similarly for the manufacturing, banking or insurance domains.

The need is for core engine software – based on EDI (Electronic Data Interchange) – that integrates these mini-workflows and helps in their process re-engineering, securely and effectively, using common intersystem communication formats like X12 or HL7 messages.

These Workflow Engines would be the hearts of the digital world!

Buzzwords:
*EMR – Electronic Medical Record
*CPR – Computerized Patient Record
*CDSS – Clinical Decision Support System
*RHIO – Regional Health Information Organization
*CPOE – computerized physician order entry

Some of the information presented here is thanks to research papers and articles at:
*Common Framework for health information networks
*Discovery of Workflow Models for Hospital Data
*Healthcare workflow
*CCOW-IHE Integration Profiles
*Hospital Network Management Best Practices
*12 Consumer Values for your wall

What about the latest IT trends and their applications in healthcare?

We already know about Google Earth and Google Hybrid Maps and the advantages of Web 2.0.
The next best thing is to search for the best shopping deal or the best real estate by area on a hybrid map – this recombinant web-application reuse technique is called a mashup.
Mashups have applications in possibly everything from healthcare to manufacturing.
Omnimedix is developing and deploying a nationwide data mashup – Dossia, a secure, private, independent network for capturing medical information, providing universal access to this data along with an authentication system for delivery to patients and consumers.

Click on the links below to see the current ‘best in class’ mashups:
*After-hours Emergency Doctors SMS system – transcribes voicemail into text and sends SMS to doctors. A similar application can be used for a transcription mashup (based on Interactive Voice Response – IVR): Amazon Mturk, StrikeIron Global SMS and Voice XML
* Calendar with Messages – listen to your calendar and leave messages too; a mashup (based on IVR) of 30 Boxes, Voxeo and Google Calendar
* http://www.neighboroo.com/ – Housing/Climate/Jobs/Schools
* Visual Classifieds Browser – Search Apartments, visually
* http://www.trulia.com/ – Real Estate/Home pricing
* http://www.rentometer.com/ – Rent comparison
* http://realestatefu.mashfu.com/ – Real Estate Statistical Analysis
* http://www.housingmaps.com/ – Rent/Real Estate/Home pricing – linked to Craigslist
* http://virtualtourism.blogspot.com/ – Google Maps + Travel Videos
* http://www.coverpop.com/wheeloflunch/ – Wheel of Zip Code based restaurants
* More sample links at the unofficial Google Maps mashup tracker http://googlemapsmania.blogspot.com/, which includes some notable sites:
* latest news from India by map http://www.mibazaar.com/news/
* read news by the map – slightly slow http://lab.news.com.au/maps/v01
* view news from Internet TV by map – http://5tvs.com/internet-tv-maps/news/
* see a place in 360° http://www.seevirtual360.com/Map.aspx

What’s on the wish list? Well, a worldwide mashup for real estate, shopping, education and healthcare will do just fine. Read on to try out YOUR sample…
OpenKapow: the online mashup-builder community that lets you easily make mashups. Use their visual scripting environment to create intelligent software Robots that can make mashups from any site, with or without an API.
In the words of Dion Hinchcliffe, “Mashups are still new and simple, just like PCs were 20 years ago. The tools are barely there, but the potential is truly vast as hundreds of APIs are added to the public Web to build out of”.
Dion also covers the architecture and types of Mashups here, with an update on recombinant web apps.

Keep up to date on web2.0 at http://blog.programmableweb.com/

Will Silverlight – with simplified vector-based graphics and the XML-based XAML language – be the replacement for Flash and JavaFX?

Well, the technology is promising, and many multimedia content web-application providers, including news channels, have signed up for Microsoft Silverlight (“WPF/E”) due to its lightweight browser-based viewer streaming “DVD-quality” video based on the patented VC-1 video codec.

Microsoft® Silverlight™ Streaming by Windows Live™ is a companion service for Silverlight that makes it easier for developers and designers to deliver and scale rich interactive media apps (RIAs) as part of their Silverlight applications. The service offers web designers and developers a free and convenient solution for hosting and streaming cross-platform, cross-browser media experiences and rich interactive applications that run on Windows™ XP+ and Mac OS 10.4+.

The only problem is that Linux is left out, since the Mono framework has not yet evolved sufficiently.

So, the new way to develop your AJAX RIA “multimedia web application” is: design the UI with an artist in Adobe Illustrator, then mash it up with your existing RSS, LINQ, JSON, XML-based web services, REST and WCF services to deliver a richer, scalable web application.


.NET 2.0 Generics samples & Performance Comparison

January 21, 2005

One of the most awaited features of Microsoft .NET 2.0 is generics. Starting with VS 2005, C#, Managed C++, and VB will have CLR support for generics.

Generics promise to increase type safety, improve performance, reduce code duplication (code reuse) and eliminate unnecessary casts (boxing). The most obvious application of generics in the framework class library is the generic collections in the new System.Collections.Generic namespace.

While generic types have a similar syntax to C++ templates, they are instantiated at runtime, as opposed to at compile time (by Microsoft’s C++ compiler), and they can be reflected on via metadata. Also, with generics, member access on the type parameter is verified based on the constraints placed on the type parameter; whereas with templates, member access is verified on the type argument after instantiation. Although the MS C++ compiler creates a separate type for every template specialization, that does not necessarily mean every type emits separate code. In fact, you’ll find that through a feature called COMDAT folding, most templates share quite a bit of code. (Basically, the myth of code bloat for templates isn’t true these days.) By having separate specializations of code at compile time, the compiler has the ability to optimize each type individually, which includes inlining. If every instance of a function is inlined, the template code is thrown away by the linker, resulting in less code. For these reasons, template code is generally much faster and leaner than an equivalent generic alternative.

C++ templates are a compile-time feature much like a macro preprocessor and are thus not a good solution for a highly dynamic language such as C#.

In .NET, for value types, generics is considerably faster, and for reference types, generics is typically comparable in performance—or faster.
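
To see the value-type difference in practice, here is a minimal benchmark sketch contrasting the .NET 1.x ArrayList, which boxes every int, with the generic List<int>. The iteration count is arbitrary and the timings are illustrative; absolute numbers will vary with hardware and CLR version.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;

class BoxingBenchmark
{
    const int N = 10000000;

    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        ArrayList untyped = new ArrayList();
        for (int i = 0; i < N; i++) untyped.Add(i);   // boxes every int
        long sum = 0;
        foreach (object o in untyped) sum += (int)o;  // unboxing cast on every read
        Console.WriteLine("ArrayList: " + sw.ElapsedMilliseconds + " ms, sum=" + sum);

        sw = Stopwatch.StartNew();
        List<int> typed = new List<int>();
        for (int i = 0; i < N; i++) typed.Add(i);     // no boxing
        sum = 0;
        foreach (int v in typed) sum += v;            // no cast
        Console.WriteLine("List<int>: " + sw.ElapsedMilliseconds + " ms, sum=" + sum);
    }
}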

Good Links: Introducing Generics in the CLR

Generics Performance Results and a sample to test other collections

Generics Performance (CSharp)

The problem with .NET generics

A generic Set type for .NET

6 questions about generics.

Performance: Interfaces Vs. Inheritance (Abstract Base Classes) Vs. Generics

Thanks to the authors for the above links.


.NET memory and performance improvement

January 17, 2005

Hi,
Now that you have finished your .NET application, is memory bogging you down?

Limiting memory usage of .NET applications is a requirement that often arises in programs that allocate and use large amounts of memory. The garbage collected environment that the CLR offers means that memory that is used to perform some calculation then discarded is not immediately collected once it is no longer needed, and application memory usage can become quite high in some situations. Rather than wait for all available memory to be exhausted before performing a full garbage collection, there are scenarios where preserving memory for other processes is a higher priority than the raw speed of the memory-intensive .NET application.

Well, there is a COM API, IGCHostControl::RequestVirtualMemLimit, that can be implemented to prevent your application from hogging all the memory and waiting until the last instant for the GC to start freeing memory. To the CLR, a failed RequestVirtualMemLimit call will appear the same as Windows running out of memory and returning a NULL pointer for a VirtualAlloc request. Rather than simply refusing to allocate any further memory, a gentler and more effective technique is to allow a small memory increase so exception objects can be successfully created, and an OutOfMemory exception can be gracefully thrown and handled by managed code. If memory cannot be allocated for exception objects, the runtime will terminate without giving exception handlers a chance to execute, which will rarely be the desired behaviour.

Therefore, to place an effective cap on memory usage, an object implementing IGCHostControl needs to be provided to the runtime.

But there is a “chicken and egg” problem. The ICorConfiguration interface, which is implemented by CorRuntimeHost, has a method called SetGCHostControl that allows an IGCHostControl-implementing object to be provided to the runtime. Unfortunately, it is not possible to retrieve an ICorConfiguration reference after the runtime has started. The QueryInterface logic of CorRuntimeHost throws an error when a request for ICorConfiguration is made, and the ICorRuntimeHost::GetConfiguration method, which returns an ICorConfiguration reference, fails when it is called post-startup. When certain hosting functionality is only available before the runtime is started, it is impossible to use that functionality from managed code. Managed code can never execute before the runtime starts, and if the functionality is required, as it is with the memory-capping functions, the only option is to explicitly host the runtime using unmanaged code.
Read on at http://www.dotnetperformance.com/downloads/hosting.doc .
Thanks to the author for this insight into unmanaged code advantages in a managed world.
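
As an aside, .NET 2.0 also introduces a purely managed safeguard that does not require hosting the runtime: System.Runtime.MemoryFailPoint. It is not the IGCHostControl capping technique described above (that remains unmanaged-host-only), but it lets memory-intensive managed code check for headroom before a large operation, as in this minimal sketch; the 512 MB figure is an arbitrary example.

using System;
using System.Runtime;

class MemoryGuard
{
    static void Main()
    {
        try
        {
            // Ask up front whether ~512 MB is likely to be available.
            using (new MemoryFailPoint(512))
            {
                // The memory-intensive work goes here.
                byte[] buffer = new byte[256 * 1024 * 1024];
                Console.WriteLine("Allocated " + buffer.Length + " bytes");
            }
        }
        catch (InsufficientMemoryException ex)
        {
            // Raised by the fail point itself, before the real work starts.
            Console.WriteLine("Not enough memory headroom: " + ex.Message);
        }
    }
}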

Looking forward to a better managed C++ in .NET 2.0


Cool MS SQL Server Tools

October 15, 2004

Sqldiag – Sqldiag is a utility used for report generation and collection of diagnostic information on database server / operating system configuration parameters. Sqldiag gathers the information, even if Microsoft SQL Server 2000 services are stopped. The report generated by Sqldiag contains the following information:
Complete dump of all SQL Server error logs; registry information related to SQL Server; SQL Server system DLL versions; output generated by sp_configure, sp_who, sp_lock, sp_helpdb, xp_msver and sp_helpextendedproc; information about all system processes (master..sysprocesses); information about all user processes and connections (including input buffer SPIDs and deadlocks); information about operating system parameters (including reports about OS version, video display, drivers, DMA, memory, services, IRQs and ports, devices, environment and network); and info about the last 100 user queries. The Sqldiag utility is installed to the \Program Files\Microsoft SQL Server\MSSQL\Binn directory by default.

Profiler – Profiler is the executable for SQL Server Profiler. SQL Server Profiler is typically used for monitoring SQL Server events, such as debugging T-SQL statements and stored procedures, and troubleshooting problems (by capturing events in real time and replaying them later).

Sqlmaint – Sqlmaint is a maintenance utility. Sqlmaint performs a set of tasks specified by the DBA on one or more databases (for example, backing up databases, updating statistics, rebuilding indexes, and running DBCC checks).
The Sqlmaint utility is installed to the \Program Files\Microsoft SQL Server\MSSQL\Binn directory by default.

bcp – A utility used for the interactive process of bulk copying of data between a SQL Server 2000 instance and a data file (a format information file should be specified, or the default bcp.fmt used instead). The bcp utility is a typical example of a “two-way” tool, i.e. copying data “into a SQL Server instance” or “out of a SQL Server instance” is allowed. Alternatively, bcp can be used for copying data:
between SQL Server instances with different language collations; to or from a view; returned from a T-SQL query (to a data file); between Microsoft SQL Server and database servers of other vendors; between SQL Servers running on different processor architectures; to or from a database table (including temporary tables); and between databases within one SQL Server instance. The bcp utility is installed by default to the \Program Files\Microsoft SQL Server\80\Tools\Binn directory.

itwiz – itwiz allows the Index Tuning Wizard to be executed from a command prompt. Index tuning using itwiz is similar to tuning via the Index Tuning Wizard’s user interface. The itwiz utility is installed to the \Program Files\Microsoft SQL Server\80\Tools\Binn directory by default.

osql – A utility for interactive execution of Transact-SQL scripts and stored procedures. It uses ODBC libraries to communicate with the database server. osql can be started directly from the operating system command prompt and uses a standard output device (the monitor, by default) for displaying results. The osql utility is installed to the \Program Files\Microsoft SQL Server\80\Tools\Binn directory by default.

Simple enabling/disabling of constraints/triggers on an entire SQL 2000 database:

EXEC sp_msforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT all'
EXEC sp_msforeachtable 'ALTER TABLE ? DISABLE TRIGGER all'

EXEC sp_msforeachtable @command1='print ''?''', @command2='ALTER TABLE ? CHECK CONSTRAINT all'
EXEC sp_msforeachtable @command1='print ''?''', @command2='ALTER TABLE ? ENABLE TRIGGER all'


Simple SQL Server/MSDE Database Installation through osql

October 13, 2004

Here, an MSDE database is dropped, attached, and a user is given rights on the db.
With minor changes to the osql parameters (a server name can be supplied to -S), this script will work for a SQL Server 2000 database.

--drop old db
osql -E -S -Q "DROP DATABASE [dbname]"

--copy the mdf to the target loc
copy "c:\installtemp\dbname*.?df" "C:\Program Files\Microsoft SQL Server\MSSQL\Data"

--make sure the db file is not read only
attrib -r "C:\Program Files\Microsoft SQL Server\MSSQL\Data\dbname*.?df"

--attach the db to the target instance/server
osql -E -S -Q "EXEC sp_attach_db @dbname = 'dbname', @filename1 = N'C:\Program Files\Microsoft SQL Server\MSSQL\Data\dbname_Log.LDF', @filename2 = N'C:\Program Files\Microsoft SQL Server\MSSQL\Data\dbname_Data.MDF'"

USE dbname
--add a user to access this db apart from the default db admin user
EXEC sp_grantdbaccess 'domain\user', 'domain\user'
GO

--grant read access to this user
EXEC sp_addrolemember N'db_datareader', 'domain\user'
GO

--grant write access to this user
EXEC sp_addrolemember N'db_datawriter', 'domain\user'
GO

--security script to make sure sps and fns have exec priv.
osql -E -S -i "c:\installtemp\dbnameSecuritySetup.sql"

--Security script is below -- dbnameSecuritySetup.sql
--recursively grant exec priv to all sps and fns in the db.
USE dbname
GO
--nvarchar(500) rather than sysname, so long generated GRANT statements are not truncated
DECLARE @sExecQry nvarchar(500)
DECLARE EXEC_SPS CURSOR LOCAL FOR
select 'grant exec on ' + QUOTENAME(name) + ' to "domain\user"' from sysobjects where (type = 'P' or type = 'FN') and objectproperty(id, 'IsMSShipped') = 0
OPEN EXEC_SPS
FETCH NEXT FROM EXEC_SPS INTO @sExecQry
WHILE @@FETCH_STATUS = 0
BEGIN
EXEC(@sExecQry)
--PRINT @sExecQry -- debug only
FETCH NEXT FROM EXEC_SPS INTO @sExecQry
END
CLOSE EXEC_SPS
DEALLOCATE EXEC_SPS
GO

–Perform checks on the database
DBCC CHECKCONSTRAINTS WITH ALL_CONSTRAINTS
GO

DBCC CHECKDB
GO

DBCC CHECKALLOC
GO

DBCC CONCURRENCYVIOLATION
GO

DBCC DROPCLEANBUFFERS
GO

DBCC FREEPROCCACHE
GO

DBCC UPDATEUSAGE(0)
GO


Migrating Oracle Databases to SQL Server 2000

October 8, 2004

SQL Server 2000 works only on Windows-based platforms, including Windows 9x, Windows NT, Windows 2000 and Windows CE.
In comparison with SQL Server 2000, Oracle 9i Database supports all major platforms, including Windows-based platforms, AIX-based systems, Compaq Tru64 UNIX, HP 9000 Series HP-UX, Linux Intel, Sun Solaris and so on.

Sometimes migration is the issue, so here goes…

There are some nice articles on Oracle to SQL Server migration at

http://www.microsoft.com/resources/documentation/sql/2000/all/reskit/en-us/part2/c0761.mspx

SQL Server vs Oracle Feature differences
http://www.mssqlcity.com/Articles/Compare/sql_server_vs_oracle.htm

There is a nice tool for the Migration of stored procs/SQL at
http://www.swissql.com/oracle-to-sql-server.html

Latest Top Ten TPC-C by Performance (Version 5) Results
http://www.tpc.org/tpcc/results/tpcc_perf_results.asp

Top Ten TPC-C by Price/Performance (Version 5) Results
http://www.tpc.org/tpcc/results/tpcc_price_perf_results.asp

Oracle vs SQL Server
http://www.dba-oracle.com/oracle_tips_oracle_v_sql_server.htm

Thanks to the authors of these public sites for the relevant information.