Saturday, October 19, 2013

Thursday, July 7, 2011

Like Us on Facebook

We have not been posting here much lately; most of our updates are over on Facebook instead. You might like to head over there and "like" us...

Thanks!

Monday, February 7, 2011

Next Up... Another Intellect 3.0 Implementation

Next up: an Intellect 3.0 installation in the Atacama Desert of northern Chile, the driest place on Earth, complete with performance prediction, optimization, and machine-operations anomaly detection to sense faults before they become major problems.

Friday, October 29, 2010

Anomaly 3.0

Anomaly 3.0 is nearly ready for release, destined for its first industrial use in November. Anomaly monitors conditions and determines whether something unusual is happening. Version 3.0 has simultaneous data access from multiple sources (PI, SQL Server, OPC, etc.), on-board data pre-processing to create new monitored variables, and a suite of univariate and multivariate Anomaly Detectors, and it can run on the desktop or on-line in real time.
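As a rough illustration of what a univariate anomaly detector does (this is a generic z-score sketch, not IntelliDynamics' actual algorithm), the idea can be as simple as flagging readings that stray too far from the mean of the monitored variable:

```python
import statistics

def zscore_anomalies(values, threshold=2.0):
    """Return indices of points more than `threshold` standard
    deviations from the mean of the series."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values)
            if stdev > 0 and abs(v - mean) / stdev > threshold]

# A stable temperature signal with one unusual spike:
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 35.7, 20.1, 19.8]
print(zscore_anomalies(readings))  # → [5], the index of the 35.7 spike
```

Multivariate detectors generalize this idea to catch combinations of variables that are individually normal but jointly unusual.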

Anomaly is a fundamental element of "Condition Based Maintenance" systems, which tell you something is going wrong before damage is done, so that maintenance can be performed not just on a schedule or after a failure has occurred, but just before a problem manifests itself.

Valuable.

Thursday, July 29, 2010

Forging Our Historian in Fire

We are "forging" our historian by creating a personal and commercial form for real-time financial market data. In this scenario, a reasonably large load is put on the historian as hundreds or even thousands of transactions per second stream in for each of numerous "tickers" from numerous feeds simultaneously. That's a lot of data. This not only helps our financial customers capture and store data for historical analysis; it also helps us harden our historian. It is more demanding than most industrial applications, though less so than some high-speed research environments, such as particle accelerators.

A ticker is a stock, fund, currency pair or other security. Many times per second, traders bid and ask to work out a price to buy and sell, and each of these actions produces a "Quote". On each exchange there are literally hordes of traders bidding electronically every millisecond, and we receive each of these offers over the wire.

A "Quote" is a complex object with up to 15 associated values, not a single numeric or text value, so each transaction is a block of data values. Some "mainstream" historians might be challenged by this because they are oriented to archiving single values, so a "Quote" would have to be split up, disassociated and written to 15 "Tags". Not so with our Historian, which archives objects, and thus stores the Quote objects themselves directly.

Objects take a bit more overhead when stored, and a bit more disk space, but in exchange the data stays properly associated, there is less processing, and it is faster.
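To make the object-vs-tags distinction concrete, here is a minimal sketch of archiving a Quote as a single object. The field names and the in-memory store are illustrative assumptions, not the actual IntelliDynamics schema or Historian API:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    """A hypothetical subset of a Quote's associated values
    (the real object carries up to 15)."""
    ticker: str
    timestamp: float
    bid: float
    ask: float
    bid_size: int
    ask_size: int
    last: float
    volume: int
    exchange: str

def archive(store, quote):
    """Append the Quote as one intact object under its ticker,
    rather than splitting its fields across 15 separate tags."""
    store.setdefault(quote.ticker, []).append(quote)

store = {}
archive(store, Quote("ABC", 1280300400.0, 10.01, 10.03,
                     500, 300, 10.02, 12000, "NYSE"))
```

Because the whole block of values is written and read as one unit, the association between bid, ask, sizes and timestamp never has to be reassembled at query time.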

Let the flames roar, as we harden the steel of our Historian.

Wednesday, July 28, 2010

Catch Up

Sorry to have fallen behind here, but business and new product development have been brisk. We are also over on Facebook...

http://www.facebook.com/pages/IntelliDynamics/345638741076


Give us a "Like" if you use Facebook!

So here are some highlights...
  • Based on a prospective customer's request, we are preparing a data-management-only solution including data access, synchronization, validation, conditioning, pre-processing and transport in batch and real time, including visualization.
  • We have successfully installed Intellect 3.0 at a consumer paper products manufacturing facility, where they are now running "cascaded" models, in which the output of one model feeds as an input to another, integrated through either a SQL database or our on-board object historian.
  • We successfully completed a "Virtual Metering" project for a multi-national oil and gas corporation.
  • We held a meeting in South East Asia with a local national oil company regarding field-wide asset management, including modeling, prediction and optimization of a complete oil field.
These are a few highlights.

Saturday, October 10, 2009

Intellect 3.0 Virtual Sensor Auto Recalibration Supported

Intellect 2.0 provided "Tracking Predictors" that self-calibrate to on-line instruments or test results, achieving "Class A" (>95% accurate, zero/low-maintenance) prediction estimates. Intellect 3.0 makes this more flexible by separating the prediction and calibration processes, which enables access to both the raw and calibrated predictions and customization of the re-calibration process.

Adjustments can span many virtual sensors from one performance-measurement process. This is important... Let's say the outputs from multiple unit operations are combined and *then* performance is measured on that aggregate. We can then back-allocate the error from the combined performance measurement to the individual unit operations using an algorithm. For a more tangible example, say you are estimating the output of oil from a number of wells, and the wells' outputs are summed to a platform level where there is a meter. We can then compute the error at the platform level and back-calibrate the individual wells' production estimates.
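One simple back-allocation scheme (shown here for illustration; the algorithm Intellect actually uses may differ) is to scale each well's estimate by the ratio of the metered platform total to the sum of the estimates:

```python
def back_allocate(well_estimates, platform_measured):
    """Proportionally distribute the platform-level error across wells,
    scaling each estimate by (measured total / estimated total)."""
    total_estimated = sum(well_estimates.values())
    scale = platform_measured / total_estimated
    return {well: est * scale for well, est in well_estimates.items()}

# Hypothetical model estimates for three wells (bbl/d):
wells = {"W1": 480.0, "W2": 260.0, "W3": 300.0}   # estimates sum to 1040
calibrated = back_allocate(wells, platform_measured=1000.0)  # meter reads 1000
```

After calibration the wells' estimates sum exactly to the metered platform value, while each well keeps its estimated share of production.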

This change is now available in the Intellect 3.0 server and also in the drag-and-drop Solution Designer application. The capability will be deployed automatically to customers subscribed to Remote Management Services (RMS).