T.C. Ridgeway, Inc. (TCRI) is not an NDT
company, but is very familiar with the NDT community that services paper
mills. Our primary function is to assist the mill engineer in
understanding what the lab’s UT inspection data is saying about his
Recovery Boilers. A historical review will shed some light on what part
TCRI plays in the overall process.



During the early 1970's, when
the UT inspection of Recovery Boilers started ramping up, computers were
not yet available to the general public. To inspect a boiler, the UT technicians
operated in two-man crews. One technician would place the
transducer on the boiler tube and interpret the oscilloscope to get a
thickness reading, which he verbally called out. The assistant
technician would write the UT readings on a pre-printed form that he had
on a clipboard. The form included the paper company name, mill name,
boiler name, inspection date, boiler section and the elevation being
inspected. It had dozens of little blocks where the UT readings were to
be written. By the end of the inspection, there were hundreds if not
thousands of pages of just numbers that came out of the boiler. Those
dirty pages were then given to the NDT lab's secretary who simply typed
the UT readings onto clean forms, all of which went into a three-ring
binder. That was the UT inspection report the NDT labs gave to the mill
engineers back then. It was nothing more than thousands of pages of raw
numbers.



That type of inspection report presented a major
problem for the mill engineer. His goal was to understand what the data
was saying about his boiler. He soon realized that looking through
thousands of numbers was nearly an impossible task. Therefore, he asked
the NDT labs to use a magic marker, and in the printed report, highlight
the UT readings that he needed to be concerned about. Keep in mind that
computers and color printers did not exist at that time in history. It
was a laborious task, but the labs started producing reports that were
prepared on an electric typewriter and where the readings of concern
were manually "flagged" with a magic marker. That was the standard
report the NDT community supplied for years.



When the first
"desk top" computers arrived in the late 1970's, things started to get
exciting in the UT Reporting field. The goal was to eliminate the
electric typewriter and bring in the computer keyboard. The secretary
would use the technician's dirty UT data sheets and would enter the
readings into a crude software package. Using her monochrome dot-matrix
printer, she would be able to print the data as many times as necessary.
However, the report was still just raw numbers and manual flagging with
a magic marker was still required. The NDT community was slowly moving
into the digital reporting world for boiler examinations. Before that
time in history, the only time the term "digital examination" was
mentioned was in the proctologist's office.



By the time the early
1980's arrived, the three-color dot-matrix printers had appeared and the
"Reporting Race" between the NDT labs was on. While using commercially
available software, each lab would use their computer and color printer
to improve their UT reporting capabilities. The printed Boiler Report
itself became a significant marketing tool. Each lab hoped the mill
engineer (who wanted a good UT report) would give them the boiler job
because of the lab's report. The reports became more than just "raw"
data. They started to become "informational" data. Things like multiple
colors and the average UT thickness per elevation were being included in
the ever-encompassing reports. Because of the vast quantity of UT
readings, the labs soon realized that off-the-shelf programs like Lotus
and Excel did not have the capabilities needed to further enhance UT
data reporting. Therefore, programmers were hired and given the task to
create the most capable and most impressive color report the labs could
offer to their clients. The race was on, and this played out a number of
times by many labs. The resulting programs were all different
internally and written in different software languages. These programs
did specific tasks very well and produced very good boiler UT inspection
reports. The computer world calls them "Expert Systems". Keep in mind,
these were still just "data reporting" software packages. The
technicians were still writing the data on paper and it was then
manually entered into the program by the office secretary or computer
staff.



The mid- to late 1980's brought in the UT data loggers.
They were supposed to make things simpler and easier. Just the opposite
was true! The level of complexity to conduct the UT inspection went up
dramatically. Because the software that came with the logger was not
satisfactory, the labs modified their own "data reporting" software to
become "data acquisition" software as well. This meant the individual
Expert Systems started talking directly to the loggers. The labs were
then totally in the new digital world and storing data on hard drives
somewhere.



In 1985, I speculated about what the mill engineer would
want next, and I saw a train wreck in the making. It made sense to do
something analytical with all of the UT data that was being stored on
the lab's hard drives. As specified in Volume 1 of the "American Paper
Institute, Recovery Boiler Reference Manual", the mill engineer would
want to mathematically trend the UT data. Trending gives him a rate of
metal loss and therefore he can forecast the future condition of the
boiler. A mill engineer's dream. Unfortunately, he still had to deal
with the mill's purchasing agent and at times was forced to use
different NDT labs to conduct the boiler inspections. That means the UT
data was going to be coming to the mill engineer from different Expert
Systems over the many years that are needed to accurately trend data. As
previously mentioned, each Expert System had its own file structure and
by design did not talk with a competing system. This was a
real problem for the mill engineer. He had a lot of UT data from
different systems, none of which would work with data from a competing
system. I saw there was a need for a "Standard" file structure. The mill
engineers could ask the labs to modify their software such that it
produces data files that comply with the standard. In order for the labs
to agree, the standard would need to be developed and published by a
third party that was not a competing NDT lab. Thus, TCRI was created.
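The trending mentioned above is, at its core, a straight-line fit per UT location: fit thickness against inspection date, and the slope is the rate of metal loss. A minimal sketch with made-up readings and an assumed minimum-wall value (the actual TDS calculations are of course far more involved than this illustration):

```python
# Least-squares trend of wall thickness at one hypothetical UT location.
# The inspection years, thicknesses, and minimum wall are invented examples.
years = [1998.3, 2000.4, 2002.3, 2004.5, 2006.4]
thickness = [0.220, 0.214, 0.209, 0.203, 0.198]  # inches

n = len(years)
mean_x = sum(years) / n
mean_y = sum(thickness) / n

# Slope of the least-squares line = rate of wall loss (inches/year, negative).
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, thickness))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

MIN_WALL = 0.150  # assumed minimum allowable wall, inches (illustrative only)

# Year at which the trend line crosses the minimum allowable wall.
retirement_year = (MIN_WALL - intercept) / slope

print(f"loss rate: {-slope * 1000:.1f} mils/yr")
print(f"trend reaches {MIN_WALL} in. around {retirement_year:.1f}")
```

Repeating this fit for every UT location on the boiler is what turns a pile of raw readings into a forecast of the boiler's future condition.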



TCRI
came onto the scene in early 1985 and was not the type of company most
people expected a software company to be. We only did one thing, and
that was to be at the boiler shutdown to qualify the UT data as it came
out of the boiler. By doing so, it could immediately be analyzed by our
software, triggering boiler maintenance actions if necessary. Because
analytical software for boiler data did not exist at the time, we had to
develop and write our own from scratch. After establishing a workable
data file structure, we wrote our own Expert System and referred to it
as the "Technical Database System" or the TDS. The file structure became
the "Standard" and was given to the NDT labs so they could produce TDS
type files from their already existing systems. At the request of their
paper mill clients, the labs begrudgingly modified their software to
produce TDS type UT data files. The TDS as well as the lab's software
have evolved over the years.
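To make the idea of a "Standard" file structure concrete: the essential requirement is that every UT reading be fully self-identifying, so files produced by different labs' Expert Systems can be merged into one database and trended together. The following record layout is purely hypothetical, invented for illustration, and is not the actual TDS standard:

```python
from dataclasses import dataclass

# Hypothetical illustration only -- NOT the actual TDS file standard.
# The point: each reading carries every identifying field it needs
# (mill, boiler, section, elevation, tube, point, date), so data from
# different labs can be combined without any lab-specific context.
@dataclass
class UTReading:
    mill: str
    boiler: str
    section: str         # e.g. waterwall, screen, generating bank
    elevation_ft: float
    tube: int            # tube number within the section
    point: str           # location on the tube, e.g. "crown"
    date: str            # inspection date, ISO format
    thickness_in: float  # measured wall thickness, inches

def parse_line(line: str) -> UTReading:
    """Parse one comma-separated record in the hypothetical layout."""
    f = [s.strip() for s in line.split(",")]
    return UTReading(f[0], f[1], f[2], float(f[3]), int(f[4]),
                     f[5], f[6], float(f[7]))

r = parse_line("MillA, RB1, waterwall, 28.0, 114, crown, 1992-04-07, 0.205")
print(r.tube, r.thickness_in)
```

Once every lab emits records like these, the receiving database no longer cares which Expert System produced them.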



The TDS performs four primary
functions. First, it is a "data acquisition" system. That means it
creates UT work assignments and transfers those assignments to the
logger. After the technician performs the inspection in the boiler, the
assignment that now contains UT data is transferred back to the TDS. The
second function of the TDS is that it is a "data reporting" system.
That means it produces large multi-color prints of the UT data in a
variety of ways. The prints are referred to as "Static Images" and
pertain to a specific inspection date. The third function of the TDS is
that it is a "data analysis" system. That means that hundreds of
thousands of UT readings from many inspections by different labs, all
come together precisely to yield accurate mathematical trending of tube
metal loss. Every individual UT location on the boiler tubes has a rate
of metal loss calculated. Once the data is in the system, the advanced
calculations and projections happen in just seconds. It truly empowers
the mill engineer when it comes to the understanding of his boiler. The
fourth function of the TDS is for it to be a "data archiving" system
that stores all of the data in a manner that enables immediate access by
the end user. That means the engineer does not need to know the details
of how the software works or in which folders the UT data is stored. He
answers several simple questions in boiler terms and he is
automatically taken to the data he is looking for. Once there, he can
slice and dice the data in a variety of ways with different software
routines. These routines are not just bells and whistles, but are real
engineering and analytical tools. The system also deals with the data
that originates from the Mud Drum inspection probes. The probes
automatically capture millions of UT readings. When the lab finishes the
inspection inside the Mud Drum, they give us the data so the TDS can
display 3D images with it. The TDS almost sounds like science fiction.
It is real, and is the result of a 6.4 million dollar, twenty-five-year
investment. The labs have also spent vast amounts of money
perfecting their own expert systems. The latest boiler reporting system
being developed by an NDT lab is the product of five years of software
research and a four million dollar investment. By definition, the
software packages that were designed for UT inspections of recovery
boilers are very expensive to develop and maintain.



So much for
the historical perspective. I wrote it so you have an understanding of
where things are today concerning the computer side of the Recovery
boiler UT inspections. The question now is how an NDT lab gets
its data into the TDS format. That is where the TCRI Compatibility
document comes into play. The document explains the Standard file
structure and what the labs need to do to get the data into the mill's
database.


Tom Ridgeway