<html>
<body>
There will never be one perfect solution for all enterprises and
government agencies.<br><br>
The risks are different depending on:<br>
* The nature of the data and software that need to be protected, and the
kinds of threats they face, which vary with the industry.<br>
* The computer operating systems, the programming languages supported,
and the access methods.<br>
* Just as a lot of software was designed for a long-ago reality, when the
needs were less sophisticated, many buildings have security holes ...
false ceilings that a person can crawl over, circumventing locked doors,
being the most obvious.<br>
* If a company does not own the building where its offices are located,
the landlord has keys to the place, which may be accessible to a
dishonest employee. There may also be other businesses in the same
building, with weaker security. Crooks break into the weakest link,
then move through the building to their ultimate target.<br>
* In our interconnected world, other enterprises can connect to our
systems ... some of this is mandated by government regulations, some of
it is due to how our business functions. Suppose we have given
access to our systems to tech support, consultants, auditors, etc., and
suppose that outfit gets penetrated ... can the penetration extend
to all the places they have access to? We know there are viruses
that target e-banking software, so if we do electronic financial
transfers, everyone we do business with can be a weak link.<br><br>
However, there can be some standards that cross systems.<br><br>
Some upgrades require temporarily relaxing some security. Inspections
should be run after every upgrade, to ensure that certain security
standards are once again in place. They should be run whether or not the
people doing the upgrades knowingly relaxed any standards; a rough sketch
of such an inspection follows.<br><br>
In addition to inspections to see if embezzlement is going on, there can
also be inspections to see if people are keying sensitive information into
data areas whose labeling says the contents are non-sensitive.<br><br>
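One way to spot-check for that is to scan supposedly non-sensitive
free-text fields for values that look sensitive. A minimal sketch, where
the patterns and field names are illustrative assumptions only:<br>
<pre>
# Scan "non-sensitive" fields for data that looks sensitive (here,
# US Social Security numbers and 16-digit card numbers).
import re

PATTERNS = {
    "possible SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "possible card number": re.compile(r"\b(?:\d[ -]?){16}\b"),
}

def scan_records(records, fields):
    """Yield (record_id, field, label) for each suspicious value found."""
    for rec in records:
        for field in fields:
            value = str(rec.get(field, ""))
            for label, pattern in PATTERNS.items():
                if pattern.search(value):
                    yield rec.get("id"), field, label

if __name__ == "__main__":
    sample = [
        {"id": 1, "comments": "Call back Tuesday"},
        {"id": 2, "comments": "SSN is 123-45-6789 per customer"},
    ]
    for rec_id, field, label in scan_records(sample, ["comments"]):
        print(f"Record {rec_id}, field '{field}': {label}")
</pre>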
It is not enough to train people and pass out policy manuals.
There has to be a process of testing that people are following the
rules, such as not photocopying or faxing certain sensitive information,
encrypting portable data storage devices that leave company property,
locking facilities properly every night, and promptly reporting
anything lost or stolen.<br><br>
Testing software changes is done because we expect that something may go
wrong, so the test database should not contain sensitive data on real
people, but rather simulated data with the same structure and formats as
the production data.
<br><br>
I had suggested in my workplace ... the IBM operating system tracks
software and data usage, so I can show how heavily we use what ... the
auditors can be told what is used to run our business on a regular basis
... they can designate 2-3 programs, data sets, etc. to be inspected by a
computer auditor who is an expert on our application systems, to produce a
report on what the code is really doing and how accurate it is, to be
matched with the external auditors' statement of how it has been
represented to them by the end users. Do the two stories match?
Depending on the results, they can see how frequently it is wise to pick
other such samples in future audits.<br><br>
I had suggested this because of the multiplicity of PC tools on people's
personal workstations, end users divorced from the internal logic of
those tools or of software designed by co-workers, and the evolving
business, where we are depending on tools designed years ago for
realities that no longer exist today.<br><br>
Manny Cho wrote:<br>
<blockquote type=cite class=cite cite=""><font size=2>I agree with
Sanford in that this incident (and all of the other loss notices that
post every day to this site) is indicative of the fact that the idea of
“one solution” or one perfect product is just not a reality today.
</font></blockquote></body>
</html>