Intelligence cycle security
Intelligence cycle security comprises the set of disciplines used to protect national intelligence programs, and, by extension, the overall defenses of nations, from hostile countermeasures and unauthorized information access. It is the role of intelligence cycle security to protect the process embodied in the intelligence cycle, and that which it defends.
A number of disciplines go into protecting the intelligence cycle. One challenge is the wide range of potential threats, so a complete threat assessment is a complex task. Governments try to protect three things:
- Their intelligence personnel
- Their intelligence facilities and resources
- Their intelligence operations
Defending the overall intelligence program, at a minimum, means taking actions to counter the major disciplines of intelligence collection techniques:
To these are added at least one complementary discipline, Counterintelligence (CI) which, besides defending the six above, can itself produce positive intelligence. Much, but not all, of what it produces is from special cases of HUMINT.
Also complementing intelligence collection are additional protective disciplines, which are unlikely to produce intelligence:
These disciplines, along with CI, form intelligence cycle security, which, in turn, is part of intelligence cycle management. Disciplines involved in "positive security" (measures by which one's own society collects information bearing on its actual or potential security) complement protective security. For example, when communications intelligence identifies a particular radio transmitter as one used only by a particular country, detecting that transmitter inside one's own country suggests the presence of a spy that counterintelligence should target.
CI refers to efforts made by intelligence organizations to prevent hostile or enemy intelligence organizations from successfully gathering and collecting intelligence against them. Frank Wisner, a well-known CIA operations executive, said of the autobiography of Director of Central Intelligence Allen W. Dulles that Dulles "disposes of the popular misconception that counterintelligence is essentially a negative and responsive activity, that it moves only or chiefly in reaction to situations thrust upon it and in counter to initiatives mounted by the opposition." Rather, he sees that CI can be most effective, both in information gathering and in protecting friendly intelligence services, when it creatively but vigorously attacks the "structure and personnel of hostile intelligence services." In the 1991 and 1995 US Army manuals dealing with counterintelligence, CI had a broader scope against the then-major intelligence collection disciplines. While MASINT was defined as a formal discipline in 1986, it was sufficiently specialized not to be discussed in general counterintelligence documents of the next few years.
All US departments and agencies with intelligence functions are responsible for their own security abroad. 
In many governments, the responsibility for protecting intelligence and military services is split. Historically, CIA assigned responsibility for protecting its personnel and operations to its Office of Security, while it assigned the security of operations to multiple groups within the Directorate of Operations: the counterintelligence staff and the area (or functional) unit, such as Soviet Russia Division. At one point, the counterintelligence unit operated quite autonomously, under the direction of James Jesus Angleton. Later, operational divisions had subordinate counterintelligence branches, as well as a smaller central counterintelligence staff. Aldrich Ames was in the Counterintelligence Branch of Europe Division, where he was responsible for directing the analysis of Soviet intelligence operations. US military services have had a similar and even more complex split.
Some of the overarching CI tasks are described as:
- Developing, maintaining, and disseminating multidiscipline threat data and intelligence files on organizations, locations, and individuals of CI interest. This includes insurgent and terrorist infrastructure and individuals who can assist in the CI mission.
- Educating personnel in all fields of security. A component of this is the multidiscipline threat briefing. Briefings can and should be tailored, both in scope and classification level. Briefings could then be used to familiarize supported commands with the nature of the multidiscipline threat posed against the command or activity.
Changes in doctrine for protecting the entire intelligence cycle?
The US definition of counterintelligence, however, is narrowing, while the definition of Operations Security (OPSEC) seems to be broadening. The manuals of the early 1990s described CI as responsible for overall detection of, and protection from, threats to the intelligence cycle. With the 2005-2007 National Counterintelligence Strategy statements, it is no longer clear what function is responsible for the overall protection of the intelligence cycle. In this recent US doctrine, although not necessarily that of other countries, counterintelligence (CI) is now seen primarily as a counter to human intelligence (HUMINT) collection by a Foreign Intelligence Service (FIS). FIS is a term of art that covers both nations and non-national groups, the latter including terrorists and organized crime involved in areas that are considered fundamental threats to national security.
The National Counterintelligence Strategy of 2005 states the CI mission as:
- counter terrorist operations
- seize advantage
- protect critical defense technology
- defeat foreign denial and deception
- level the economic playing field
- inform national security decisionmaking
- build a national CI system
The 2007 US joint intelligence doctrine restricts its primary scope to counter-HUMINT, which usually includes counter-terror. It is not always clear, under this doctrine, who is responsible for all intelligence collection threats against a military or other resource. The full scope of US military counterintelligence doctrine has been moved to a classified publication, Joint Publication (JP) 2-01.2, Counterintelligence and Human Intelligence Support to Joint Operations, so it cannot be known if that problem is clarified there.
Countermeasures to Specific Collection Disciplines
More specific countermeasures against the intelligence collection disciplines are listed below.
| Discipline | Offensive CI | Defensive CI |
|---|---|---|
| HUMINT | Counterreconnaissance, offensive counterespionage | Deception in operations security |
| SIGINT | Recommendations for kinetic and electronic attack | Radio OPSEC, use of secure telephones, SIGSEC, deception |
| IMINT | Recommendations for kinetic and electronic attack | OPSEC countermeasures, deception (decoys, camouflage); if accessible, use SATRAN reports of satellites overhead to hide or stop activities while being viewed |
See additional detail on Project Slammer, an Intelligence Community sponsored study of espionage, conducted by the Intelligence Community Staff under the Director of Central Intelligence.
Aspects of physical security, such as guard posts and wandering guards that challenge unidentified persons, certainly help with counter-HUMINT. Security education, also part of OPSEC, is important to this effort.
Military and security organizations will provide secure communications, and may monitor less secure systems, such as commercial telephones or general Internet connections, to detect inappropriate information being passed through them. Personnel must be educated on the need to use secure communications, and instructed in using them properly so that they do not become vulnerable to specialized technical interception. Methods including encryption and traffic-flow security may be needed in addition to, or instead of, specialized shielding of the equipment.
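As an illustrative sketch only, not a description of any actual military system, one basic traffic-flow-security technique is to pad every message to a constant frame size, so an eavesdropper cannot infer activity levels or message importance from transmission lengths. All names and the frame size here are hypothetical:

```python
import os

FRAME_SIZE = 256  # hypothetical fixed frame length, in bytes


def pad_message(payload: bytes) -> bytes:
    """Pad a payload to a constant frame size so transmitted lengths
    reveal nothing about the underlying message (traffic-flow security).
    Layout: 2-byte big-endian length header + payload + random filler."""
    if len(payload) > FRAME_SIZE - 2:
        raise ValueError("payload too large for one frame")
    header = len(payload).to_bytes(2, "big")
    filler = os.urandom(FRAME_SIZE - 2 - len(payload))
    return header + payload + filler


def unpad_message(frame: bytes) -> bytes:
    """Recover the original payload from a fixed-size frame."""
    n = int.from_bytes(frame[:2], "big")
    return frame[2 : 2 + n]


# Every frame is the same length, whatever the message says.
short = pad_message(b"OK")
long_ = pad_message(b"ARTILLERY STRIKE GRID 1234 AT 0600")
assert len(short) == len(long_) == FRAME_SIZE
assert unpad_message(long_) == b"ARTILLERY STRIKE GRID 1234 AT 0600"
```

In practice the padded frame would also be encrypted, so the filler is indistinguishable from the payload; padding alone hides lengths, not content.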
The methods possible in counter-SIGINT range from what is appropriate in a combat zone to what can be done in a research laboratory. At the combat end, if an enemy SIGINT interception antenna can be targeted, thoroughly bombing it and its associated electronics will definitely end its career as a SIGINT threat. In slightly less dramatic fashion, it can be taken under electronic attack, with strong enough electromagnetic signals directed at it to conceal any friendly signals, and perhaps to overload and destroy the electronics of the SIGINT facility. SIGINT protection for office buildings may require a room to be electronically shielded.
The basic methods of countering IMINT are to know when the opponent will use imaging against one's own side, and interfering with the taking of images. In some situations, especially in free societies, it must be accepted that public buildings may always be subject to photography or other techniques.
Countermeasures include putting visual shielding over sensitive targets or camouflaging them. When countering such threats as imaging satellites, awareness of the orbits can guide security personnel to stop an activity, or perhaps cover the sensitive parts, when the satellite is overhead. This also applies to imaging on aircraft and UAVs, although the more direct expedient of shooting them down, or attacking their launch and support area, is an option in wartime.
While the concept long predates the recognition of OSINT as a discipline, the idea of censorship of material directly relevant to national security is a basic OSINT defense. In democratic societies, even in wartime, censorship must be watched carefully lest it violate reasonable freedom of the press, but the balance is set differently in different countries and at different times.
Britain is generally considered to have a very free press, but the UK does have the DA-Notice, formerly D-Notice, system. Many British journalists find that this system is used fairly, although there will always be arguments. In the specific context of counterintelligence, note that Peter Wright, a former senior member of the Security Service who left the service without his pension, moved to Australia before publishing his book Spycatcher. While much of the book was reasonable commentary, it did reveal some specific and sensitive techniques, such as Operation RAFTER, a means of detecting the existence and setting of radio receivers.
Even though the principles of OPSEC go back to the beginning of warfare, formalizing OPSEC as a US doctrine began with a 1965 study called PURPLE DRAGON, ordered by the Joint Chiefs of Staff, to determine how the North Vietnamese could get early warning of ROLLING THUNDER fighter-bomber strikes against the North, and ARC LIGHT B-52 missions against the South. 
The methodology used was to consider what information the adversary would need to know in order to thwart the flights and the sources from which the adversary might collect this information. In the case of ARC LIGHT missions, for example, a routine warning message to friendly troops alerted opposing personnel that B-52's were on the way. Prior to the use of artillery in a given area, a radio broadcast was sent on friendly channels, warning of the impending use of medium or heavy artillery, or air strikes, near certain grid coordinates. Only in the case of an ARC LIGHT mission, however, was the phrase "very heavy artillery" used.
It became apparent to the team that although traditional security and intelligence countermeasures programs existed, reliance solely upon them was insufficient to deny critical information to the enemy, especially information and indicators relating to intentions and capabilities. The group conceived and developed the methodology of analyzing U.S. operations from an adversarial viewpoint to find out how the information was obtained.
The team then recommended corrective actions to local commanders. They were successful in what they did, and to name what they had done, they coined the term "operations security."
Establishment within DOD
After the Vietnam War ended in 1973, the JCS adopted the OPSEC instruction developed by the CINCPAC OPSEC branch and published it as JCS Publication 18, "Doctrine for Operations Security". The publication was originally classified; an unclassified version was published the following year. The JCS published the first JCS OPSEC Survey Planning Guide, and distributed this publication within DOD and to other Federal agencies.
The work was further publicized with annual OPSEC conferences for representatives throughout government. Attendees discussed ways to adapt the OPSEC concept developed for combat operations to the peacetime environment. Throughout the late 1970s, the military services established their own OPSEC programs and published implementing directives and regulations. Somewhat to the surprise of the military, some civilian organizations, which dealt with classified information, had developed very good OPSEC procedures.
Establishment outside DOD
After the Vietnam War, a number of the individuals who had been involved in the development or application of the OPSEC concept were either already working for, or went to work with the National Security Agency (NSA). As a result, NSA played a major role in adapting OPSEC to peacetime operations, and in informally communicating OPSEC concepts to other agencies. Thus, non-DoD agencies began to establish their own programs, and OPSEC was on its way to becoming a national program.
Establishment within DOE
The U.S. Department of Energy, which had responsibility for nuclear weapons research and production, began to assume a role in the peacetime application of OPSEC by participating in the OPSEC conferences and interfacing with other Federal agencies. In 1980, DOE began establishing its own OPSEC program and, by 1983, the Department had the only formal OPSEC program outside DOD. Since that time, DOE has continued to refine and adapt the OPSEC concept to meet the specific needs of its mission.
In 1988, President Ronald Reagan issued National Security Decision Directive 298 (NSDD-298), which established a national policy and outlined the OPSEC five-step process. Also mandated within NSDD-298 was the establishment of the Interagency OPSEC Support Staff (IOSS). The IOSS mission is to assist in implementing the national-level OPSEC program as directed by the President. The NSDD directs IOSS to provide or facilitate OPSEC training and act as a consultant to Executive departments and agencies required to have OPSEC programs.
Operations security (OPSEC), in a widely accepted meaning, relates to identifying the information that is most critical to protect regarding future operations, and planning activities to:
- Identifying those actions that can be observed by adversary intelligence systems
- Determining indicators that adversary intelligence systems might obtain that could be interpreted or pieced together to derive critical information in time to be useful to adversaries
- Designing and executing measures that eliminate or reduce to an acceptable level the vulnerabilities of friendly actions to adversary exploitation.
Contrary to the US Department of Defense definition, a 2007 webpage of the US Intelligence Board  describes (emphasis added) "the National Operations Security (OPSEC) Program - a means to identify, control, and protect unclassified information and evidence associated with U.S. national security programs and activities. If not protected, such information often provides an opportunity for exploitation by adversaries or competitors working against the interests of the US."
Even though there are legitimate concerns about adversaries exploiting open societies, there must be vigilance that the broader definition of OPSEC does not allow national security to be invoked to conceal the politically embarrassing or the marginally legal, rather than to protect legitimate functions.
- Identification of the critical information to be protected: Basic to the OPSEC process is determining what information, if available to one or more adversaries, would harm an organization's ability to effectively carry out the operation or activity. This critical information constitutes the "core secrets" of the organization, i.e., the few nuggets of information that are central to the organization's mission or the specific activity. Critical information usually is, or should be, classified, or at least protected as sensitive unclassified information.
- Analysis of the threats: Knowing who the adversaries are and what information they require to meet their objectives is essential in determining what information is truly critical to an organization's mission effectiveness. In any given situation, there is likely to be more than one adversary and each may be interested in different types of information. The adversary's ability to collect, process, analyze, and use information, i.e., the threat, must also be determined.
- Analysis of the vulnerabilities: Determining the organization's vulnerabilities involves systems analysis of how the operation or activity is actually conducted by the organization. The organization and the activity must be viewed as the adversaries will view it, thereby providing the basis for understanding how the organization really operates and what are the true, rather than the hypothetical, vulnerabilities.
- Assessment of the risks: Vulnerabilities and specific threats must be matched. Where the vulnerabilities are great and the adversary threat is evident, the risk of adversary exploitation is expected. Therefore, a high priority for protection needs to be assigned and corrective action taken. Where the vulnerability is slight and the adversary has a marginal collection capability, the priority should be low.
- Application of the countermeasures: Countermeasures need to be developed that eliminate the vulnerabilities, threats, or utility of the information to the adversaries. The possible countermeasures should include alternatives that may vary in effectiveness, feasibility, and cost. Countermeasures may include anything that is likely to work in a particular situation. The decision of whether to implement countermeasures must be based on cost/benefit analysis and an evaluation of the overall program objectives.
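The risk-assessment step above pairs each vulnerability with the adversary's collection capability to set a protection priority. A minimal sketch of that pairing rule follows; the level names and scoring thresholds are hypothetical illustrations, not any agency's actual methodology:

```python
def opsec_priority(vulnerability: str, threat: str) -> str:
    """Toy prioritization rule for the OPSEC risk-assessment step:
    great vulnerability combined with an evident threat demands a high
    protection priority; slight vulnerability against a marginal
    collection capability can be ranked low. Levels are hypothetical."""
    vuln_score = {"slight": 0, "moderate": 1, "great": 2}
    threat_score = {"marginal": 0, "moderate": 1, "evident": 2}
    score = vuln_score[vulnerability] + threat_score[threat]
    if score >= 3:
        return "high"
    if score == 2:
        return "medium"
    return "low"


# Great vulnerability + evident threat -> corrective action needed now.
assert opsec_priority("great", "evident") == "high"
# Slight vulnerability + marginal collection capability -> low priority.
assert opsec_priority("slight", "marginal") == "low"
```

A real assessment would weigh cost, feasibility, and mission impact rather than a simple additive score, but the matching of vulnerability against threat is the core of the step.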
OPSEC complements (emphasis in (NSDD 298)) physical, information, personnel, computer, signals, communications, and electronic security measures. Note that this does not include counter-HUMINT or counter-IMINT. If the definition of counterintelligence is redefined to cover counter-HUMINT, a scope begins to emerge, although an official term still seems lacking to deal with the totality of security against all threats. This series of articles simply tries to define it for the security of intelligence.
Another way of describing the scope was coined, by DOE, as the "Laws of OPSEC":
- 1. If you don't know the threat, how do you know what to protect? Although specific threats may vary from site to site or program to program, employees must be aware of the actual and postulated threats. In any given situation, there is likely to be more than one adversary, although each may be interested in different information.
- 2. If you don't know what to protect, how do you know you are protecting it? The "what" is the critical and sensitive, or target, information that adversaries require to meet their objectives.
- 3. If you are not protecting it (the critical and sensitive information), the adversary wins! OPSEC vulnerability assessments (referred to as "OPSEC assessments" - OAs - or sometimes as "surveys") are conducted to determine whether or not critical information is vulnerable to exploitation. An OA is a critical analysis of "what we do" and "how we do it" from the perspective of an adversary. Internal procedures and information sources are also reviewed to determine whether there is an inadvertent release of sensitive information.
OPSEC measures may include, but are not limited to, counterimagery, cover, concealment, and deception.
The Director, National Security Agency, was designated Executive Agent for interagency OPSEC training. This role meant assisting Executive departments and agencies, as needed, to establish OPSEC programs; providing interagency OPSEC training courses; and running an Interagency OPSEC Support Staff (IOSS), whose membership was to include, at a minimum, a representative of the Department of Defense, the Department of Energy, the Central Intelligence Agency, the Federal Bureau of Investigation, and the General Services Administration. The IOSS was to:
- Carry out interagency, national level OPSEC training for executives, program managers, and OPSEC specialists;
- Act as a consultant to Executive departments and agencies in connection with the establishment of OPSEC programs and OPSEC surveys and analyses; and
- Provide an OPSEC technical staff for the appropriate policy committee.
Nothing in this directive:
- Is intended to infringe on the authorities and responsibilities of the Director of Central Intelligence to protect intelligence sources and methods, nor those of any member of the Intelligence Community as specified in Executive Order No. 12333; or
- Implies an authority on the part of the SIG-I Interagency Group for Countermeasures (Policy) or the NOAC to examine the facilities or operations of any Executive department or agency without the approval of the head of such Executive department or agency.
It should be noted that NSDD 298 preceded the creation of a Director of National Intelligence (DNI), who assumed some responsibilities previously held by the Director of Central Intelligence (DCI). The Director of Central Intelligence was no longer the head of the intelligence community, and was retitled the Director of the Central Intelligence Agency (DCIA). The National Counterintelligence Executive (NCIX), reporting directly to the DNI, also was created after NSDD 298. It is not clear, therefore, whether the DCI responsibilities cited above may have moved to the DNI or NCIX.
Communications security forms an essential part of counterintelligence, preventing an adversary from intercepting sensitive information that is transmitted, especially through free space, but also through wired networks capable of being wiretapped. It includes several disciplines, including those that protect against hostile acquisition of information either from the patterns of message flow or from the content of messages (e.g., encryption). It also includes a number of disciplines that protect against the inadvertent emanation of unprotected information from communications systems.
This article discusses physical security in the context of information cycle security; see Physical security for a more general view of the topic. Protection of both sensitive information in human-readable form, as well as of cryptographic equipment and keys, is the complement of communications security. The strongest cryptography in the world cannot protect information that, when not being sent through strong cryptographic channels, is left in an area where it can be copied or stolen.
It is useful to look at some of the extremes of physical security, to have a standard of reference. A US Sensitive Compartmented Information Facility (SCIF) is a room or set of rooms, built to stringent construction standards, to contain the most sensitive materials, and where discussions of any security level can take place. A SCIF is not a bunker; it might slow down, but certainly not stop, a determined entry attempt that used explosives.
There can be individual variations in construction based on specific conditions; the standards for a SCIF inside a secure building on a military base in the continental US are not quite as stringent as for one in an office building in a hostile country. In the past, it was assumed that the SCIF itself, and a good deal of the electronics in it, needed specialized and expensive shielding and other methods to protect against eavesdropping; most of these fall under the unclassified code word TEMPEST, and TEMPEST requirements have generally been waived for facilities in the continental US. At the other extreme, the secure conference and work area in the US Embassy in Moscow, where the new building was full of electronic bugs, required extra security measures making it a "bubble" inside as secure a room as was possible. In each case, however, the plans must be preapproved by a qualified security expert, and the SCIF inspected before use and periodically reinspected.
Facilities for less sensitive material do not need the physical protection provided by a SCIF. While the most recent policies should be consulted, TOP SECRET needs an alarmed (with response) high-security filing cabinet with a combination lock; SECRET may relax the alarm requirement but still require the same high-security filing cabinet; CONFIDENTIAL may accept a regular filing cabinet, which has had a bracket welded to each drawer, and a strong steel bar run through the brackets and secured with a combination padlock.
Other security measures might include closing drapes or blinds over the window of a room where classified information is handled, and which might otherwise be photographed by an adversary at a height comparable to that of the window.
No matter how physically secure a SCIF may be, it is no more secure than the people with access to it. One severe security breach was by a clerk, Christopher John Boyce, who worked inside the SCIF that held communications equipment and stored documents for a TRW facility in Redondo Beach, California. At this facility, among other sensitive programs, TRW built US reconnaissance satellites for the National Reconnaissance Office and Central Intelligence Agency. Boyce stole documents and gave them to a drug dealer, Andrew Daulton Lee, who sold them to the Soviet KGB. Personnel security measures are as important as physical security. Counter-HUMINT measures might have detected the documents being transferred and taken to the Soviet Embassy in Mexico City.
Many of the greatest enemy penetrations of a country's secrets came from people who were already trusted by that government. Logically, if your service desires to find out your opponent's secrets, you do not try to recruit a person that would have no access to those secrets -- unless the person you recruit gives you access to such a person.
The general process of determining whether a person can be trusted is security clearance; the British term, used in many Commonwealth countries, is positive vetting. In the most sensitive positions, clearances are periodically renewed. Indications of possible compromise, such as spending patterns that are inconsistent with one's known income, are supposed to be reported to personnel security officers.
Security clearance is a starting point, but, while respecting individual rights, a certain amount of monitoring is appropriate. Some of the worst US betrayals of recent years have been money-driven, as with Aldrich Ames and Robert Hanssen. Electronic reviews of financial records, raising flags if there are irregularities that need to be checked, may be among the least obtrusive means of checking for unexplained income. There are cases where such checks simply revealed an inheritance, and no further questions were raised.
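The flag-raising logic of such a financial review can be sketched very simply: compare observed deposits against known income and flag a large unexplained excess for human follow-up. The tolerance factor and function name below are hypothetical illustrations, not any agency's actual rule:

```python
def flag_unexplained_income(known_annual_income: float,
                            observed_deposits: list[float],
                            tolerance: float = 1.25) -> bool:
    """Toy version of an electronic financial review: raise a flag when
    total deposits in a year exceed known income by more than a
    tolerance factor. A flag only triggers a human inquiry; a real
    review would then check for innocent explanations, such as an
    inheritance, before raising any security concern."""
    return sum(observed_deposits) > known_annual_income * tolerance


# Deposits consistent with salary: no flag.
assert flag_unexplained_income(60000, [5000] * 12) is False
# Deposits four times the known income: flag for review.
assert flag_unexplained_income(60000, [20000] * 12) is True
```

The point of the least-obtrusive design is that most records are never read by a person; only flagged anomalies receive attention.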
Detecting other vulnerabilities may come from things that are simply good management. If someone is constantly clashing with colleagues, it is good people management, not uniquely security management, for a manager to talk with the individual, or have them talk to an Employee Assistance Program. It is not considered a disqualification to seek mental health assistance; it may be considered excellent judgment. For people with especially high security access, it may be requested that they see a therapist from a list of professionals with security clearance, so there is not a problem if something sensitive slips or they need to discuss a conflict.
- Dulles, Allen W. (1977). The Craft of Intelligence. Greenwood. Dulles-1977.
- Wisner, Frank G. (22 Sep 1993). On "The Craft of Intelligence".
- US Department of the Army (15 January 1991). Field Manual 34-37: Echelons above Corps (EAC) Intelligence and Electronic Warfare (IEW) Operations.
- US Department of the Army (3 October 1995). Field Manual 34-60: Counterintelligence.
- Interagency OPSEC Support Staff (IOSS) (May 1996). Operations Security Intelligence Threat Handbook: Section 2, Intelligence Collection Activities and Disciplines.
- US Army (May 2004). Chapter 9: Measurement and Signals Intelligence. Field Manual 2-0, Intelligence. Department of the Army.
- Matschulat, Austin B. (2 July 1996). Coordination and Cooperation in Counterintelligence.
- Van Cleave, Michelle K. (April 2007). Counterintelligence and National Strategy. School for National Security Executive Education, National Defense University (NDU).
- Joint Chiefs of Staff (22 June 2007). Joint Publication 2-0: Intelligence.
- History of OPSEC.
- US Intelligence Board (2007), Planning and Direction, USIB 2007
- National Security Decision Directive 298: National Operations Security Program (July 15, 2003).
- Department of Energy. An Operational Security (OPSEC) Primer.
- Director of Central Intelligence (18 November 2002). Physical Security Standards for Sensitive Compartmented Information Facilities.
- Lindsey, Robert (1979). The Falcon and the Snowman: A True Story of Friendship and Espionage. Lyons Press.