
"Aggregating The Knowledge Base Of Computer Systems To Proactively Protect A Computer From Malware"

Abstract: In accordance with the present invention, a system, method, and computer-readable medium for aggregating the knowledge base of a plurality of security services or other event collection systems to protect a computer from malware is provided. One aspect of the present invention is a method that proactively protects a computer from malware. More specifically, the method comprises: using anti-malware services or other event collection systems to observe suspicious events that are potentially indicative of malware; determining if the suspicious events satisfy a predetermined threshold; and if the suspicious events satisfy the predetermined threshold, implementing a restrictive security policy designed to prevent the spread of malware.


Patent Information

Application #
Filing Date
28 February 2006
Publication Number
32/2007
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2019-08-26
Renewal Date

Applicants

MICROSOFT CORPORATION
ONE MICROSOFT WAY, REDMOND, WASHINGTON 98052, UNITED STATES OF AMERICA.

Inventors

1. ANIL FRANCIS THOMAS
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.
2. EFIM HUDIS
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.
3. MICHAEL KRAMER
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.
4. MIHAI COSTEA
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.
5. PRADEEP BAHL
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.
6. RAJESH K. DADHIA
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.
7. YIGAL EDERY
ONE MICROSOFT WAY, REDMOND, WA 98052, UNITED STATES OF AMERICA.

Specification

AGGREGATING THE KNOWLEDGE BASE OF COMPUTER SYSTEMS TO PROACTIVELY PROTECT A COMPUTER FROM MALWARE
FIELD OF THE INVENTION
The present invention relates to computers and, more particularly, to dynamically protecting a computer from malware.
BACKGROUND OF THE INVENTION
As more and more computers and other computing devices are interconnected through various networks, such as the Internet, computer security has become increasingly more important, particularly from invasions or attacks delivered over a network or over an information stream. As those skilled in the art and others will recognize, these attacks come in many different forms, including, but certainly not limited to, computer viruses, computer worms, system component replacements, denial of service attacks, theft of information, even misuse/abuse of legitimate computer system features, all of which exploit one or more computer system vulnerabilities for illegitimate purposes. While those skilled in the art will realize that the various computer attacks are technically distinct from one another, for purposes of the present invention and for simplicity in description, all of these attacks will be generally referred to hereafter as computer malware, or more simply, malware.
When a computer system is attacked or "infected" by a computer malware, the adverse results are varied, including disabling system devices; erasing or corrupting firmware, applications, or data files; transmitting potentially sensitive data to another location on the network; shutting down the computer system; or causing the computer system to crash. Yet another pernicious aspect of many, though not all, computer malware is that an infected computer system is used to infect other computer systems.
FIGURE 1 is a pictorial diagram illustrating an exemplary networked environment 100 over which a computer malware is commonly distributed. As shown in FIGURE 1, the typical exemplary networked environment 100 includes a plurality of computers 102-108, all interconnected via a communication network 110, such as an intranet, or via a larger communication network, including the global TCP/IP network commonly referred to as the Internet. For whatever reason, a malicious party on a computer connected to the network 110, such as computer 102, develops a computer malware 112 and releases it on the network 110. The released computer malware 112 is received by and infects one or more computers, such as computer 104, as indicated by arrow 114. As is typical with many computer malware, once infected, computer 104 is used to infect other computers, such as computer 106, as indicated by arrow 116, which in turn infects yet other computers, such as computer 108, as indicated by arrow 118.
As antivirus software has become more sophisticated and efficient at recognizing thousands of known computer malware, so too have the computer malware become more sophisticated. For example, many recent computer malware are now polymorphic or, in other words, they have no identifiable pattern or "signature" by which they can be recognized by antivirus software in transit. These polymorphic malware are frequently unrecognizable by antivirus software because they modify themselves before propagating to another computer system.
As vulnerabilities are identified and addressed in an operating system or other computer system components, such as device drivers and software applications, the operating system provider will typically release a software update to remedy the vulnerability. These updates, frequently referred to as patches, should be installed on a computer system in order to secure the computer system from the identified vulnerabilities. However, these updates are, in essence, code changes to components of the operating system, device drivers, or software applications. As such, they cannot be released as rapidly and freely as antivirus updates from antivirus software providers. Because these updates are code changes, the software updates require substantial in-house testing prior to being released to the public.
Under the present system of identifying malware and addressing vulnerabilities, computers are susceptible to being attacked by malware in certain circumstances. For example, a computer user may not install patches and/or updates to antivirus software. In this instance, malware may propagate on a network between computers that have not been adequately protected against the malware. However, even when a user regularly updates a computer, there is a period of time, referred to hereafter as a vulnerability window, that exists between when a new computer malware is released on the network and when antivirus software or an operating system component may be updated to protect the computer system from the malware. As the name suggests, it is during this vulnerability window that a computer system is vulnerable, or exposed, to the new computer malware.
FIGURE 2 is a block diagram of an exemplary timeline that illustrates a vulnerability window. In regard to the following discussion, significant times or events will be identified and referred to as events in regard to a timeline. While most malware released today are based on known vulnerabilities, occasionally, a computer malware is released on the network 110 that takes advantage of a previously unknown vulnerability. FIGURE 2 illustrates a vulnerability window 204 with regard to a timeline 200 under this scenario. Thus, as shown on the timeline 200, at event 202, a malware author releases a new computer malware. As this is a new computer malware, there is neither an operating system patch nor an antivirus update available to protect vulnerable computer systems from the malware. Correspondingly, the vulnerability window 204 is opened.
At some point after the new computer malware is circulating on the network 110, the operating system provider and/or the antivirus software provider detects the new computer malware, as indicated by event 206. As those skilled in the art will appreciate, typically, the presence of the new computer malware is detected within a matter of hours by both the operating system provider and the antivirus software provider.
Once the computer malware is detected, the antivirus software provider can begin its process to identify a pattern or "signature" by which the antivirus software may recognize the computer malware. Similarly, the operating system provider begins its process to analyze the computer malware to determine whether the operating system must be patched to protect it from the computer malware. As a result of these parallel efforts, at event 208, the operating system provider and/or the antivirus software provider releases an update, i.e., a software patch to the operating system or antivirus software, which addresses the computer malware. Subsequently, at event 210, the update is installed on a user's computer system, thereby protecting the computer system and bringing the vulnerability window 204 to a close.
As can be seen from the examples described above, which are only representative of all of the possible scenarios in which computer malware pose security threats to a computer system, a vulnerability window 204 exists between the time that a computer malware 112 is released on a network 110 and when a corresponding update is installed on a user's computer system. Sadly, whether the vulnerability window 204 is large or small, an infected computer costs the computer's owner substantial amounts of money to "disinfect" and repair. This cost can be enormous when dealing with large corporations or entities that may have thousands or hundreds of thousands of devices attached to the network 110. Such a cost is further amplified by the possibility that the malware may tamper with or destroy user data, all of which may be extremely difficult or impossible to trace and remedy.
To counter the threats presented by malware, an increasing number of anti-malware services and other event detection systems have been developed to monitor entry points and/or data streams for different types of malware. For example, in the context of anti-malware services, many computers now employ firewalls, behavior blockers, and anti-spyware systems to protect a computer in addition to traditional antivirus software. Those skilled in the art and others will recognize that anti-malware services are typically capable of identifying (1) code and/or activities that are known to be characteristic of malware, and (2) code and/or activities that are "suspicious" or potentially characteristic of malware. When code and/or activities that are known to be characteristic of malware are identified, a malware handling routine will be used to "disinfect" or remove the malware from the computer. However, in instances when code and/or activities are identified that are suspicious, the anti-malware services may not have enough information to declare, with sufficient accuracy, that the code and/or activities are actually characteristic of malware. Moreover, other event detection systems have been developed to monitor entry points, data streams, computer attributes and/or activities, for a number of different purposes.
For example, some operating systems track the amount of processing performed by a Central Processing Unit ("CPU"), as well as certain significant "events" related to a computer that may be useful when proactively protecting a computer from malware.
SUMMARY OF THE INVENTION
The foregoing problems with the state of the prior art are overcome by the principles of the present invention, which are directed toward a system, method, and computer-readable medium for aggregating the knowledge base of a plurality of anti-malware services and other event detection systems to proactively protect a computer from malware.
One aspect of the present invention is a method for protecting a stand-alone computer that maintains a plurality of anti-malware services and/or event detection systems from malware. More specifically, the method comprises (1) using the anti-malware services and/or event detection systems to observe suspicious events that are potentially indicative of malware; (2) determining if the suspicious events satisfy a predetermined threshold; and (3) if the suspicious events satisfy the predetermined threshold, implementing a restrictive security policy on the computer. In some instances, a security policy may be invoked that takes general security measures, such as blocking most, if not all, incoming and outgoing network traffic. In other instances, the restrictive security policy may limit the resources available to an entity, so that a computer may not be reinfected with malware.
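The three-step method recited above can be sketched in a few lines of code. This is an illustrative sketch only; the event names, the threshold value, and the policy labels are invented for the example and do not come from the specification.

```python
# Sketch of the claimed method: (1) observe suspicious events,
# (2) test them against a predetermined threshold, and (3) implement
# a restrictive security policy when the threshold is satisfied.

SUSPICIOUS_EVENT_THRESHOLD = 3  # illustrative value, not from the patent


def protect(observed_events, threshold=SUSPICIOUS_EVENT_THRESHOLD):
    """Return the security policy implied by the observed suspicious events."""
    if len(observed_events) >= threshold:
        # e.g. block most, if not all, incoming and outgoing network traffic
        return "restrictive"
    return "normal"


events = ["unusual-port-access", "encrypted-file", "extensibility-point-write"]
print(protect(events))  # three suspicious events meet the illustrative threshold
```

Note that the threshold, not any single event, triggers the policy change; each individual event remains merely "suspicious."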
Another aspect of the present invention is a software system that aggregates the knowledge base of a plurality of anti-malware services and/or event detection systems to protect a computer from malware. In one embodiment of the present invention, the software system includes a data collector component, a data analyzer module, and a policy implementer. The data collector component is operative to collect data from different anti-malware systems and/or event detection systems installed on a computer. In this embodiment, the data collected describes suspicious events that are potentially indicative of malware. At various times, the data analyzer module may make a determination regarding whether data collected by the data collector component, taken as a whole, is indicative of malware. If the data analyzer module determines malware exists with sufficient certainty, the policy implementer may impose a restrictive security policy that restricts access to resources of the computer.
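One plausible way to organize the three cooperating components of this embodiment is sketched below. All class and method names here are hypothetical illustrations chosen for clarity, not identifiers from the patent.

```python
# Sketch of the data collector / data analyzer / policy implementer
# architecture described in this embodiment.

class DataCollector:
    """Gathers suspicious-event reports from installed services."""
    def __init__(self):
        self.events = []

    def report(self, source, event):
        self.events.append((source, event))


class DataAnalyzer:
    """Decides whether the collected events, taken as a whole, indicate malware."""
    def __init__(self, threshold=3):
        self.threshold = threshold

    def indicates_malware(self, collector):
        return len(collector.events) >= self.threshold


class PolicyImplementer:
    """Imposes a restrictive policy that limits access to computer resources."""
    def __init__(self):
        self.restricted = False

    def impose_restrictive_policy(self):
        self.restricted = True


collector = DataCollector()
for source, event in [("firewall", "port-scan"),
                      ("antivirus", "encrypted-file"),
                      ("metrics", "cpu-spike")]:
    collector.report(source, event)

analyzer, policy = DataAnalyzer(), PolicyImplementer()
if analyzer.indicates_malware(collector):
    policy.impose_restrictive_policy()
print(policy.restricted)
```

The design point is the separation of concerns: the collector is the only interface the services see, the analyzer holds the aggregation logic, and the implementer owns enforcement.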
In still another embodiment, a computer-readable medium is provided with contents, i.e., a program that causes a computer to operate in accordance with the method described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
FIGURE 1 is a pictorial diagram illustrating a conventional networked environment 100 over which malware is commonly distributed;
FIGURE 2 is a block diagram illustrating an exemplary timeline that demonstrates how a vulnerability window may occur in the prior art;
FIGURE 3 is a block diagram that illustrates components of a computer capable of aggregating the knowledge base of different anti-malware services and other event collection systems installed on a computer to proactively protect the computer from malware in accordance with the present invention; and
FIGURE 4 is a flow diagram illustrating one embodiment of a method implemented in a computer that protects the computer from malware in accordance with the present invention.
DETAILED DESCRIPTION
In accordance with the present invention, a system, method, and computer-readable medium for aggregating the knowledge base of a plurality of security services and/or other event detection systems to protect a computer from malware is provided. Although the present invention will primarily be described in the context of protecting a computer from malware using different anti-malware services, those skilled in the relevant art and others will appreciate that the present invention is also applicable to other software systems than those described. For example, aspects of the present invention may be configured to use any of the event detection systems that are currently available or yet to be developed. The following description first provides an overview of aspects of a software system in which the present invention may be implemented. Then, a method that implements the present invention is described. The illustrative examples provided herein are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Similarly, any steps described herein may be interchangeable with other steps or combinations of steps in order to achieve the same result.
Now with reference to FIGURE 3, components of a computer 300 that is capable of implementing aspects of the present invention will be described. The computer 300 may be any one of a variety of devices including, but not limited to, personal computing devices, server-based computing devices, personal digital assistants, cellular telephones, other electronic devices having some type of memory, and the like. For ease of illustration and because it is not important for an understanding of the present invention, FIGURE 3 does not show the typical components of many computers, such as a CPU, keyboard, a mouse, a printer, other I/O devices, a display, etc. However, the computer 300 depicted in FIGURE 3 does include antivirus software 302, a firewall application 304, a behavior blocker 306, anti-spyware software 308, and a metrics system 309. Also, the computer 300 implements aspects of the present invention in an aggregation routine 310 that includes a data collector component 312, a data analyzer module 314, and a policy implementer 316.
Those skilled in the art and others will recognize that an increasing number of anti-malware security services are being made available to protect against all different types of malware at various entry points or data streams on a computer. For example, one defense that is common today for protecting a computer against malware is antivirus software 302. Generally described, traditional antivirus software 302 searches data being accessed from an input/output ("I/O") device, such as a disk, for a "signature" that is characteristic of malware. Also, increasingly, antivirus software 302 is performing heuristic malware detection techniques designed to measure activities that are characteristic of malware.
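The signature search described above can be illustrated with a toy scanner. The byte pattern and the malware name below are invented for the example; real signature databases are far larger and more sophisticated.

```python
# Toy illustration of signature-based scanning: the scanner looks for a
# known byte pattern in data read from an I/O device, such as a disk.

KNOWN_SIGNATURES = {b"\xde\xad\xbe\xef": "Example.Worm.A"}  # invented


def scan(data: bytes):
    """Return the name of the matched malware signature, or None if clean."""
    for signature, name in KNOWN_SIGNATURES.items():
        if signature in data:
            return name
    return None


print(scan(b"header" + b"\xde\xad\xbe\xef" + b"payload"))  # Example.Worm.A
print(scan(b"clean file contents"))                        # None
```

The limitation the specification goes on to discuss follows directly from this design: a polymorphic malware that rewrites its own bytes leaves no fixed pattern for `scan` to find.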
Another defense that is common today in protecting against computer malware is a firewall application 304. Those skilled in the art will recognize that the firewall application 304 is an anti-malware system that protects an internal network from unauthorized access originating from external networks by controlling the flow of information between the internal network and the external networks. All communication originating outside of the internal network is sent through the firewall application 304, which examines the communication and determines whether it is safe or permissible to accept the communication.
Another anti-malware service that is currently available is a behavior blocker 306, which implements policies designed to allow benign activities to occur while interceding when activities that are contrary to policy are scheduled. Typically, a behavior blocker 306 implements a "sandbox" in which code that is potentially infected with malware is analyzed to determine whether the code performs an unacceptable behavior. For example, an unacceptable behavior may take the form of generating a mass e-mailing that will be distributed to entities found in a user's address book. Similarly, unacceptable behavior may be defined as making changes to multiple entries in important databases, like a system registry. In any event, the behavior blocker 306 analyzes programs and implements policies designed to prevent unacceptable behaviors.
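The policy check at the heart of a behavior blocker can be sketched as a deny-list lookup. The behavior names below are illustrative labels for the two examples in the paragraph above, not identifiers from the patent.

```python
# Sketch of a behavior-blocker policy check: proposed actions are compared
# against a set of unacceptable behaviors before they are allowed to run.

UNACCEPTABLE_BEHAVIORS = {
    "mass-email-address-book",    # mass mailing to a user's address book
    "bulk-registry-modification", # changes to many system-registry entries
}


def allow(action: str) -> bool:
    """Intercede only when the scheduled action is contrary to policy."""
    return action not in UNACCEPTABLE_BEHAVIORS


print(allow("open-document"))            # benign activity proceeds
print(allow("mass-email-address-book"))  # intercepted by policy
```

A real behavior blocker would observe these actions inside a sandbox rather than receive them as labeled strings; the sketch only shows the allow/intercede decision.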
Increasingly, other types of anti-malware services are being developed to identify and "disinfect" different types of malware from a computer. For example, the anti-spyware software 308 is designed to identify programs that track actions performed by a user. While spyware may not cause damage to the computer 300, as occurs with other types of malware, some users find it invasive to have their actions tracked and reported to an unknown entity. In this instance, a user may install the anti-spyware software 308 that identifies and disinfects this type of malware from a computer.
Those skilled in the art and others will recognize that certain event detection systems may monitor computer entry points, data streams, and/or computer events and activities. Typically, event detection systems will not only provide logic for identifying events that occur on a computer but also maintain databases, event logs, and additional types of resources for obtaining data about the events observed. For example, as illustrated in FIGURE 3, the computer 300 maintains a metrics system 309 designed to observe and record various performance metrics of the computer 300. In this regard, the metrics system 309 may monitor CPU usage, the occurrence of page faults, termination of processes, and other performance characteristics of the computer 300. As described in further detail below, patterns in the performance characteristics of the computer 300, and other events that occur on the computer, may be indicative of a malware. While a specific event detection system (e.g., the metrics system 309) has been illustrated and described, those skilled in the art and others will recognize that other types of event detection systems may be included in the computer 300 without departing from the scope of the present invention.
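A metrics system of the kind described can be sketched as an append-only log of counter samples from which a sharp jump can later be read back as a suspicious pattern. The counter names, values, and the jump factor are invented for the example.

```python
# Sketch of an event-detection metrics log: performance-counter samples
# are recorded, and a dramatic jump in a counter can be detected later.

metrics_log = []


def record(counter: str, value: float):
    """Append one performance-counter sample to the event log."""
    metrics_log.append((counter, value))


def latest_jump(counter: str, factor: float = 3.0) -> bool:
    """True if the newest sample is at least `factor` times the previous one."""
    samples = [v for c, v in metrics_log if c == counter]
    return len(samples) >= 2 and samples[-1] >= factor * samples[-2]


record("cpu-usage", 12.0)
record("cpu-usage", 95.0)
print(latest_jump("cpu-usage"))  # a dramatic CPU-usage increase was logged
```

As with the other services, the jump is recorded as a pattern worth reporting, not as a malware verdict on its own.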
Those skilled in the art and others will recognize that the anti-malware systems 302, 304, 306, 308, and the event detection system 309 described above with reference to FIGURE 3 should be construed as exemplary and not limiting of the present invention. For example, the present invention may be implemented with so-called intrusion detection systems that attempt to detect unauthorized access to the computer 300 by reviewing logs or other information available from a network without departing from the scope of the present invention. Instead, aspects of the present invention may be implemented using different anti-malware systems and other event detection systems than those illustrated and described. Also, aspects of the present invention may be implemented in conjunction with any number of anti-malware services and event detection systems. For example, the anti-spyware software 308 is represented with dashed lines to indicate that the present invention may be used if the computer 300 only included the antivirus software 302, the firewall application 304, the behavior blocker 306, and the metrics system 309, and not the anti-spyware software 308. Thus, additional or fewer anti-malware services and event detection systems may be added or removed from the computer 300 in other embodiments of the present invention.
While the accuracy of anti-malware services in detecting increasingly sophisticated malware has improved, existing anti-malware services are limited to detecting malware in specific domains. As a result, these stand-alone anti-malware services have inherent limitations. For example, the firewall application 304 detects malware by monitoring incoming and outgoing network activity and is limited by the manner in which data is transmitted over a network. Those skilled in the art and others will recognize that a client-based computer typically requests one or more files when obtaining data from a server-based computer. In this instance, components of modern networks segment the file into smaller units ("packets") in order to transmit the packets over a limited bandwidth network connection. The packets are transmitted over the network and are individually scanned by the firewall application 304 for malware when they arrive on the client-based computer. Thus, the firewall application 304 may not have a complete file when scanning packets for malware and, as a result, may not be able to positively detect malware in all instances.
Even though the firewall application 304 may not be able to positively detect malware in all instances, the firewall application 304 may collect, or be easily configured to collect, data that is a strong heuristic indicator of a malware infection. For example, firewall applications typically monitor network activity, which may include "deep" monitoring of the contents of the packets for suspicious data that may be characteristic of malware. In addition, many firewall applications maintain statistics regarding the volume of network activity that is occurring on a computer. A strong heuristic indicator that a malware is attempting to infect a computer, which may be derived from statistics maintained by the firewall application 304, exists when a significant increase in network activity is detected. By itself, an increase in network activity is not necessarily indicative of malware. Instead, there are legitimate reasons why a computer may be transmitting or receiving an increased volume of data (e.g., a user began downloading large multimedia files over the network). If this type of event was used by the firewall application 304 to positively identify a malware infection, a high number of "false positives," or instances when a malware was incorrectly identified, would occur.
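The distinction drawn above, a traffic spike is suspicious but not a positive identification, can be made concrete with a small classifier. The bandwidth figures and the spike factor are invented for illustration.

```python
# Sketch of the firewall heuristic: a large increase in network activity
# is labeled *suspicious* for aggregation, never flagged as malware
# outright, which is how false positives are avoided.

def classify_traffic(baseline_kbps: float, current_kbps: float,
                     spike_factor: float = 10.0) -> str:
    """Label a traffic sample: 'suspicious' on a large spike, else 'normal'."""
    if current_kbps >= spike_factor * baseline_kbps:
        return "suspicious"  # reported to the aggregator, not a verdict
    return "normal"


print(classify_traffic(baseline_kbps=50, current_kbps=900))  # suspicious
print(classify_traffic(baseline_kbps=50, current_kbps=120))  # normal
```

A user downloading large multimedia files would trip the same check, which is exactly why the output feeds an aggregator instead of triggering a malware response by itself.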
Other anti-malware services and event detection systems also observe heuristic indicators that a computer is either infected with malware or a malware is attempting to infect the computer. For example, a specific type of malware known as spyware needs to be active on a computer at all times to track the actions of a user. To be activated at computer start up, spyware will register with one or more "extensibility points" of an operating system, such as a Service Control Manager ("SCM") or registry key. Similar to the example provided above, registering a program at an extensibility point of an operating system is not itself a positive indicator that the program is malware. However, registering with an extensibility point is a "suspicious" event that may be indicative of malware. The present invention is directed at collecting and leveraging the knowledge provided by these types of suspicious events to provide proactive protection from malware.
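The extensibility-point example can be sketched the same way: the observation produces a suspicious-event report, not a verdict. The point names below are illustrative stand-ins for things like the Service Control Manager or a startup registry key, and the program name is hypothetical.

```python
# Sketch of treating extensibility-point registration as a suspicious
# (but not positive) indicator of malware.

EXTENSIBILITY_POINTS = {"service-control-manager", "run-registry-key"}


def observe_registration(program: str, target: str, report):
    """Report a suspicious event when a program registers at an extensibility point."""
    if target in EXTENSIBILITY_POINTS:
        report({"program": program,
                "event": "extensibility-point-registration",
                "target": target})


suspicious_events = []
observe_registration("tracker.exe", "run-registry-key", suspicious_events.append)
print(len(suspicious_events))  # one suspicious event reported, not a verdict
```

Legitimate services register at the same points, so the report only gains meaning once the aggregator weighs it alongside other suspicious events.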
As mentioned above, the computer 300 maintains an aggregation routine 310 that includes a data collector component 312, a data analyzer module 314, and a policy implementer 316. In general terms describing one embodiment of the present invention, the data collector component 312 obtains data from anti-malware services and event detection systems installed on the computer 300 (e.g., the antivirus software 302, the firewall application 304, the behavior blocker 306, the anti-spyware software 308, and the metrics system 309) regarding "suspicious" events. As described in more detail below with reference to FIGURE 4, the data collected may be merely an indicator from an anti-malware service or event detection system that a suspicious event occurred. Also, the data collector component 312 may obtain metadata from an anti-malware service or event detection system that describes attributes of a suspicious event. In either instance, the data collector component 312 serves as an interface to anti-malware services and event detection systems installed on the computer 300 for reporting and/or obtaining data regarding suspicious events.
As illustrated in FIGURE 3, the aggregation routine 310 also includes a data analyzer module 314, which determines whether the suspicious events reported to and/or collected by the data collector component 312 satisfy a predetermined threshold. As described in further detail below with reference to FIGURE 4, when the threshold is satisfied, an entity associated with the suspicious events may be identified as malware, and a restrictive security policy may be implemented, for example, by directing an anti-malware service (e.g., the anti-spyware software 308) to block the process from performing this type of activity.
The present invention may be implemented in a number of different contexts, of which the following are examples. Existing anti-malware services are able to identify events that are positive indicators of malware and suspicious events that may be characteristic of malware. If entities associated with suspicious events were "marked" as malware, then an excessive number of false positives, or instances when an entity was incorrectly identified as malware, would occur. Nonetheless, the knowledge that an entity is associated with events identified as being suspicious by either anti-malware services or event detection systems is helpful when proactively protecting a computer against malware. The present invention may be implemented in this type of existing infrastructure to aggregate the knowledge of different anti-malware services and event detection systems. More specifically, disparate types of anti-malware services (e.g., the antivirus software 302, the firewall application 304, the behavior blocker 306, and the anti-spyware software 308) and event detection systems (e.g., the metrics system 309) may be configured to report suspicious events to a software module that implements aspects of the present invention (e.g., the aggregation routine 310). If the number or type of suspicious events observed by the anti-malware services or event detection systems satisfies the threshold, then the aggregation routine 310 will "mark" an entity associated with the events as being malware.
Those skilled in the art and others will recognize that FIGURE 3 is a simplified example of one computer 300 that is capable of performing the functions implemented by the present invention. Actual embodiments of the computer 300 will have additional components not illustrated in FIGURE 3 or described in the accompanying text. Also, FIGURE 3 shows an exemplary component architecture for proactively protecting a computer 300 from malware, but other component architectures are possible.
Now with reference to FIGURE 4, an exemplary embodiment of the aggregation routine 310, illustrated in FIGURE 3, which determines whether a suspicious event identified by an anti-malware service or other event detection system is characteristic of malware, will be described.
As illustrated in FIGURE 4, the aggregation routine 310 begins at block 400, where the routine 310 remains idle until a suspicious event is observed by an anti-malware service or other event detection system. In accordance with one embodiment of the present invention that involves anti-malware services, logic in a service defines both positive indicators of a malware infection and suspicious events that may be characteristic of malware. If a positive indicator of a malware infection is identified, then software routines implemented by the aggregation routine 310 will not be executed. However, in instances when a suspicious event is identified, the event is reported and an analysis is performed to determine whether an entity associated with the suspicious event should be "marked" as malware. For example, those skilled in the art and others will recognize that most malware are encrypted to avoid being detected in transit and will be decrypted before execution. Similar to the examples provided above with reference to FIGURE 3, when an anti-malware service encounters an encrypted file, for example, this is not in itself a positive indicator that the file contains malware. However, encountering an encrypted file is a "suspicious" event that is reported to aspects of the present invention.
At block 402, the suspicious event identified at block 400 is reported to the aggregation routine 310. It should be well understood that the present invention may be implemented in a number of different embodiments, of which the following are only examples. In one embodiment, the aggregation routine 310 is implemented in an integrated software system created by a single software provider. For example, the antivirus software 302, the firewall application 304, the behavior blocker 306, the anti-spyware software 308, and the metrics system 309 illustrated in FIGURE 3 may be integrated together with the aggregation routine 310. In this instance, the anti-malware services 302, 304, 306, 308, and event detection system 309 may be configured to directly pass data that describes attributes of a suspicious event, at block 402, using methods that are generally known in the art. In an alternative embodiment of the present invention, the aggregation routine 310 maintains an Application Program Interface ("API") that allows third-party providers to report suspicious events. In this instance, anti-malware services or other event detection systems created by third parties may "plug in" to the aggregation routine 310 and report suspicious events by issuing one or more API calls. In accordance with yet another alternative embodiment, the aggregation routine 310 actively obtains data that describes suspicious events from one or more resources on a computer. For example, as described previously with reference to FIGURE 3, an event detection system (e.g., the metrics system 309) may observe and record different events that occur on a computer. Typically, event detection systems will not only provide logic for identifying events that occur on a computer but also maintain databases, event logs, and additional types of resources that are available to other software modules. In this instance, the aggregation routine 310 may obtain data that describes suspicious events from resources maintained by an event detection system.
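The third-party reporting embodiment can be sketched as a small registration-plus-report interface. The class, method names, and event contents below are hypothetical illustrations of such an API, not identifiers defined by the patent.

```python
# Sketch of a plug-in reporting API: third-party services register once,
# then report suspicious events to the aggregator through a single call.

class AggregationAPI:
    def __init__(self):
        self.providers = set()
        self.reports = []

    def register_provider(self, name: str):
        """Let a third-party service 'plug in' to the aggregator."""
        self.providers.add(name)

    def report_suspicious_event(self, provider: str, event: dict):
        """Accept a suspicious-event report from a registered provider."""
        if provider not in self.providers:
            raise ValueError("unregistered provider")
        self.reports.append((provider, event))


api = AggregationAPI()
api.register_provider("third-party-firewall")
api.report_suspicious_event("third-party-firewall",
                            {"event": "unusual-port-access", "port": 31337})
print(len(api.reports))
```

Rejecting unregistered providers is one plausible design choice for such an API; it keeps the aggregator's knowledge base limited to sources it knows about.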
As further illustrated in FIGURE 4, the aggregation routine 310, at block 404, performs an analysis on the data that describes suspicious events received from the anti-malware services or collected from other sources such as event detection systems. The analysis performed is designed to determine whether suspicious events reported to or obtained by the aggregation routine 310 satisfy a predetermined threshold that indicates malware is either attempting to infect the computer or has already infected the computer. For example, suppose a malware author releases a new malware that exploits a previously unknown vulnerability. The malware (1) employs a network port that is infrequently used to access a computer, (2) is contained in an encrypted file when saved on a storage medium, such as a disk, (3) attempts to access an operating system extensibility point, and (4) causes a large quantity of data to be transmitted to other network-accessible computers with a corresponding increase in CPU usage. As mentioned previously, in one embodiment of the present invention, the data collected from the anti-malware services or other event detection systems may be merely an indicator that a suspicious event was identified. In this embodiment, an anti-malware service, such as the firewall application 304, may be configured to report that a suspicious event occurred when an infrequently used network port is accessed. Moreover, since the malware causes a large quantity of data to be transmitted to other network-accessible computers, the firewall application 304 may determine that an increase in network activity is also a suspicious event. Then, another anti-malware service, such as the anti-spyware software 308, may report the occurrence of a suspicious event when an extensibility point of an operating system is accessed. Merely receiving three reports of suspicious events may not satisfy the predetermined threshold. However, the metrics system 309 may then record, in an event log, that CPU usage has increased dramatically. In this instance, the data collector component 312 may be configured to monitor the event log and determine that a suspicious event occurred as a result of the increase in CPU usage.
The predetermined threshold applied by the aggregation routine 310 may be satisfied, in this instance, when four (4) suspicious events occur in a specific time frame. However, those skilled in the art and others will recognize that the example provided above, in which four (4) suspicious events are enough to satisfy the predetermined threshold, is merely an example used for illustrative purposes and should not be construed as limiting on the present invention.
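The count-in-a-time-frame test described here can be sketched as a sliding window. The four-event count and the sixty-second window below are assumptions taken from the illustrative example, not values fixed by the specification.

```python
# Minimal sketch of the illustrative threshold: the predetermined
# threshold is satisfied when a given number of suspicious events
# (here four) occur within a fixed time frame (here 60 seconds).
from collections import deque

class CountThreshold:
    def __init__(self, count=4, window_seconds=60.0):
        self.count = count
        self.window = window_seconds
        self.timestamps = deque()

    def record(self, timestamp):
        """Record an event; return True if the threshold is now satisfied."""
        self.timestamps.append(timestamp)
        # Drop events that have fallen out of the time frame.
        while self.timestamps and timestamp - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        return len(self.timestamps) >= self.count

t = CountThreshold(count=4, window_seconds=60.0)
results = [t.record(s) for s in (0.0, 5.0, 10.0, 15.0)]
print(results)  # [False, False, False, True]
```

A fifth event arriving long after the window has elapsed would not satisfy the threshold, since the earlier events are discarded.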
In another embodiment of the present invention, the data collected by the aggregation routine 310 includes metadata that assists in determining whether an entity associated with a suspicious event is malware. Those skilled in the art and others will recognize that some suspicious events are more likely to be associated with malware than other suspicious events. In one embodiment of the present invention, the anti-malware services on a computer are configured to compute a value that represents the probability that one or more suspicious events is associated with malware. In the example provided above, an increase in network activity may be assigned a high value by the firewall application 304, which indicates that a high probability exists that malware is attempting to infect the computer, infect other computers, attack other computers, or leak information. Conversely, saving an encrypted file on a storage medium is less likely to be associated with malware and would therefore be assigned a lower value. In accordance with one embodiment of the present invention, metadata is reported to the aggregation routine 310 that represents the probability that a suspicious event is characteristic of malware. In this instance, the predetermined threshold may be satisfied, for example, when one or more suspicious events are reported with metadata that indicates a high probability that a malware attack is occurring.
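The weighted-value embodiment reduces to summing per-event probabilities against a limit, as claim 6 also recites. The particular weights and the 1.0 limit below are illustrative assumptions.

```python
# Sketch of the weighted-value embodiment: each anti-malware service
# attaches a probability-like weight to the events it reports, and the
# predetermined threshold is satisfied when the summed weight crosses
# a limit.
def threshold_satisfied(weighted_events, limit=1.0):
    """weighted_events: iterable of (description, weight) pairs, with
    weight in [0, 1] quantifying how indicative of malware the event is."""
    return sum(weight for _, weight in weighted_events) >= limit

events = [
    ("increase in network activity", 0.6),   # high value from the firewall
    ("encrypted file saved to disk", 0.2),   # weak signal on its own
    ("extensibility point accessed", 0.4),
]
print(threshold_satisfied(events))  # True (0.6 + 0.2 + 0.4 >= 1.0)
```

Under this scheme a single high-probability report can satisfy the threshold, while several low-probability reports may not.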
It should be well understood that suspicious events reported by anti-malware services may be associated with different entities. For example, a user may download an encrypted file from a network. As mentioned previously, since the file is encrypted, an anti-malware service may report the downloading of the file to the aggregation routine 310 as a suspicious event. Also, the anti-malware service may associate metadata with the file that represents the probability that the file is malware. Moreover, in addition to files, the aggregation routine 310 may "mark" an entire computer, a process, or an activity as being associated with malware.
In yet another embodiment of the present invention, metadata reported to the aggregation routine 310 by an anti-malware service may be used by other anti-malware services to characterize an entity. For example, in the example provided above, the firewall application 304 reports to the aggregation routine 310 that an encrypted file was downloaded from the network. In this instance, metadata may be associated with the file that indicates the reason the file was "marked" as suspicious by the firewall application 304 (e.g., the file is encrypted). If the file is later associated with an attempt to access an extensibility point of an operating system, for example, the behavior blocker 306 may issue a query and obtain metadata associated with the file. In this instance, the behavior blocker 306 may use the metadata to more accurately characterize the file. For example, experience in analyzing malware may indicate that the combination of both being encrypted and accessing an operating system extensibility point may, in combination, be a highly suspicious event. As a result, the behavior blocker 306 may then positively identify the file as being infected with malware.
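The metadata-sharing flow just described can be sketched as a shared store that one service writes to and another queries. The class and function names are illustrative only; the "encrypted plus extensibility point" rule follows the example in the text.

```python
# Sketch of metadata sharing between services: the firewall records why
# a file was marked suspicious; a later service (the behavior blocker in
# the example) queries that metadata and combines it with its own
# observation to characterize the file.
class MetadataStore:
    def __init__(self):
        self._reasons = {}   # entity -> set of reasons it was marked

    def mark(self, entity, reason):
        self._reasons.setdefault(entity, set()).add(reason)

    def reasons(self, entity):
        return self._reasons.get(entity, set())

def behavior_blocker_check(store, entity):
    """Record a new observation, then combine it with earlier metadata:
    an encrypted file that also touches an OS extensibility point is
    treated as highly suspicious."""
    store.mark(entity, "accesses extensibility point")
    return {"encrypted", "accesses extensibility point"} <= store.reasons(entity)

store = MetadataStore()
store.mark("download.bin", "encrypted")               # firewall's earlier report
print(behavior_blocker_check(store, "download.bin"))  # True
```

A file marked for only one of the two reasons would not be flagged by this combined rule.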
At decision block 406, the aggregation routine 310 determines whether the suspicious event(s) analyzed at block 404 satisfy the predetermined threshold. If the predetermined threshold was satisfied, an entity (e.g., a computer, a file, a process, etc.) is "marked" as being associated with malware. In this instance, the aggregation routine 310 proceeds to block 408, described below. Conversely, if the predetermined threshold was not satisfied, the aggregation routine 310 proceeds back to block 400, and blocks 400 through 406 repeat until the threshold is satisfied.
As illustrated in FIGURE 4, at decision block 408, the aggregation routine 310 determines whether any registered anti-malware services are capable of removing the malware from the computer. As mentioned previously, the aggregation routine 310 allows anti-malware services to register and create a profile that identifies the types of malware the service is capable of removing from a computer. If block 410 is reached, a malware may have infected a computer and a registered anti-malware service may be capable of removing the malware from the computer. In this instance, metadata collected by the aggregation routine 310 may be used to identify the malware and an anti-malware service that is capable of removing the malware from the computer. If an appropriate anti-malware service is identified, the aggregation routine 310 causes the anti-malware service to remove the malware, at block 410, using methods generally known in the art. Then the aggregation routine 310 proceeds to block 412. Conversely, if the malware is only attempting to infect the computer or an anti-malware service is not able to remove the malware from the computer, the aggregation routine 310 skips block 410 and proceeds directly to block 412.
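The registration model at blocks 408-410 amounts to a lookup from identified malware to a service whose profile covers it. All names below, including the malware identifiers, are hypothetical placeholders.

```python
# Sketch of the registration model: services register profiles of the
# malware they can remove; the aggregation routine looks for a match
# before dispatching removal, otherwise it falls through to the
# restrictive security policy.
class RemovalRegistry:
    def __init__(self):
        self._profiles = {}   # service name -> set of removable malware ids

    def register(self, service, removable):
        """Called when an anti-malware service registers its profile."""
        self._profiles[service] = set(removable)

    def find_remover(self, malware_id):
        """Return a registered service able to remove malware_id, or None."""
        for service, removable in self._profiles.items():
            if malware_id in removable:
                return service
        return None

registry = RemovalRegistry()
registry.register("antivirus", {"worm.alpha", "trojan.beta"})
print(registry.find_remover("worm.alpha"))     # antivirus
print(registry.find_remover("rootkit.gamma"))  # None -> skip block 410
```

When `find_remover` returns None, the routine proceeds directly to block 412, as the text describes.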
At block 412, the aggregation routine 310 implements a restrictive security policy designed to prevent the spread of or infection by the malware. If block 414 is reached, a malware was identified and the computer may or may not still be infected with the malware. In instances when the computer is infected, a general restrictive security policy will typically be implemented that is designed to prevent the spread of the malware. For example, implementing the general security policy will typically include applying multiple restrictions on resources such as, but not limited to: restricting network transmissions from the computer; blocking network traffic on specific communication ports and addresses; blocking communications to and/or from certain network-related applications, such as e-mail or Web browser applications; terminating certain applications; and blocking access to particular hardware and software components on a computer. In other instances, the aggregation routine 310 may have removed the malware from a computer so that it is no longer infected. Typically, in this instance, a less restrictive security policy will be implemented that is designed to prevent the computer from being re-infected with the malware. Then the aggregation routine 310 proceeds to block 414, where it terminates.
It should be well understood that the restrictive security policy implemented at block 412 may be easily disengaged if a determination is made that an entity is not malware. For example, a system administrator or user may determine that a file identified as containing malware is, in fact, benevolent. In this instance, the restrictive security policy may be disengaged by a command generated from the user or system administrator, or automatically as a result of future learning.
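The policy behavior described above, applying a bundle of restrictions and later lifting them, can be sketched as follows. The restriction strings follow the examples in the text; the class structure itself is an assumption.

```python
# Sketch of the general restrictive security policy and its
# disengagement: several resource restrictions are applied together and
# can be lifted if the entity is later judged benign.
GENERAL_POLICY = [
    "restrict outbound network transmissions",
    "block traffic on specific ports and addresses",
    "block e-mail and Web browser communications",
    "terminate certain applications",
    "block access to particular hardware and software components",
]

class PolicyImplementer:
    def __init__(self):
        self.active = []

    def implement(self, restrictions):
        """Apply a restrictive security policy (block 412)."""
        self.active = list(restrictions)

    def disengage(self):
        """Lift the policy, e.g. after an administrator marks the file benign."""
        self.active = []

impl = PolicyImplementer()
impl.implement(GENERAL_POLICY)
print(len(impl.active))  # 5
impl.disengage()
print(impl.active)       # []
```

A less restrictive policy, as described for the already-cleaned case, would simply pass a smaller restriction list to `implement`.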
While the preferred embodiment of the invention has been illustrated and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention.

The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. A computer-implemented method of collecting local machine events and aggregating the knowledge base of anti-malware services and other event detection systems to proactively protect a computer from malware, the method comprising:
(a) using the anti-malware services and other event detection systems to observe suspicious events that are potentially indicative of malware;
(b) determining whether the suspicious events satisfy a predetermined threshold; and
(c) if the suspicious events satisfy the predetermined threshold, applying a restrictive security policy to the computer.
2. The method as recited in Claim 1, wherein using the anti-malware services and other event detection systems to observe suspicious events that are potentially indicative of malware includes receiving metadata that describes a suspicious event.
3. The method as recited in Claim 2, wherein the metadata that describes the suspicious event is accessible to the anti-malware services for characterizing an entity.
4. The method as recited in Claim 2, wherein the metadata that is received that describes the suspicious event includes:
(a) a weighted value generated by an anti-malware service that quantifies the probability that the suspicious event is indicative of malware; and
(b) the reason the event was identified as being suspicious.
5. The method as recited in Claim 1, wherein determining if the suspicious events satisfy the threshold includes determining if the number of events for a given time frame is higher than a given value.
6. The method as recited in Claim 1, wherein determining if the events satisfy a threshold indicative of malware includes:
(a) generating a weighted value for each suspicious event that quantifies the probability that the suspicious event is indicative of malware; and
(b) determining whether the summation of the weighted values for the suspicious events is higher than a given value.
7. The method as recited in Claim 1, wherein the restrictive security policy prevents the entity associated with the observed suspicious events from performing actions and accessing resources on the computer in a way that is contrary to the policy.
8. The method as recited in Claim 1, wherein applying a restrictive security policy to the computer includes:
(a) determining whether the entity is capable of being removed from the computer;
(b) if the entity is capable of being removed from the computer, causing an anti-malware service to remove the entity from the computer; and
(c) conversely, if the entity is not capable of being removed, applying a general restrictive security policy designed to prevent the spread of malware.
9. The method as recited in Claim 8, wherein causing an anti-malware service to remove the entity from the computer includes applying a restrictive security policy designed to prevent the malware from subsequently infecting the computer.
10. The method as recited in Claim 8, wherein causing an anti-malware service to remove the entity includes allowing the anti-malware services to register and identify the types of malware that the anti-malware service is configured to remove from a computer.
11. The method as recited in Claim 8, wherein implementing a restrictive security policy includes restricting the ability of the computer to access data on a network.
12. The method as recited in Claim 11, wherein restricting the ability of the computer to access data on the network includes:
(a) blocking network traffic on specific communication ports;
(b) blocking communications from certain network-based applications;
(c) blocking access to hardware and software components on the computer; and
(d) blocking network traffic on specific communication ports and addresses.
13. A software system that proactively protects a computer from malware, the software system comprising:
(a) an aggregation routine for determining whether an entity associated with the computer is malware, wherein the aggregation routine includes:
(i) a data collector component operative to collect data that identifies suspicious events potentially indicative of malware;
(ii) a data analyzer module that analyzes data collected by the data collector component to determine whether a threshold was satisfied; and
(iii) a policy implementer operative to implement a restrictive security policy when the data analyzer module determines that the threshold was satisfied.
14. The software system as recited in Claim 13, further comprising an anti-malware service for identifying and reporting suspicious events to the data collector component that are potentially indicative of malware.
15. The software system as recited in Claim 14, wherein the anti-malware service is further configured to identify the entity that is associated with the suspicious events.
16. The software system as recited in Claim 13, further comprising an event collection system for identifying events that occur on a computer and recording the events in a data store that is accessible to the aggregation routine.
17. The software system as recited in Claim 13, wherein the aggregation routine is further configured to:
(a) allow the anti-malware service to register to identify malware that the service is capable of removing from the computer; and
(b) determine from the registration data whether the anti-malware service is capable of removing the malware from the computer when the data analyzer module determines that the threshold has been satisfied.
18. The software system as recited in Claim 13, wherein the data collector component is configured to receive and store metadata that:
(a) describes the probability that a suspicious event is characteristic of malware; and
(b) identifies the reason that an event was marked as suspicious by the anti-malware service.
19. A computer-readable medium bearing computer-executable instructions that, when executed on a computer that includes an anti-malware service, causes the computer to:
(a) use the anti-malware service to observe suspicious events that are potentially indicative of malware;
(b) receive data from the anti-malware service that describes the suspicious events;
(c) determine whether the suspicious events observed are indicative of malware; and
(d) if the suspicious events are indicative of a malware, implement a restrictive security policy that restricts an entity associated with suspicious events from performing actions on the computer.
20. The computer-readable medium as recited in Claim 19, wherein the computer is further configured to:
(a) determine whether a malware infection exists; and
(b) if a malware infection exists, prevent the computer from infecting other computers communicatively connected to the computer.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 535-del-2006-Form-18-(06-02-2009).pdf 2009-02-06
2 535-del-2006-Correspondence-others-(06-02-2009).pdf 2009-02-06
3 535-DEL-2006-GPA-(21-06-2010).pdf 2010-06-21
4 535-DEL-2006-Correspondence-Others-(21-06-2010).pdf 2010-06-21
5 535-DEL-2006-Form-1-(02-12-2010).pdf 2010-12-02
6 535-DEL-2006-Correspondence-Others-(02-12-2010).pdf 2010-12-02
7 535-del-2006-gpa.pdf 2011-08-21
8 535-del-2006-form-5.pdf 2011-08-21
9 535-del-2006-form-3.pdf 2011-08-21
10 535-del-2006-form-2.pdf 2011-08-21
11 535-del-2006-form-1.pdf 2011-08-21
12 535-del-2006-drawings.pdf 2011-08-21
13 535-del-2006-description (complete).pdf 2011-08-21
14 535-del-2006-correspondence-others.pdf 2011-08-21
15 535-del-2006-claims.pdf 2011-08-21
16 535-del-2006-abstract.pdf 2011-08-21
17 MTL-GPOA - PRS.pdf ONLINE 2015-03-05
18 MS to MTL Assignment.pdf ONLINE 2015-03-05
19 FORM-6-801-900(PRS).5.pdf ONLINE 2015-03-05
20 MTL-GPOA - PRS.pdf 2015-03-13
21 MS to MTL Assignment.pdf 2015-03-13
22 FORM-6-801-900(PRS).5.pdf 2015-03-13
23 new covering letter.pdf 2015-06-04
24 DEtails under section 8.pdf 2015-06-04
25 DEtails under section 8.pdf_3720.pdf 2015-06-24
26 new covering letter.pdf_3719.pdf 2015-06-24
27 new covering letter.pdf_3721.pdf 2015-06-24
28 Petition Under Rule 137 [20-08-2015(online)].pdf 2015-08-20
29 Other Document [20-08-2015(online)].pdf 2015-08-20
30 OTHERS [01-02-2016(online)].pdf 2016-02-01
31 Examination Report Reply Recieved [01-02-2016(online)].pdf 2016-02-01
32 Description(Complete) [01-02-2016(online)].pdf 2016-02-01
33 Correspondence [01-02-2016(online)].pdf 2016-02-01
34 Claims [01-02-2016(online)].pdf 2016-02-01
35 Abstract [01-02-2016(online)].pdf 2016-02-01
36 535-DEL-2006_EXAMREPORT.pdf 2016-06-30
37 535-DEL-2006-HearingNoticeLetter.pdf 2019-05-24
38 535-DEL-2006-Correspondence to notify the Controller (Mandatory) [07-06-2019(online)].pdf 2019-06-07
39 535-DEL-2006-FORM-26 [21-06-2019(online)].pdf 2019-06-21
40 535-DEL-2006-Written submissions and relevant documents (MANDATORY) [09-07-2019(online)].pdf 2019-07-09
41 535-DEL-2006-Power of Attorney-020719.pdf 2019-07-09
42 535-DEL-2006-Correspondence-020719.pdf 2019-07-09
43 535-DEL-2006-Power of Attorney-110719.pdf 2019-07-19
44 535-DEL-2006-Correspondence-110719.pdf 2019-07-19
45 535-DEL-2006-Response to office action (Mandatory) [31-07-2019(online)].pdf 2019-07-31
46 535-DEL-2006-PatentCertificate26-08-2019.pdf 2019-08-26
47 535-DEL-2006-IntimationOfGrant26-08-2019.pdf 2019-08-26
48 535-DEL-2006-RELEVANT DOCUMENTS [27-03-2020(online)].pdf 2020-03-27
49 535-DEL-2006-RELEVANT DOCUMENTS [22-09-2021(online)].pdf 2021-09-22
50 535-DEL-2006-RELEVANT DOCUMENTS [26-09-2022(online)].pdf 2022-09-26
51 535-DEL-2006-RELEVANT DOCUMENTS [15-09-2023(online)].pdf 2023-09-15

ERegister / Renewals

3rd: 28 Oct 2019 (28/02/2008 to 28/02/2009)
4th: 28 Oct 2019 (28/02/2009 to 28/02/2010)
5th: 28 Oct 2019 (28/02/2010 to 28/02/2011)
6th: 28 Oct 2019 (28/02/2011 to 28/02/2012)
7th: 28 Oct 2019 (28/02/2012 to 28/02/2013)
8th: 28 Oct 2019 (28/02/2013 to 28/02/2014)
9th: 28 Oct 2019 (28/02/2014 to 28/02/2015)
10th: 28 Oct 2019 (28/02/2015 to 28/02/2016)
11th: 28 Oct 2019 (28/02/2016 to 28/02/2017)
12th: 28 Oct 2019 (28/02/2017 to 28/02/2018)
13th: 28 Oct 2019 (28/02/2018 to 28/02/2019)
14th: 28 Oct 2019 (28/02/2019 to 28/02/2020)
15th: 28 Oct 2019 (28/02/2020 to 28/02/2021)
16th: 13 Jan 2021 (28/02/2021 to 28/02/2022)
17th: 11 Jan 2022 (28/02/2022 to 28/02/2023)