Abstract: A system and method for assessment in service assurance workflows for an organization, in terms of ranks and scores, said system comprising: at least two enterprise modules which completely define the organization, each of said modules being defined by a weighted score and a rank; at least one questionnaire template corresponding to each of said enterprise modules adapted to propose questions relating to functioning of each of said enterprise modules; scoring means adapted to provide an individual aggregated score for each of said enterprise modules; ranking means adapted to rank each of said enterprise modules in accordance with desired organization function (output); weighting means adapted to allocate weights to each of said enterprise modules in accordance with desired organization function (output); first computation means adapted to compute an unweighted maturity level score in relation to said rank and said individual aggregated score; second computation means adapted to compute an assigned category weighting score in relation to said rank and said individual aggregated score; third computation means adapted to compute a weighted maturity value in relation to said unweighted maturity level score and said assigned category weighting score; database means adapted to store pre-defined quantified weighted maturity values in correlation with their qualitative descriptions relating to organization function (output); and look-up means adapted to look up said computed weighted maturity value with respect to values from said database means in order to provide a correlated qualitative description relating to organization functioning (output) assessment.
FORM-2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
PROVISIONAL SPECIFICATION
(See section 10; rule 13)
SERVICE ASSURANCE AND ASSESSMENT SYSTEM
TATA CONSULTANCY SERVICES LTD.,
an Indian Company
of Nirmal Building, 9th floor, Nariman Point, Mumbai 400 021,
Maharashtra, India
THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION
Field of the Invention:
This invention relates to artificial intelligence, and more particularly to systems and tools for analysis and decision making.
Background of the Invention:
Work processes in an enterprise typically assume an assembly-line character: a structure is established, and various structures, arranged in a cascaded manner, cater to various processes and demands while aiming at synchronous behaviour for effortless working.
Interdependency and coherency between the various structures thus yield a cumulative end product that derives its features from the varying embodiments and mechanisms. While each embodiment may function according to its own pre-defined settings, it is necessary to check whether those pre-defined options are met. It is further important to ensure that the various functions of the various embodiments can be customized so that a range of settings may be achieved.
Finally, a rank or score of the processes, individually or as a whole, may be crucial as a quantitative parameter for assessment and decision making in relation to tweaking the settings of the various embodiments.
Objects of the Invention:
An object of the invention is to provide an automated system for analysis and decision making in relation to a system or an enterprise.
Another object of the invention is to provide a ranking or score of the functioning of the various means of a system or an enterprise.
Yet another object of the invention is to provide customized options for tweaking the functioning of the embodiments of the system.
Brief Description of the Accompanying Drawings:
The invention will now be disclosed in relation to the accompanying
drawings, in which:
Figure 1 illustrates a schematic of the system.
Description of the Invention:
According to this invention, there is provided a system for analysis and decision making (100), particularly shown in Figure 1, said system being automated, and adapted to receive inputs from a plurality of enterprise modules in a dynamic manner. Said system is further adapted to provide a customized rating for each of the modules, and thus, also provide an assimilated score for analysis and decision making.
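By way of a non-limiting illustration only, the following minimal Python sketch shows one possible representation of an enterprise module and its score receiving means, and how an individual aggregated score per module could be derived; the class names, the 0-to-5 response scale and the simple averaging rule are assumptions of this sketch and are not prescribed by this specification.

```python
# Illustrative sketch only: names, the 0-5 response scale and the averaging
# rule are assumptions, not features mandated by the specification.
from dataclasses import dataclass, field


@dataclass
class ScoreReceivingMeans:
    """One questionnaire item and the response score it receives."""
    question: str
    response: int = 0  # assumed scale: 0 (not supported) to 5 (fully supported)


@dataclass
class EnterpriseModule:
    """An enterprise module rated through its questionnaire template."""
    name: str
    items: list = field(default_factory=list)

    def aggregated_score(self) -> float:
        """Individual aggregated score: mean of the item responses."""
        if not self.items:
            return 0.0
        return sum(item.response for item in self.items) / len(self.items)


# Example usage with two hypothetical questionnaire items.
functionality = EnterpriseModule(
    name="Functionality (FM)",
    items=[
        ScoreReceivingMeans("Does the current process support customer reported problems?", 4),
        ScoreReceivingMeans("Does the current process support service monitoring?", 3),
    ],
)
print(functionality.aggregated_score())  # 3.5
```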
In accordance with an embodiment of this invention, said system includes a plurality of means, said means comprising:
- functionality means (FM) adapted to rate various aspects of various functionalities of an enterprise to provide a functionality score;
- simplification verification means (SVM) adapted to rate various aspects relating to ease-of-use of various functionalities of an enterprise to provide an ease-of-use score;
- architecture means (AM) adapted to rate various architectural components of an enterprise to provide an architectural score;
- cost derivation means (CDM) adapted to rate costs relating to establishment of various functionalities as well as running of various functionalities to provide an expense score;
- vendor support means (VSM) adapted to rate support mechanisms for various functionalities of an enterprise to provide a vendor support score;
- product roadmap means (PRM) adapted to rate usage and history of various embodiments of an enterprise to provide a product roadmap score.
In accordance with another embodiment of this invention, there is provided an aggregation means (AGGM) adapted to provide an aggregate output score, having its inputs from said functionality score, said ease-of-use score, said architectural score, said expense score, said vendor support score, and said product roadmap score.
In accordance with yet another embodiment of this invention, there is provided a range setting means (RSM) adapted to determine the range for which each of the aforementioned scores should contribute to achieve a final aggregate score. According to an exemplary embodiment, said functionality score may contribute towards 20% of the aggregate score, said ease-of-use score may contribute towards 12% of the aggregate score, said architectural score may contribute towards 20% of the aggregate score, said expense score may contribute towards 20% of the aggregate score, said vendor
support score may contribute towards 13% of the aggregate score, and said product roadmap score may contribute towards 15% of the aggregate score.
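By way of a non-limiting illustration, the sketch below applies the exemplary contribution percentages recited above to six category scores to obtain the final aggregate score; the dictionary keys, the function name and the assumed 0-to-5 score scale are assumptions of this sketch, while the percentages are those given in the exemplary embodiment.

```python
# Minimal sketch of the range setting means (RSM) and aggregation means (AGGM),
# using the exemplary contribution percentages stated above (they sum to 100%).
# The 0-5 score scale and the identifiers are illustrative assumptions.
CATEGORY_WEIGHTS = {
    "functionality": 0.20,
    "ease_of_use": 0.12,
    "architecture": 0.20,
    "expense": 0.20,
    "vendor_support": 0.13,
    "product_roadmap": 0.15,
}


def aggregate_score(category_scores: dict) -> float:
    """Combine the six category scores into a single weighted aggregate."""
    return sum(CATEGORY_WEIGHTS[name] * score
               for name, score in category_scores.items())


# Example: category scores on an assumed 0-5 scale.
scores = {
    "functionality": 3.5,
    "ease_of_use": 4.0,
    "architecture": 3.0,
    "expense": 2.5,
    "vendor_support": 4.0,
    "product_roadmap": 3.0,
}
print(round(aggregate_score(scores), 2))  # 3.25
```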
According to an embodiment of this invention, said functionality means (FM) includes:
- customer assurance database means for a process (CDP), adapted
to host a plurality of score receiving means, typically in relation to
the following:
1. Does the current process support customer reported problem?
2. Does the current process support customer reported problem on QoS/SLA?
3. Does the current process support service monitoring?
- service assurance database means for a process (SDP), adapted to
host a plurality of score receiving means, typically in relation to the
following:
1. Does the current process support ticket management process for Service related problems?
2. Does the current process support SLA monitoring?
3. Does the current process support service status monitoring & intelligent reporting?
4. Does the current process support escalation when a soft threshold is crossed or the agreed QoS is violated?
- network assurance database means for a process (NDP), adapted
to host a plurality of score receiving means, typically in relation to
the following:
1. Does the current process support resource fault isolation?
2. Does the current process support resource fault correlation?
3. Does the current process support resource re-configuration?
4. Does the current process support pro-active monitoring and system monitoring?
5. Does the current process support ticket management process for resource related problems?
6. Does the current process support resource trouble reporting process [to fulfillment or others]?
7. Does the current process support resource trouble to service impact mapping process?
8. Does the current process support pro-active maintenance?
9. Does the current process support outages and bulk recoveries?
10. Does the current process support field activities?
- customer assurance database means for a system (CDS), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Is the system capable of receiving and acknowledging the problem report?
2. Is the system capable of measuring the actual quality of service for a customer?
3. Is the system capable of reporting customer trouble and ticket handling?
- service assurance database means for a system (SDS), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Does the system provide service dashboards/reports based on customer segments, services and device types?
2. Option to monitor End-to-End service levels, per Customer, per service, per technology domain, per geographical region, etc.
3. Does the system provide option to model SLA templates for different customer segments (e.g. Bronze, Silver, Gold, with corresponding parameter)?
4. Does the system support service modeling?
5. Does the system support service QoS definitions?
6. Does the system generate alerts in case of SLA Violation?
7. Support to define SLA Templates?
8. Is the system capable of providing inputs to the Billing/OSS application to calculate penalties for SLA violations?
9. Does the system maintain service inventory?
10. Can the Service Inventory store service-to-resource association details?
11. Is the Service Inventory DB updated regularly with the current service details?
12. Is the Service Inventory DB integrated with the Resource Inventory DB?
13. Does the system extract the service-related KPIs from various management platforms?
14. Capability to automatically open a trouble ticket when a service impact is identified for a customer proactively?
- network assurance database means for a system (NDS), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Can the system initiate performance measurements in the network elements, collect and format network performance data?
2. Is the system capable of KPI reporting and monitoring for specific service performance?
3. Is the system capable of performing automatic discovery of networks, and displaying and building topology maps?
4. Does the system maintain a model of the physical and logical network connectivity, including active, passive and logical connectivity, as well as service modeling?
5. Does the system support GIS under discovery / topology module?
6. Is the system capable of handling all events and alarms from all network elements?
7. Does the system support correlation and root-cause analysis?
8. Is it possible to send queries from fault management to the network inventory application to query the network inventory for a number of attributes, such as node attributes, service attributes, etc.?
9. Does the system support I/O logic command handling to perform various tasks such as installation, configuration, functional change, administration, creation, deletion etc of the Domain Data at EMS level?
10. Does the system support auto recognition of the configuration of any network element?
11. Possibility to create user-defined reports?
12. Capability to automatically open a trouble ticket when a network fault occurs?
13. Option to support a quick test on all interfaces on the device?
14. Option to support a quick test on the interface which is down?
15. Option to execute diagnostics against resources?
16. Ability to store the data for historic reporting?
17. Support for ready-to-go adaptors to retrieve or receive data from EMS/NE?
18. Does the system have open standard southbound & northbound interfaces?
- general assurance database means (GDM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. How many NOCs are running and how are they distributed?
2. What networks are you managing? If more than one option applies, please add your comments in the next column.
3. What other infrastructure are you managing?
4. Is there a global NOC and how does its role differ?
5. Do you distinguish between telecom and IT monitoring?
6. To what extent do the NOCs also monitor the IT equipment and DCN required for the NOC?
7. Is more than one instance running (at which locations?) for each application? If yes, please provide comments.
According to another embodiment of this invention, said simplification verification means (SVM) includes:
- database means (SVDM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Do all the systems support web-based graphical user interface?
2. Is online help available at the screen level and field level?
3. Are the screen designs as per GUI design standards?
4. Will the software support multiple languages?
5. Support for self care [For enterprise customers]?
6. Can the users perform health checks of the application?
7. Can the users save the reports in HTML, ASCII, CSV and PDF format?
8. Does the user Interface support TT life cycle management at Customer, Service and Network level?
9. Does the graphical display have end to end service record?
10. Support to define and view KPI/KQIs for all services?
11. Can the color of the Service/NE status change based on the health of the NE or Service?
12. Can the performance reports be scheduled daily, weekly, monthly and yearly?
13. Is a user interface available to define and view a complete topology of the network?
14. Can the graphical display show all the latest alarm status and data (automatically updated) in real time?
15. Is a user interface available to support creation of filter and suppression rules?
16. Does the user interface support alarm life cycle management (from active to clear state)?
17. Does the user interface support sending test requests from the alarm monitoring window?
18. Is it possible to import one or more layers of geographic maps from ESRI or MapInfo (industry-wide GIS standards for data)?
19. Can users execute one custom action for multiple alarms or one action per alarm?
20. Do the applications have a System Administration GUI?
21. Do the applications support logging levels that can be tuned from a GUI?
22. Ease of configuration through GUI wizards?
23. Easily accessible calendar view for configuration and report generation?
24. Support for multiple graphing models (line, histogram, pie, etc.)?
According to an embodiment of this invention, said architecture means (AM) includes:
- technology platform evaluation database means (TDM), adapted to
host a plurality of score receiving means, typically in relation to the
following:
1. What is the % of applications in n-tier architecture?
2. How are the applications coupled?
3. What % of the applications are stable?
4. Are the hardware configurations optimized?
5. Type of Database used for real-time fault monitoring?
6. Support for summarizing and archiving of DB?
7. Do the current applications support multiple platforms?
8. Do the current applications support multiple databases?
9. Do the current applications support database partitioning based on geography, business line or any other regulations?
10. Do the applications support parallel processing?
11. Disaster recovery and backup support?
12. Is the information available in log files for successful, error and failed transactions and system crashes?
13. Support for single sign-on?
14. Support for standard desktop configuration for application users?
15. Do the applications support multiple instances on the same desktop?
16. Are debugging tools available, or does the GUI support debugging?
17. Do you observe optimized performance from the time it takes to launch and log into the application on the preferred desktop?
18. Do the applications support automation of testing by the use of scripts?
19. Is the application tested for memory leaks and failure under average and peak load?
20. Are the transactions recovered or rolled back during failure?
21. Over an extended period of time, are CPU utilization, memory consumption and disk consumption within limits?
22. Do the applications support storing of raw alarms?
23. Does the system support unattended automatic shutdown and restart of core processes in case of component failure?
24. Capability to auto-restart after application failure or power-up?
25. Can the application maintain data integrity in case of failure of a single component?
26.Does the system design support automatic rerouting and reconnection? For example, if a server supporting a client application fails, the client shall be able to reconnect and reroute through an alternative path.
27.Can the application generate error message data for every error condition that can occur within the application?
28.Do the applications provide utilities to cleanup system files?
- backup and recovery framework evaluation database means
(BRM), adapted to host a plurality of score receiving means,
typically in relation to the following:
1. Can a hot backup of the management database be performed without stopping or restarting the production system?
2. If so, how often?
3. Support for Backup and recovery of DB?
4. Is the disaster recovery plan for a NOC in place?
5. High availability through a second system?
6. Are data consistency checks and cleansing performed on a regular basis to prevent corruption?
- security evaluation database means (SEM), adapted to host a
plurality of score receiving means, typically in relation to the
following:
1. What % of the applications have standard security components and a framework for security management?
2. What % of the applications have an option to provide access to users on various sub-modules of the system?
3. What % of the systems have the ability to perform an audit on the actions undertaken by the user?
4. What % of the systems have the ability to maintain a clear audit trail for later analysis?
5. What % of the systems have the capability to integrate with 3rd-party security systems (i.e., the capability to integrate using methods like LDAP, user database control through external systems, or user security and rights APIs)?
6. Does the application support operator authentication, command, menu restriction and operator privileges?
7. Do the applications have the facility of restricting the use of certain commands or procedures to certain passwords and terminals?
8. Are all servers in the NOC protected by a firewall and IDS (Intrusion Detection System)?
- performance evaluation database means (PDM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Does the server load capacity exceed 20%?
2. Does the peak server load capacity exceed 40%?
3. Can the present sizing take care of 100% growth for the Y0+1 year, where Y0 is the present sizing requirement?
4. What % of the applications support anticipated conditions of congestion (for example, burst traffic that is typical to a region and focused demands that result from promotions and surveys)?
5. Does the current hardware and software configuration allow future expansion, and is it able to handle maintenance with minimum disruption to the production environment?
6. Does the System support flexibility for optimal load distribution on servers?
7. Does the system performance degrade when max number of users use the system simultaneously?
8. Does the system performance degrade when maximum number of users run reports?
9. Does the system support the number of transactions per day (X transactions, Y faults per day, Z network elements or circuits, support for multi-users)?
- integration capabilities database means (ICM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Do all the applications have standard application programming interfaces (APIs)?
2. Do the applications support industry-standard SNMP protocol to manage NE interface?
3. Do the applications provide web services out of the box?
4. Do all the applications have ready-to-go adaptors for ESB?
5. Are the application(s) capable of interfacing with OSS application hosted on any other platform?
6. Capability to support integration that is real time, batch, or near real time.
7. Capability to support multiple data interchange protocols like FTP, RPC, RMI etc.
- Adherence to industry standards evaluation database means
(ISM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Are the applications compliant with the TM Forum's NGOSS requirements?
2. Capability to support OSS/J or MTOSI?
3. Capability to support OSA/Parlay APIs?
4. Are the applications componentized?
5. Do all the applications have business logic separate from business process flow?
6. Do all the applications use an ESB/workflow engine for all process and integration requirements?
According to an embodiment of this invention, said cost derivation means (CDM) includes:
- License cost database means (LCM), adapted to host a plurality of
score receiving means, typically in relation to the following:
1. Do the systems support multiple licensing models?
- System integration cost database means (SICM), adapted to host a
plurality of score receiving means, typically in relation to the
following:
1. What is the % of Cost spent in configuration?
2. What is the % of cost in core customization?
3. What is the % of cost in building functional wrappers or adopters?
4. What is the % of cost in Professional services from vendor during development?
5. What is the % of cost in training users?
- Annual Maintenance cost database means (AMCM), adapted to
host a plurality of score receiving means, typically in relation to the
following:
1. What is the % of cost in extending warranty?
2. What is the % of cost in maintenance?
3. What is the % of cost in support [Number of changes delivered, etc.]?
4. What is the % of cost spent on support staff [Required for maintenance]?
5. Do the vendors deliver core customizations as part of AMC?
- Upgrade cost database means (UCM), adapted to host a plurality of
score receiving means, typically in relation to the following:
1. Are various upgrade plans and options available?
2. What is the % of cost spent in upgrades?
3. What is the % of cost in data migration?
4. What is the % cost of no upgrade of the product?
- Migration (to a different platform) cost database means (MCM),
adapted to host a plurality of score receiving means, typically in
relation to the following:
1. What is the % of Cost in software upgrade?
2. What is the % of Cost in professional services?
3. What is the % of Cost in deploying adaptors?
4. What is the % cost of software/hardware?
5. What is the % of Cost in License, maintenance, etc.?
6. What is the % of Cost in Warranty?
According to an embodiment of this invention, said vendor support means
(VSM) includes:
- database means (VSDM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Are all the systems covered under warranty support?
2. Is the contract covered under 24 x 7 support?
3. Is a 1-800 number provided for support?
4. Is support available both on-site and off-site?
5. Are the users trained on the products?
6. Is local support provided?
7. Are SLAs defined and agreed for vendor support?
8. What is the % of vendor revenue from services vs. licenses?
9. Are the product documents available?
10. Are the internal architecture and design made available?
11. Is the source code made available?
12. Are the adapters or implementation accelerators made available?
13. Brand image of the vendor?
According to an embodiment of this invention, said product roadmap means (PRM) includes:
- database means (PRDM), adapted to host a plurality of score receiving means, typically in relation to the following:
1. Product's current version's strength
2. Vendor's history in mandating upgrades to the product
3. Is the product road map available?
4. Number of years in future that the roadmap covers?
5. How many releases are covered in the roadmap?
- Does the vendor's past history show commitment to the
roadmap?
6. IT technology imperatives addressed by the roadmap
- SOA and web services, BPEL
7. Industry standards addressed by the roadmap
- TMFORUM, NGOSS, OSS/J, SID, TNA etc.
- These could include support for MTOSI specifications for integrating with the EMS layers for discovery and activation etc.
8. Does the product address current business drivers?
9. Does the product roadmap address the new services and features?
10. Does the product roadmap address the telecom network technology drivers?
11. What is the vendor-promised average availability of the system?
12. Is there failover support with distributed server technology?
In accordance with an additional embodiment of this invention, there is provided a ranking means (RNKM) adapted to provide a ranking score in relation to pre-defined parameters and the various scores received. The maturity level of the systems and processes in use, as well as the overall assessment, can thereby be thoroughly evaluated.
Weighted Maturity Rating is a measure of the areas where greater priority needs to be given in a system or an enterprise, in relation to its plurality of embodiments.
Weighted Maturity Rating = ((5 - Unweighted Maturity Level) * Assigned Category Weighting) * 100
where the Unweighted Maturity Level and the Assigned Category Weighting are obtained from the Aggregation Means (AGGM) and the Ranking Means (RNKM).
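A minimal sketch of this computation is given below; the formula is the one recited above, while the function name and the example values (a maturity level of 2 on an assumed 0-to-5 scale and a category weighting of 0.20) are illustrative assumptions only.

```python
# Direct transcription of the Weighted Maturity Rating formula above.
# The example inputs are illustrative assumptions, not figures from the
# specification.

def weighted_maturity_rating(unweighted_maturity_level: float,
                             assigned_category_weighting: float) -> float:
    """Higher ratings flag categories needing greater priority."""
    return (5 - unweighted_maturity_level) * assigned_category_weighting * 100


# A category at maturity level 2 carrying a 20% weighting:
print(weighted_maturity_rating(2, 0.20))  # 60.0
```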
In accordance with still an additional embodiment of this invention, there is provided an Intelligent Decision Making Tool (IDMT) adapted to read the ranks or scores from the Ranking Means (RNKM) and the Aggregation Means (AGGM), and to provide feedback (F/B) to the various means in order to provide a customized output.
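The following is a purely illustrative sketch of how such a feedback step might rank categories by their weighted maturity ratings; the threshold value and the helper name are assumptions of this sketch and are not defined in this specification.

```python
# Purely illustrative sketch of an IDMT-style feedback step: it reads the
# per-category weighted maturity ratings and flags the weakest categories
# back to the corresponding means. The threshold of 50 and the function
# name are assumptions, not taken from the specification.

def idmt_feedback(weighted_ratings: dict, threshold: float = 50.0) -> list:
    """Return categories whose weighted maturity rating exceeds the
    threshold, i.e. the areas needing the greatest priority, worst first."""
    return sorted((name for name, rating in weighted_ratings.items()
                   if rating > threshold),
                  key=lambda name: -weighted_ratings[name])


# Example: ratings produced by the formula above for each category.
ratings = {"functionality": 60.0, "architecture": 30.0, "vendor_support": 65.0}
print(idmt_feedback(ratings))  # ['vendor_support', 'functionality']
```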
While considerable emphasis has been placed herein on the particular features of this invention, it will be appreciated that various modifications can be made, and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other modifications in the nature of the invention or the preferred
embodiments will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.
MOHAN DEWAN,
OF R.K. DEWAN & CO., APPLICANTS' PATENT ATTORNEY