Abstract: A method (300) and a system (100) for securely processing and transmitting data of legacy devices (102) to a cloud server (110) are disclosed. The method (300) includes receiving (302) data from a legacy device. The data includes at least one video and a plurality of predefined data associated with each of the at least one video. The method (300) includes splitting (304) the at least one video into a plurality of video frames. The method (300) includes determining (306) a resolution associated with each of the plurality of video frames. The method (300) includes identifying (308) a matching template from a plurality of prestored templates based on the determined resolution and details of the legacy device. The method (300) further includes masking (310) at least one portion of each of the plurality of video frames based on the matching template identified.
[001] This disclosure relates generally to secure transmission of data, and more particularly to a method and system for securely processing and transmitting data of legacy devices to a cloud server.
Background
[001] In current times, most diagnostic devices, such as medical devices, may have Internet of Things (IoT) capabilities or may be connected to a network or the Internet for remote transmission of data captured therein. However, a large number of legacy medical devices are still in use, especially in underdeveloped and developing countries. These legacy devices lack the capability to remotely transmit data to remote locations or devices. Moreover, the legacy devices do not have the capability to identify and protect sensitive data captured therein. To further elaborate, accessing data from such legacy devices is a challenging task, as these legacy devices may not be connected to any cloud server or the Internet. Conventional systems fail to perform remote transmission of data from existing legacy devices while ensuring protection of sensitive data captured by these legacy devices.
SUMMARY
[002] In an embodiment, a method for securely processing and transmitting data of legacy devices to a cloud server is disclosed. In one example, the method may include receiving data from a legacy device. The data includes at least one video and a plurality of predefined data associated with each of the at least one video. The method may further include splitting the at least one video into a plurality of video frames. The method may further include determining a resolution associated with each of the plurality of video frames. The method may further include identifying, for each of the plurality of video frames, a matching template from a plurality of prestored templates based on the determined resolution and details of the legacy device. Each of the plurality of prestored templates is mapped to a predefined resolution and legacy device. The method may further include masking at least one portion of each of the plurality of video frames based on the matching template identified. Each of the plurality of prestored templates includes information related to the location of sensitive data and non-sensitive data within a corresponding video frame, and the masked at least one portion corresponds to the sensitive data.
[003] In another embodiment, a system for securely processing and transmitting data of legacy devices to a cloud server is disclosed. In one example, the system may include a gateway device comprising a processor and a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, may cause the processor to receive data from a legacy device. The data includes at least one video and a plurality of predefined data associated with each of the at least one video. The processor-executable instructions, on execution, may further cause the processor to split the at least one video into a plurality of video frames. The processor-executable instructions, on execution, may further cause the processor to determine a resolution associated with each of the plurality of video frames. The processor-executable instructions, on execution, may further cause the processor to identify, for each of the plurality of video frames, a matching template from a plurality of prestored templates based on the determined resolution and details of the legacy device. Each of the plurality of prestored templates is mapped to a predefined resolution and legacy device. The processor-executable instructions, on execution, may further cause the processor to mask at least one portion of each of the plurality of video frames based on the matching template identified. Each of the plurality of prestored templates includes information related to the location of sensitive data and non-sensitive data within a corresponding video frame, and the masked at least one portion corresponds to the sensitive data.
[004] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[005] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.
[006] FIG. 1 is a block diagram illustrating a system for securely processing and transmitting data of legacy devices to a cloud server, in accordance with some embodiments.
[007] FIG. 2 is a block diagram illustrating various modules within a memory of a gateway device configured to securely process and transmit data of legacy devices to a cloud server, in accordance with some embodiments.
[008] FIG. 3 illustrates a flowchart of a method for securely processing and transmitting data of legacy devices to a cloud server, in accordance with some embodiments.
[009] FIG. 4 illustrates secure processing and transmission of data of legacy devices to a cloud server, in accordance with some exemplary embodiments.
[010] FIG. 5 illustrates a flowchart of a method for transmitting data of legacy devices to a cloud server, in accordance with some embodiments.
[011] FIG. 6 illustrates a flowchart of a method for transmitting data of legacy devices to a cloud server, in accordance with some other embodiments.
[012] FIG. 7 illustrates a flowchart of a method for assigning tags to masked portions and unmasked portions in video frames, in accordance with some embodiments.
[013] FIG. 8 illustrates a flowchart of a method for providing access to a user of masked portions and unmasked portions of video frames, in accordance with some embodiments.
[014] FIG. 9 illustrates a flowchart of a method for evaluating user access to masked portions and unmasked portions of video frames, in accordance with some embodiments.
[015] FIG. 10 illustrates a flowchart of a method for evaluating user access to masked portions and unmasked portions of video frames, in accordance with some embodiments.
[016] FIG. 11 illustrates selective access granted to authorized and unauthorized users of sensitive data from video frames stored in a cloud server, in accordance with some embodiments.
[017] FIG. 12 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
DETAILED DESCRIPTION
[018] Exemplary embodiments are described with reference to the accompanying drawings. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims. Additional illustrative embodiments are listed below.
[019] Referring now to FIG. 1, a block diagram of a system 100 for securely processing and transmitting data of one or more legacy devices 102 to a cloud server 110 is illustrated, in accordance with some embodiments. The system 100 may include a gateway device 104 that may be configured to securely process and transmit data of the one or more legacy devices 102 to the cloud server 110. Examples of the gateway device 104 may include, but are not limited to, a hub, a switch, a bridge, a router, a modem, or a repeater. The legacy devices 102 may be medical devices that may include, but are not limited to, an electrocardiogram (ECG), an Electronic Fetal Monitoring (EFM) device, an echocardiogram (ECHO), an electroencephalogram (EEG), an ultrasound, a Magnetic Resonance Imaging (MRI) device, a computerized tomography (CT) device, a cardiology device, and a blood pressure monitoring device. The one or more legacy devices 102 may not have inbuilt wireless connectivity. Each of the one or more legacy devices 102 may be communicatively connected to the gateway device 104 via a wired connection, for example, a High-Definition Multimedia Interface (HDMI) cable or a Video Graphics Array (VGA) cable. The wired connection may be configured to capture data (for example, medical data) from the one or more legacy devices 102. The data may be multimedia data, for example, video data, audio data, images, or a combination thereof.
[020] Additionally, the gateway device 104 and the cloud server 110 may be communicatively coupled with each other through a network (not shown in FIG. 1). The network may be a wired or a wireless network and examples may include, but are not limited to, the Internet, Wireless Local Area Network (WLAN), Wi-Fi, Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WiMAX), Bluetooth, and General Packet Radio Service (GPRS). The gateway device 104 may receive a plurality of predefined data from the one or more legacy devices 102 and may share the same with the cloud server 110. The plurality of predefined data may include, but is not limited to, Identifiers (IDs) of the one or more legacy devices 102, locations of the one or more legacy devices 102, names of the one or more legacy devices 102, makes or models of the one or more legacy devices 102, or the current day and time.
[021] The gateway device 104 may include a processor 106 that is communicatively coupled to a memory 108, which may be a non-volatile memory or a volatile memory. Examples of the non-volatile memory may include, but are not limited to, a flash memory, a Read Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), and an Electrically EPROM (EEPROM) memory. Examples of the volatile memory may include, but are not limited to, a Dynamic Random Access Memory (DRAM) and a Static Random-Access Memory (SRAM). The memory 108 may further include various modules that enable the gateway device 104 to securely process and transmit data received from the one or more legacy devices 102 to the cloud server 110. These modules are explained in detail in conjunction with FIG. 2.
[022] The gateway device 104 may include a display 112 that includes a user interface 114. A user may interact with the gateway device 104 via the user interface 114. Thus, the user interface 114 may allow the user to remotely access the data of each of the one or more legacy devices 102 via the gateway device 104.
[023] The system 100 may further include electronic devices 116 that may be communicatively coupled to the cloud server 110 and/or to the gateway device 104. Examples of the electronic devices 116 may include, but are not limited to, a smart phone, a laptop, a tablet, a desktop, or any other smart device. The electronic devices 116 may thus be able to access data captured by the one or more legacy devices 102 via the cloud server 110 or the gateway device 104.
[024] Referring now to FIG. 2, a block diagram of various modules within the memory 108 of the gateway device 104 configured to securely process and transmit data of legacy devices to the cloud server 110 is illustrated, in accordance with some embodiments. The memory 108 includes a Virtual Network Computing (VNC) enabling module 202, a video matching module 204, a template matching module 206, a masking module 208, a water marking module 210, a geo matching module 212, and a workflow management module 214.
[025] The VNC enabling module 202 may be configured to wirelessly transmit the data received from the one or more legacy devices 102 to the cloud server 110 via a network. The VNC enabling module 202 may further allow a user to remotely perform operations on the data captured by each of the one or more legacy devices 102 through a keyboard and a mouse simulator for further processing. The data may include at least one video and a plurality of predefined data associated with each of the at least one video. In some embodiments, the at least one video may be captured and stored in the memory 108 for further processing. The predefined data may include at least one of an Identifier (ID) of the legacy device, a location of the legacy device, a name of the legacy device, a make or model of the legacy device, or the current day and time.
[026] The video matching module 204 may split each of the at least one video into a plurality of video frames (for example, into multiple screens) and may further determine a resolution associated with each of the plurality of video frames. By way of an example, the video matching module 204 may classify the at least one video into multiple screens irrespective of the resolution being captured and may further match each of the multiple screens with the corresponding resolutions. It may be noted that the at least one video may be processed in streams, which may help in handling storage or memory constraints.
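By way of a non-limiting illustration, the splitting and resolution determination performed by the video matching module 204 may be sketched as follows. The sketch assumes frames are represented as NumPy arrays; the function and variable names are hypothetical and not part of the disclosure.

```python
import numpy as np

def split_video_into_frames(video):
    """Split a video (a frames x height x width x channels array) into
    a list of individual frames, as the video matching module might do."""
    return [video[i] for i in range(video.shape[0])]

def frame_resolution(frame):
    """Return the (width, height) resolution of a single video frame."""
    height, width = frame.shape[:2]
    return (width, height)

# Hypothetical 10-frame, 640x480 RGB video captured from a legacy device.
video = np.zeros((10, 480, 640, 3), dtype=np.uint8)
frames = split_video_into_frames(video)
resolutions = [frame_resolution(f) for f in frames]
```

In a streaming implementation, frames would be processed one at a time rather than materialized as a list, consistent with the storage constraints noted above.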
[027] For each of the plurality of video frames, the template matching module 206 may identify a matching template from a plurality of prestored templates based on the determined resolution and details of the legacy device. For example, the at least one video of the one or more legacy devices may be bifurcated into multiple video frames (or screens). The template matching module 206 may intercept these video frames and compare them with the plurality of prestored templates. Each of the plurality of prestored templates is mapped to a predefined resolution and legacy device. Based on the comparison, the template matching module 206 may find an exact template match along with the associated data for further processing. It may be noted that video frames that do not match any template may not be selected for further processing.
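The template lookup described above may be illustrated with a minimal sketch in which each prestored template is keyed by a (resolution, device model) pair. The template contents, device names, and region coordinates below are hypothetical examples, not values from the disclosure.

```python
# Hypothetical prestored templates, each mapped to a predefined
# resolution and legacy device; regions are (x, y, width, height).
PRESTORED_TEMPLATES = {
    ((640, 480), "ECG-100"): {
        "sensitive": [(0, 0, 200, 40)],       # e.g., patient-name banner
        "non_sensitive": [(0, 40, 640, 440)], # e.g., waveform area
    },
    ((1280, 720), "ECHO-7"): {
        "sensitive": [(0, 0, 300, 60)],
        "non_sensitive": [(0, 60, 1280, 660)],
    },
}

def identify_matching_template(resolution, device_model,
                               templates=PRESTORED_TEMPLATES):
    """Return the template mapped to this exact resolution and legacy
    device, or None when no match exists (the frame is then skipped)."""
    return templates.get((resolution, device_model))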
[028] The masking module 208 may mask at least one portion of each of the plurality of video frames based on the matching template identified. Each of the plurality of prestored templates may include information related to the location of sensitive data and non-sensitive data within a corresponding video frame. It may be noted that the masked at least one portion corresponds to the sensitive data. As an example, the masking module 208 may only mask the at least one portion of each of the plurality of video frames which may include the sensitive data. The information related to the location of the at least one portion which may include the sensitive data and the location of the at least one portion which may include the non-sensitive data within the video frame may already be known and stored in the prestored template. The sensitive data may correspond to Personally Identifiable Information (PII) data, and the non-sensitive data may correspond to non-PII data. This is further explained in detail in conjunction with FIG. 4.
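As an illustrative sketch of the masking step, the sensitive regions recorded in a matched template may be blacked out in place while the rest of the frame is left untouched. Frames are assumed to be NumPy arrays and region coordinates (x, y, width, height) rectangles; these representations are assumptions for illustration only.

```python
import numpy as np

def mask_sensitive_regions(frame, template):
    """Black out the regions that a matched template marks as sensitive,
    leaving the non-sensitive regions of the frame unmodified."""
    masked = frame.copy()
    for (x, y, w, h) in template["sensitive"]:
        masked[y:y + h, x:x + w] = 0  # zero pixels = masked (black)
    return masked

# Hypothetical all-white 640x480 frame with a sensitive top-left banner.
frame = np.full((480, 640, 3), 255, dtype=np.uint8)
template = {"sensitive": [(0, 0, 200, 40)],
            "non_sensitive": [(0, 40, 640, 440)]}
masked_frame = mask_sensitive_regions(frame, template)
```

A production implementation might instead blur or overlay the region, but zeroing the pixels is the simplest form of masking.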
[029] The water marking module 210 may mark tags on each of the masked at least one portion and each of the unmasked at least one portion. For example, a first tag may be water marked to the sensitive data and a second tag may be water marked to the non-sensitive data within the corresponding video frame. By way of an example, when the legacy device is a medical device (for example, a cardiology device) and patient data associated with the medical device is considered to be sensitive, then the patient data may be watermarked as “sensitive data”.
[030] Further, the geo matching module 212 may identify the predefined data associated with each of the one or more legacy devices 102, which may include, but is not limited to, at least one of an Identifier (ID) of the legacy device, a location of the legacy device, a name of the legacy device, a make or model of the legacy device, or the current day and time. It may be noted that such predefined data may be stored in the memory 108 and may further be transmitted to the cloud server 110 whenever requested by at least one user.
[031] The workflow management module 214 may be configured to manage the PII and non-PII data associated with each of the plurality of video frames. In some embodiments, the workflow management module 214 may include a local admin that may have the right of controlling the PII and non-PII data via a web browser.
[032] The memory 108 may further include a data switch 216. The data switch 216 may be an internal part of the gateway device 104 that may have the right to determine which data, for example, the PII data 218, the non-PII data 220, or both, are to be transmitted to the cloud server 110 based on information received from the workflow management module 214. This is further explained in conjunction with FIG. 4.
[033] It should be noted that all such aforementioned modules 202 – 214 may be represented as a single module or a combination of different modules. Further, as will be appreciated by those skilled in the art, each of the modules 202 – 214 may reside, in whole or in parts, on one device or multiple devices in communication with each other. In some embodiments, each of the modules 202 – 214 may be implemented as dedicated hardware circuit comprising custom application-specific integrated circuit (ASIC) or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. Each of the modules 202 – 214 may also be implemented in a programmable hardware device such as a field programmable gate array (FPGA), programmable array logic, programmable logic device, and so forth. Alternatively, each of the modules 202 – 214 may be implemented in software for execution by various types of processors (e.g., processor 106). An identified module of executable code may, for instance, include one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executables of an identified module or component need not be physically located together, but may include disparate instructions stored in different locations which, when joined logically together, include the module and achieve the stated purpose of the module. Indeed, a module of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
[034] As will be appreciated by one skilled in the art, a variety of processes may be employed for securely processing and transmitting data of legacy devices to a cloud server. For example, the exemplary system 100 and the associated gateway device 104 may securely process and transmit data of legacy devices to a cloud server by the processes discussed herein. In particular, as will be appreciated by those of ordinary skill in the art, control logic and/or automated routines for performing the techniques and steps described herein may be implemented by the system 100 and the associated gateway device 104 either by hardware, software, or combinations of hardware and software. For example, suitable code may be accessed and executed by the one or more processors on the system 100 to perform some or all of the techniques described herein. Similarly, application specific integrated circuits (ASICs) configured to perform some or all of the processes described herein may be included in the one or more processors on the system 100.
[035] Referring now to FIG. 3, a flowchart of a method 300 for securely processing and transmitting data of legacy devices to a cloud server is illustrated, in accordance with some embodiments. The method 300 may be implemented by the gateway device 104 of the system 100. At step 302, data is received from a legacy device. It may be noted that the data includes at least one video and a plurality of predefined data associated with each of the at least one video. In some embodiments, the at least one video and the plurality of predefined data associated with each of the at least one video are received through an HDMI/VGA cable.
[036] At step 304, the at least one video is split into a plurality of video frames (or screens). It may be noted that, in some embodiments, the at least one video may be split into multiple frames irrespective of the resolutions being received. At step 306, a resolution associated with each of the plurality of video frames may be determined. Based on the determined resolution and details of the legacy device, at step 308, a matching template is identified for each of the plurality of video frames from a plurality of prestored templates. Each of the plurality of prestored templates may be mapped to a predefined resolution and legacy device.
[037] At step 310, at least one portion of each of the plurality of video frames is masked based on the matching template identified. Each of the plurality of prestored templates may include information related to location of sensitive data and non-sensitive data within a corresponding video frame. The information, for example, may include coordinates of the sensitive data and the non-sensitive data within corresponding video frame. It may be noted that the masked at least one portion corresponds to the sensitive data.
[038] In some embodiments, the gateway device 104 may transmit each of the plurality of video frames to a cloud server in response to masking. The transmitting of each of the plurality of video frames is explained in detail in conjunction with FIG. 5 and FIG. 6.
[039] Referring now to FIG. 4, secure processing and transmission of data of legacy devices to a cloud server is illustrated, in accordance with some exemplary embodiments. The system 400 may include one or more legacy devices 402, a gateway device 404, a cloud server 410, and a user interface 412. In an embodiment, the one or more legacy devices 402, the gateway device 404, the cloud server 410, and the user interface 412 may be respectively analogous to the one or more legacy devices 102, the gateway device 104, the cloud server 110, and the user interface 114 of the system 100. As mentioned earlier, each of the plurality of prestored templates may include information related to the location of sensitive data and non-sensitive data within a corresponding video frame. A video frame 406 may be matched with a prestored template and, based on that, the location of sensitive data 406a and non-sensitive data 406b may be identified within the video frame 406, as is depicted in FIG. 4. This has already been explained in detail in conjunction with FIGs. 1, 2, and 3.
[040] The gateway device 404 may mask the portion of the video frame 406 that may include the sensitive data 406a, while the portion that includes the non-sensitive data 406b may be left unmasked by the gateway device 404. The video frame 406 may then be transmitted to the cloud server 410 after the masking. To further elaborate, the gateway device 404 may include a data switch 408 that may be configured to switch the data between PII and non-PII. It may be noted that the sensitive data may correspond to the PII data, and the non-sensitive data may correspond to the non-PII data. Thus, when the data switch 408 enables a non-PII mode, only the non-sensitive data 406b may be transmitted to the cloud server 410. Only when the data switch 408 enables a PII mode may the sensitive data 406a be transmitted to the cloud server 410.
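The mode-dependent behavior of the data switch described above may be sketched as a simple routing function. The mode strings and portion names below are illustrative assumptions, not identifiers from the disclosure.

```python
def data_switch(frame_portions, mode):
    """Decide which portions of a processed frame leave the gateway:
    in 'non_pii' mode only the non-sensitive portion is transmitted,
    while in 'pii' mode both portions are transmitted."""
    if mode == "pii":
        return {"sensitive": frame_portions["sensitive"],
                "non_sensitive": frame_portions["non_sensitive"]}
    # Default (non-PII) mode: the sensitive portion is retained
    # on the gateway and never reaches the cloud server.
    return {"non_sensitive": frame_portions["non_sensitive"]}

# Hypothetical portions of one masked video frame.
portions = {"sensitive": "patient-name-region",
            "non_sensitive": "waveform-region"}
outgoing = data_switch(portions, "non_pii")
```

In this sketch the sensitive portion simply never appears in the outgoing payload when the non-PII mode is enabled, mirroring the retention behavior of the data switch 408.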
[041] It may be noted that the sensitive data 406a and the non-sensitive data 406b of the video frame 406 transmitted to the cloud server 410 may be stored in a separate database. For example, the sensitive data 406a may be stored in a PII specific database (not shown FIG. 4) and the non-sensitive data 406b may be stored in a non-PII specific database (not shown FIG. 4) within the cloud server 410. Each of the plurality of video frames transmitted to the cloud server 410 may further be accessed by at least one user through the user interface 412. In some embodiments, access to each of the plurality of video frames may be based on Role-based access control (RBAC). This is further explained in conjunction with FIG. 8.
[042] In some embodiments, each of the plurality of video frames that may be located in the gateway device 404 may be directly accessed by the at least one user through the user interface 412, without being transmitted to the cloud server 410. The user interface 412 may further generate a report which may be shared with the at least one user. The report may include the plurality of video frames and the associated ID of the legacy device, location of the legacy device, name of the legacy device, make or model of the legacy device, or current day and time.
[043] Referring now to FIG. 5, a flowchart of a method 500 for transmitting data of legacy devices to a cloud server is illustrated, in accordance with some embodiments. As mentioned above in FIG. 3, the at least one portion of each of the plurality of video frames may be masked based on the matching template identified. At step 502, each of the plurality of video frames may be transmitted to the cloud server 110 in response to masking.
[044] The step 502 further includes a step 502a and a step 502b. At step 502a, the masked at least one portion of each of the plurality of video frames may be retained by the gateway device 104 before transmission. At step 502b, an unmasked at least one portion from each of the plurality of video frames may be transmitted from the gateway device 104 to the cloud server 110. It may be noted that the masked at least one portion corresponds to sensitive data and the unmasked at least one portion corresponds to non-sensitive data. It may further be noted that the transmitting and retaining of each of the plurality of video frames may be controlled by the data switch 408.
[045] Referring now to FIG. 6, a flowchart of a method 600 for transmitting data of legacy devices to a cloud server is illustrated, in accordance with some other embodiments. At step 602, each of the plurality of video frames may be transmitted to the cloud server 110. In this embodiment, the masked at least one portion and the unmasked at least one portion of each of the plurality of video frames may be simultaneously transmitted from the gateway device 104 to the cloud server 110, at step 602a.
[046] In an exemplary scenario, when a user may have access to only the unmasked at least one portion, the data switch 408 may retain the masked at least one portion of each of the plurality of video frames within the gateway device 404 and transmit the unmasked at least one portion from each of the plurality of video frames from the gateway device 404 to the cloud server 410.
[047] In another exemplary scenario, when a user may have access to both the masked at least one portion and the unmasked at least one portion, the data switch 408 may simultaneously transmit the masked at least one portion and the unmasked at least one portion of each of the plurality of video frames from the gateway device 404 to the cloud server 410.
[048] Referring now to FIG. 7, a flowchart of a method 700 for assigning tags to masked portions and unmasked portions in video frames is illustrated, in accordance with some embodiments. Upon simultaneously transmitting the masked at least one portion and the unmasked at least one portion of each of the plurality of video frames from the gateway device 104 to the cloud server 110, at step 702, for each of the plurality of video frames stored in the cloud server 110, a first tag may be assigned to each of the masked at least one portion and a second tag may be assigned to each of the unmasked at least one portion. It may be noted that the masked at least one portion corresponds to the sensitive data and the unmasked at least one portion corresponds to the non-sensitive data.
[049] Thereafter, based on the assigned first and second tags, for each of the plurality of video frames stored in the cloud server 110, access to at least one of the masked at least one portion and the unmasked at least one portion may be provided to at least one user, at step 704. In other words, the first and second tags may be used to determine whether the at least one user has access to the masked at least one portion. This is further explained in conjunction with FIG. 8.
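The tag assignment of step 702 may be sketched as follows. The tag strings and the dictionary representation of portions are hypothetical conventions for illustration, not identifiers from the disclosure.

```python
def assign_tags(portions):
    """Step 702 sketch: assign a first tag to each masked (sensitive)
    portion and a second tag to each unmasked (non-sensitive) portion
    of a video frame stored in the cloud server."""
    tagged = []
    for portion in portions:
        tag = "TAG-1-SENSITIVE" if portion["masked"] else "TAG-2-NON-SENSITIVE"
        tagged.append({**portion, "tag": tag})
    return tagged

# Hypothetical portions of one stored frame: a masked banner region
# and an unmasked waveform region.
tagged = assign_tags([{"region": "banner", "masked": True},
                      {"region": "waveform", "masked": False}])
```

At step 704, these tags would then be consulted to decide which portions a given user may view.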
[050] Referring now to FIG. 8, a flowchart of a method for providing access to a user of masked portions and unmasked portions of video frames is illustrated, in accordance with some embodiments. At step 802, access to the masked at least one portion may be provided to a user from the at least one user. The step 802 further includes steps 802a, 802b, and 802c. In an embodiment, the access provided to the user may be based on Role-Based Access Control (RBAC). At step 802a, a role associated with the user may be identified. At step 802b, access rights associated with the role may be determined. At step 802c, the user's access to at least one of the masked at least one portion and the unmasked at least one portion associated with a video frame from the plurality of video frames may be evaluated, based on the determined access rights. The evaluating of the user access is further explained in conjunction with FIG. 9 and FIG. 10.
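Steps 802a through 802c may be sketched as a role lookup followed by a rights check. The roles and rights below are hypothetical examples of an RBAC policy, not values specified in the disclosure.

```python
# Hypothetical role-to-rights mapping for the RBAC evaluation.
ROLE_RIGHTS = {
    "physician": {"masked", "unmasked"},   # may view sensitive data
    "technician": {"unmasked"},            # non-sensitive data only
}

def evaluate_access(role, portion_kind, role_rights=ROLE_RIGHTS):
    """Sketch of steps 802a-802c: given the user's identified role,
    determine the access rights associated with that role, then
    evaluate whether the requested portion kind may be viewed."""
    rights = role_rights.get(role, set())  # unknown roles get no rights
    return portion_kind in rights
```

Granting or denying access to the masked portion (the cases shown in FIG. 9 and FIG. 10) then reduces to the boolean result of this evaluation.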
[051] Referring now to FIG. 9, a flowchart of a method for evaluating user access to masked portions and unmasked portions of video frames is illustrated, in accordance with some embodiments. At step 902, the user's access to the masked at least one portion and the unmasked at least one portion is evaluated. In this case, the user does not have access to the masked at least one portion. Thus, at step 902a, the user's access to the masked at least one portion may be denied and, at step 902b, the user's access to the unmasked at least one portion is granted. With reference to FIG. 10, at step 1002, a user's access to the masked at least one portion and the unmasked at least one portion is evaluated. In this case, the user has access to the masked at least one portion. Thus, at step 1002a, the user may be granted access to both the masked at least one portion and the unmasked at least one portion. This is further explained in detail in conjunction with FIG. 11.
[052] Referring now to FIG. 11, selective access granted to authorized and unauthorized users of sensitive data from video frames stored in a cloud server is illustrated. As mentioned earlier in FIG. 4, in response to masking, the video frame 406 may be transmitted from the gateway device 404 to the cloud server 410. The transmitted video frame 406 stored in the cloud server 410 is shown in FIG. 11. The video frame 406 may include the sensitive data 406a and the non-sensitive data 406b. It may be noted that tags may be assigned to each of the sensitive data and each of the non-sensitive data corresponding to the plurality of video frames. For example, a first tag may be assigned to the sensitive data 406a, and a second tag may be assigned to the non-sensitive data 406b of the video frame 406. The first and second tags may be assigned in order to enable role-based access to the user.
[053] In an embodiment, upon presenting the video frame 406 to an authorized user 1102 and an unauthorized user 1104, a selective access of the sensitive data 406a from the video frame 406 stored in the cloud server 410 may be granted. By way of an example, when the video frame 406 is provided to the unauthorized user 1104, the sensitive data 406a may be masked based on identifying the tag associated with the sensitive data 406a and the unauthorized user 1104 may only have access to the non-sensitive data 406b.
[054] When the video frame 406 is provided to the authorized user 1102, as depicted, the authorized user 1102 may have access to both the sensitive data 406a and the non-sensitive data 406b. In other words, the sensitive data 406a may be masked only when accessed by the unauthorized user 1104. The authorized user 1102 may thus have access to the complete video frame 406 that includes both the sensitive data 406a and the non-sensitive data 406b.
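The tag-based selective access of FIG. 11 may be sketched as follows. This is an illustrative stand-in, not the specification's implementation; the tag values, the `Region` type, and the `serve_frame` helper are hypothetical, and a frame is modeled as a 2-D list of pixel values.

```python
# Hypothetical sketch of serving a stored frame with tag-based selective
# masking (FIG. 11): regions carrying the first (sensitive) tag are blacked
# out for unauthorized users; authorized users receive the frame unchanged.

from dataclasses import dataclass

SENSITIVE_TAG = "tag-1"       # first tag: sensitive data (e.g., 406a)
NON_SENSITIVE_TAG = "tag-2"   # second tag: non-sensitive data (e.g., 406b)

@dataclass
class Region:
    tag: str
    bbox: tuple  # (x, y, width, height) within the video frame

def serve_frame(frame, regions, authorized: bool):
    """Return the frame as seen by a user: sensitive regions are zeroed
    out for unauthorized users; authorized users get the full frame."""
    if authorized:
        return frame
    out = [row[:] for row in frame]  # copy so the stored frame is untouched
    for r in regions:
        if r.tag == SENSITIVE_TAG:
            x, y, w, h = r.bbox
            for row in out[y:y + h]:
                row[x:x + w] = [0] * w  # mask pixels of the tagged region
    return out
```

For example, with a 4×4 frame of ones and a sensitive region covering the top-left 2×2 block, an unauthorized user receives zeros in that block while the rest of the frame, and the stored copy, remain intact.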
[055] As will be also appreciated, the above described techniques may take the form of computer or controller implemented processes and apparatuses for practicing those processes. The disclosure can also be embodied in the form of computer program code containing instructions embodied in tangible media, such as floppy diskettes, solid state drives, CD-ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer or controller, the computer becomes an apparatus for practicing the invention. The disclosure may also be embodied in the form of computer program code or signal, for example, whether stored in a storage medium, loaded into and/or executed by a computer or controller, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
[056] The disclosed methods and systems may be implemented on a conventional or a general-purpose computer system, such as a personal computer (PC) or server computer. Referring now to FIG. 12, an exemplary computing system 1200 that may be employed to implement processing functionality for various embodiments (e.g., as a SIMD device, client device, server device, one or more processors, or the like) is illustrated. Those skilled in the relevant art will also recognize how to implement the invention using other computer systems or architectures. The computing system 1200 may represent, for example, a user device such as a desktop, a laptop, a mobile phone, personal entertainment device, DVR, and so on, or any other type of special or general-purpose computing device as may be desirable or appropriate for a given application or environment. The computing system 1200 may include one or more processors, such as a processor 1202 that may be implemented using a general or special purpose processing engine such as, for example, a microprocessor, microcontroller or other control logic. In this example, the processor 1202 is connected to a bus 1204 or other communication medium. In some embodiments, the processor 1202 may be an Artificial Intelligence (AI) processor, which may be implemented as a Tensor Processing Unit (TPU), or a graphics processing unit, or a custom programmable solution such as a Field-Programmable Gate Array (FPGA).
[057] The computing system 1200 may also include a memory 1206 (main memory), for example, Random Access Memory (RAM) or other dynamic memory, for storing information and instructions to be executed by the processor 1202. The memory 1206 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1202. The computing system 1200 may likewise include a read only memory (“ROM”) or other static storage device coupled to bus 1204 for storing static information and instructions for the processor 1202.
[058] The computing system 1200 may also include one or more storage devices 1208, which may include, for example, a media drive 1210 and a removable storage interface. The media drive 1210 may include a drive or other mechanism to support fixed or removable storage media, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an SD card port, a USB port, a micro USB, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive. A storage media 1212 may include, for example, a hard disk, magnetic tape, flash drive, or other fixed or removable medium that is read by and written to by the media drive 1210. As these examples illustrate, the storage media 1212 may include a computer-readable storage medium having stored therein particular computer software or data.
[059] In alternative embodiments, the storage devices 1208 may include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into the computing system 1200. Such instrumentalities may include, for example, a removable storage unit 1214 and a storage unit interface 1216, such as a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, and other removable storage units and interfaces that allow software and data to be transferred from the removable storage unit 1214 to the computing system 1200.
[060] The computing system 1200 may also include a communications interface 1218. The communications interface 1218 may be used to allow software and data to be transferred between the computing system 1200 and external devices. Examples of the communications interface 1218 may include a network interface (such as an Ethernet or other NIC card), a communications port (such as for example, a USB port, a micro USB port), Near Field Communication (NFC), etc. Software and data transferred via the communications interface 1218 are in the form of signals which may be electronic, electromagnetic, optical, or other signals capable of being received by the communications interface 1218. These signals are provided to the communications interface 1218 via a channel 1220. The channel 1220 may carry signals and may be implemented using a wireless medium, wire or cable, fiber optics, or other communications medium. Some examples of the channel 1220 may include a phone line, a cellular phone link, an RF link, a Bluetooth link, a network interface, a local or wide area network, and other communications channels.
[061] The computing system 1200 may further include Input/Output (I/O) devices 1222. Examples may include, but are not limited to a display, keypad, microphone, audio speakers, vibrating motor, LED lights, etc. The I/O devices 1222 may receive input from a user and also display an output of the computation performed by the processor 1202. In this document, the terms “computer program product” and “computer-readable medium” may be used generally to refer to media such as, for example, the memory 1206, the storage devices 1208, the removable storage unit 1214, or signal(s) on the channel 1220. These and other forms of computer-readable media may be involved in providing one or more sequences of one or more instructions to the processor 1202 for execution. Such instructions, generally referred to as “computer program code” (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system 1200 to perform features or functions of embodiments of the present invention.
[062] In an embodiment where the elements are implemented using software, the software may be stored in a computer-readable medium and loaded into the computing system 1200 using, for example, the removable storage unit 1214, the media drive 1210 or the communications interface 1218. The control logic (in this example, software instructions or computer program code), when executed by the processor 1202, causes the processor 1202 to perform the functions of the invention as described herein.
[063] As will be appreciated by those skilled in the art, the techniques described in the various embodiments discussed above are not routine, or conventional, or well understood in the art. The techniques discussed above provide for securely processing and transmitting data of legacy devices to a cloud server. The techniques may track and trace data of the legacy devices seamlessly and in a secured manner. The legacy devices may be in a non-network connected environment. Therefore, the above-mentioned techniques may provide a gateway device for wirelessly connecting the legacy devices to the cloud server and transmitting their data in a secure manner. The techniques may mask at least one portion of the transmitted data in order to protect sensitive data from being accessed by an unauthorized user. The techniques may further provide maximum security and minimum configuration overhead for video or audio data, reduce overall storage space of a system without compromising data, ensure that data on the legacy devices remains safe and is not tampered with at any point of time, and make the system robust for event traceability.
[064] In light of the above-mentioned advantages and the technical advancements provided by the disclosed method and system, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps provide solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the device itself, as the claimed steps provide a technical solution to a technical problem.
[065] The specification has described a method and system for securely processing and transmitting data of legacy devices to a cloud server. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[066] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[067] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.
CLAIMS
We claim:
1. A method (300) for securely processing and transmitting data of legacy devices (102) to a cloud server (110), the method (300) comprising:
receiving (302), by a gateway device (104), data from a legacy device, wherein the data comprises at least one video and a plurality of predefined data associated with each of the at least one video;
splitting (304), by the gateway device (104), the at least one video into a plurality of video frames;
determining (306), by the gateway device (104), a resolution associated with each of the plurality of video frames;
identifying (308) for each of the plurality of video frames, by the gateway device (104), a matching template from a plurality of prestored templates based on the identified resolution and details of the legacy device, wherein each of the plurality of prestored templates is mapped to a predefined resolution and legacy device; and
masking (310), by the gateway device (104), at least one portion of each of the plurality of video frames based on the matching template identified, wherein each of the plurality of prestored templates comprise information related to location of sensitive data (406a) and non-sensitive data (406b) within a corresponding video frame, and wherein the masked at least one portion corresponds to the sensitive data (406a).
2. The method (300) as claimed in claim 1, wherein the predefined data comprises at least one of Identifier (ID) of the legacy device, location of the legacy device, name of the legacy device, make or model of the legacy device, or current day and time.
3. The method (300) as claimed in claim 1, further comprising transmitting (502) each of the plurality of video frames to a cloud server (110) in response to masking, wherein transmitting comprises one of:
retaining (502a) the masked at least one portion of each of the plurality of video frames within the gateway device;
transmitting (502b) an unmasked at least one portion from each of the plurality of video frames from the gateway device to the cloud server (110), wherein the unmasked at least one portion corresponds to non-sensitive data (406b); or
simultaneously transmitting (602a) the masked at least one portion and the unmasked at least one portion of each of the plurality of video frames from the gateway device to the cloud server (110).
4. The method (300) as claimed in claim 3, further comprising:
assigning (702), for each of the plurality of video frames, a first tag to each of the masked at least one portion and a second tag to each of the unmasked at least one portion;
providing (704) to at least one user, for each of the plurality of video frames stored in the cloud server (110), access to at least one of: the masked at least one portion and the unmasked at least one portion, wherein providing access (802) to a user from the at least one user comprises:
identifying (802a) a role associated with the user;
determining (802b) access rights associated with the role; and
evaluating (802c) the user access to at least one of the masked at least one portion and the unmasked at least one portion associated with a video frame from the plurality of video frames, based on the determined access rights.
5. The method as claimed in claim 4, wherein evaluating the user access comprises one of:
denying (902a) the user access to the masked at least one portion;
granting (902b) the user access to the unmasked at least one portion; or
granting (1002a) the user access to the masked at least one portion and the unmasked at least one portion.
6. The method (300) as claimed in claim 1, wherein the sensitive data (406a) corresponds to Personally Identifiable Information (PII) data, and wherein the non-sensitive data (406b) corresponds to Non-PII data.
7. A system (100) for securely processing and transmitting data of legacy devices (102) to a cloud server (110), the system (100) comprising:
a gateway device (104) comprising a processor (106) and a memory (108) communicatively coupled to the processor (106), wherein the memory (108) stores processor-executable instructions, which, on execution, cause the processor (106) to:
receive data from a legacy device, wherein the data comprises at least one video and a plurality of predefined data associated with each of the at least one video;
split the at least one video into a plurality of video frames;
determine a resolution associated with each of the plurality of video frames;
identify for each of the plurality of video frames, a matching template from a plurality of prestored templates based on the identified resolution and details of the legacy device, wherein each of the plurality of prestored templates is mapped to a predefined resolution and legacy device; and
mask at least one portion of each of the plurality of video frames based on the matching template identified, wherein each of the plurality of prestored templates comprise information related to location of sensitive data (406a) and non-sensitive data (406b) within a corresponding video frame, and wherein the masked at least one portion corresponds to the sensitive data (406a).
8. The system (100) as claimed in claim 7, wherein the predefined data comprises at least one of Identifier (ID) of the legacy device, location of the legacy device, name of the legacy device, make or model of the legacy device, or current day and time.
9. The system (100) as claimed in claim 7, wherein the processor (106) is further configured to transmit each of the plurality of video frames to a cloud server (110) in response to masking, wherein transmitting comprises one of:
retain the masked at least one portion of each of the plurality of video frames within the gateway device;
transmit an unmasked at least one portion from each of the plurality of video frames from the gateway device to the cloud server (110), wherein the unmasked at least one portion corresponds to non-sensitive data (406b); or
simultaneously transmit the masked at least one portion and the unmasked at least one portion of each of the plurality of video frames from the gateway device to the cloud server (110).
10. The system (100) as claimed in claim 7, wherein the processor (106) is further configured to:
assign, for each of the plurality of video frames, a first tag to each of the masked at least one portion and a second tag to each of the unmasked at least one portion;
provide to at least one user, for each of the plurality of video frames stored in the cloud server (110), access to at least one of: the masked at least one portion and the unmasked at least one portion, wherein to provide access to a user from the at least one user, the processor (106) is further configured to:
identify a role associated with the user;
determine access rights associated with the role; and
evaluate the user access to at least one of the masked at least one portion and the unmasked at least one portion associated with a video frame from the plurality of video frames, based on the determined access rights, wherein evaluating the user access, comprises one of:
deny the user access to the masked at least one portion;
grant the user access to the unmasked at least one portion; or
grant the user access to the masked at least one portion and the unmasked at least one portion.
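Outside the formal claim language, the claimed processing chain (receive data, split the video into frames, determine each frame's resolution, match a prestored template keyed by resolution and device, and mask the sensitive portions the template locates) can be sketched as follows. This is an illustrative sketch only, not part of the claims or the specification; the device model name, template layout, and all function names are hypothetical, and a frame is modeled as a 2-D list of pixels.

```python
# Hypothetical sketch of the pipeline of claim 1. Each prestored template is
# mapped to a (resolution, legacy-device model) pair and carries the bounding
# boxes (x, y, width, height) of sensitive data within a matching frame.

TEMPLATES = {
    ("640x480", "ECG-100"): {"sensitive": [(0, 0, 200, 40)]},  # e.g., patient banner
}

def split_into_frames(video):
    """Step 304: stand-in for a real decoder; here a video is a list of frames."""
    return list(video)

def frame_resolution(frame):
    """Step 306: resolution as 'WIDTHxHEIGHT' of a 2-D frame."""
    return f"{len(frame[0])}x{len(frame)}"

def process(video, device_model):
    """Steps 302-310: split the video, match a template per frame using the
    frame resolution and device details, and mask the sensitive portions."""
    out = []
    for frame in split_into_frames(video):
        template = TEMPLATES.get((frame_resolution(frame), device_model))
        if template:  # step 308: matching template found
            frame = [row[:] for row in frame]
            for x, y, w, h in template["sensitive"]:
                for row in frame[y:y + h]:
                    row[x:x + w] = [0] * w  # step 310: mask sensitive portion
        out.append(frame)
    return out
```

A frame whose resolution and source device have no matching template passes through unmodified, which mirrors the claim's reliance on the template to locate sensitive and non-sensitive data.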
| # | Name | Date |
|---|---|---|
| 1 | 202211009221-IntimationOfGrant20-02-2025.pdf | 2025-02-20 |
| 2 | 202211009221-STATEMENT OF UNDERTAKING (FORM 3) [21-02-2022(online)].pdf | 2022-02-21 |
| 3 | 202211009221-US(14)-HearingNotice-(HearingDate-22-01-2025).pdf | 2024-12-19 |
| 4 | 202211009221-REQUEST FOR EXAMINATION (FORM-18) [21-02-2022(online)].pdf | 2022-02-21 |
| 5 | 202211009221-PatentCertificate20-02-2025.pdf | 2025-02-20 |
| 6 | 202211009221-ABSTRACT [13-01-2023(online)].pdf | 2023-01-13 |
| 7 | 202211009221-CLAIMS [13-01-2023(online)].pdf | 2023-01-13 |
| 8 | 202211009221-REQUEST FOR EARLY PUBLICATION(FORM-9) [21-02-2022(online)].pdf | 2022-02-21 |
| 9 | 202211009221-Written submissions and relevant documents [06-02-2025(online)].pdf | 2025-02-06 |
| 10 | 202211009221-COMPLETE SPECIFICATION [13-01-2023(online)].pdf | 2023-01-13 |
| 11 | 202211009221-Correspondence to notify the Controller [17-01-2025(online)].pdf | 2025-01-17 |
| 12 | 202211009221-PROOF OF RIGHT [21-02-2022(online)].pdf | 2022-02-21 |
| 13 | 202211009221-POWER OF AUTHORITY [21-02-2022(online)].pdf | 2022-02-21 |
| 14 | 202211009221-FORM-26 [17-01-2025(online)].pdf | 2025-01-17 |
| 15 | 202211009221-CORRESPONDENCE [13-01-2023(online)].pdf | 2023-01-13 |
| 16 | 202211009221-FORM-9 [21-02-2022(online)].pdf | 2022-02-21 |
| 17 | 202211009221-DRAWING [13-01-2023(online)].pdf | 2023-01-13 |
| 18 | 202211009221-FORM 18 [21-02-2022(online)].pdf | 2022-02-21 |
| 19 | 202211009221-FER_SER_REPLY [13-01-2023(online)].pdf | 2023-01-13 |
| 20 | 202211009221-FER.pdf | 2022-07-14 |
| 21 | 202211009221-FORM 1 [21-02-2022(online)].pdf | 2022-02-21 |
| 22 | 202211009221-COMPLETE SPECIFICATION [21-02-2022(online)].pdf | 2022-02-21 |
| 23 | 202211009221-FIGURE OF ABSTRACT [21-02-2022(online)].jpg | 2022-02-21 |
| 24 | 202211009221-DECLARATION OF INVENTORSHIP (FORM 5) [21-02-2022(online)].pdf | 2022-02-21 |
| 25 | 202211009221-DRAWINGS [21-02-2022(online)].pdf | 2022-02-21 |
| 26 | SearchHistoryE_13-07-2022.pdf | |