Search results for: Forensics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29

29 Digital Image Forensics: Discovering the History of Digital Images

Authors: Gurinder Singh, Kulbir Singh

Abstract:

Digital multimedia content such as images, video, and audio can be tampered with easily due to the availability of powerful editing software. Multimedia forensics is devoted to analyzing such content with various digital forensic techniques in order to validate its authenticity. Digital image forensics is dedicated to investigating the reliability of digital images by analyzing the integrity of the data and by reconstructing the historical information of an image related to its acquisition phase. In this paper, a survey of forgery detection is carried out, considering the most recent and promising digital image forensic techniques.

Keywords: Computer forensics, multimedia forensics, image ballistics, camera source identification, forgery detection.

28 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan

Abstract:

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities remains a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data are stored on the devices. This paper highlights why IoT forensics is a unique undertaking and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects in IoT forensics. The model analyses the effectiveness of the forensic investigation process against the admissibility and integrity of the evidence, taking into account user privacy and the providers’ compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).

Keywords: Cloud forensics, data protection laws, GDPR, IoT forensics, machine learning.

27 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics

Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood

Abstract:

We have developed a distributed computing capability, the Digital Forensics Compute Cluster (DFORC2), to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
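
As a rough illustration of the kind of distributed processing DFORC2 performs, the sketch below parallelizes evidence-file hashing with PySpark. It is not taken from the project; the evidence path, partition count, and chunked SHA-256 hashing are assumptions for illustration only.

```python
# Minimal PySpark sketch of parallel evidence-file hashing, in the spirit of
# DFORC2's distributed ingestion. Not the project's actual code; the evidence
# path and cluster settings are hypothetical.
import hashlib
from pathlib import Path
from pyspark.sql import SparkSession

def sha256_of(path):
    """Hash one evidence file in 1 MiB chunks to avoid loading it whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return path, h.hexdigest()

if __name__ == "__main__":
    spark = SparkSession.builder.appName("evidence-hashing").getOrCreate()
    # Assumes the mounted disk image is visible at the same path on every node.
    files = [str(p) for p in Path("/mnt/evidence").rglob("*") if p.is_file()]
    results = spark.sparkContext.parallelize(files, numSlices=64).map(sha256_of).collect()
    for path, digest in results:
        print(digest, path)
    spark.stop()
```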

Keywords: Cloud computing, cybersecurity, digital forensics, Kafka, Kubernetes, Spark.

26 Digital Forensics for Electronic Commerce on the Web

Authors: Ryuya Uda

Abstract:

In existing online shopping on the web, SSL and passwords are usually used to secure trades. SSL shields communication from third parties who are not involved in the trade and indicates that the trader's web site is authenticated by a certification authority. A password certifies that a customer is the same person who has visited the trader's web site before and protects the customer's privacy, such as what the customer has bought on the site. However, there are no forensics for the trades in the cases above. With existing methods, no one can prove what was ordered by customers, how many products were ordered, or even whether customers ordered at all. The reason is that a third party has to guess what was traded from logs that are held by traders and by customers. Such logs can easily be created, deleted and forged since they are electronically stored. To enhance security with digital forensics for electronic commerce on the web, I propose a secure method based on cellular phones.

Keywords: Cellular phone, digital forensics, electronic commerce, information security.

25 Towards a Proof Acceptance by Overcoming Challenges in Collecting Digital Evidence

Authors: Lilian Noronha Nassif

Abstract:

Cybercrime investigation demands an appropriate evidence collection mechanism. If the investigator does not acquire digital evidence in a forensically sound manner, important information can be lost, and judges can discard case evidence because the acquisition was inadequate. Correct digital forensic seizure involves the preparation of professionals from the fields of law, police, and computer science. This paper presents important challenges faced during evidence collection from the perspective of different types of location. The crime scene can be virtual or real, and technical obstacles and privacy concerns must be considered. The challenges pointed out here highlight the precautions to be taken during digital evidence collection, and the suggested procedures contribute to best practices in the digital forensics field.

Keywords: Digital evidence, digital forensic processes and procedures, mobile forensics, cloud forensics.

24 WormHex: A Volatile Memory Analysis Tool for Retrieval of Social Media Evidence

Authors: Norah Almubairik, Wadha Almattar, Amani Alqarni

Abstract:

Social media applications are increasingly being used in our everyday communications. These applications utilise end-to-end encryption mechanisms, which make them suitable tools for criminals to exchange messages. These messages are preserved in volatile memory until the device is restarted. Therefore, volatile memory forensics has become an important branch of digital forensics. In this study, the WormHex tool was developed to inspect memory dump files from Windows and Mac based workstations. The tool supports digital investigators by enabling them to extract valuable data written in Arabic and English through the web-based WhatsApp and Twitter applications. The results confirm that social media applications write their data into memory regardless of the operating system running the application, with no major differences between Windows and Mac.
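
A minimal sketch of the general approach, scanning a raw memory dump with regular expressions for chat-related strings, is shown below. The patterns and keyword are illustrative stand-ins, not WormHex's actual signatures, and the Arabic-script byte range is only a rough UTF-8 approximation.

```python
# Sketch of regex-based carving of chat artifacts from a raw memory dump,
# in the spirit of WormHex. The patterns below are illustrative only.
import mmap
import re

# Printable ASCII runs and rough UTF-8 Arabic-script runs.
ASCII_RUN = re.compile(rb"[\x20-\x7e]{8,}")
ARABIC_RUN = re.compile(rb"(?:[\xd8-\xdb][\x80-\xbf]){4,}")  # rough Arabic range

def scan_dump(dump_path, keyword=b"whatsapp"):
    """Yield (offset, data) for ASCII runs containing the keyword and for Arabic runs."""
    with open(dump_path, "rb") as f, mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        for m in ASCII_RUN.finditer(mm):
            if keyword in m.group().lower():
                yield m.start(), m.group()
        for m in ARABIC_RUN.finditer(mm):
            yield m.start(), m.group()

if __name__ == "__main__":
    for offset, blob in scan_dump("memdump.raw"):
        print(f"0x{offset:012x}: {blob[:80]!r}")
```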

Keywords: Volatile memory, REGEX, digital forensics, memory acquisition.

23 A Method to Enhance the Accuracy of Digital Forensic in the Absence of Sufficient Evidence in Saudi Arabia

Authors: Fahad Alanazi, Andrew Jones

Abstract:

Digital forensics seeks to achieve the successful investigation of digital crimes by obtaining acceptable evidence from digital devices that can be presented in a court of law. A digital forensics investigation is therefore normally performed through a number of phases in order to achieve the required level of accuracy in the investigation processes. Since 1984, a number of models and frameworks have been developed to support the digital investigation processes. In this paper, we review a number of the investigation process models that have been produced over the years and introduce a proposed digital forensic model based on the scope of the Saudi Arabian investigation process. The proposed model integrates existing models of the investigation processes and adds a new phase to deal with situations where there is initially insufficient evidence.

Keywords: Digital forensics, Process, Metadata, Traceback, Saudi Arabia.

22 An Analysis of Digital Forensic Laboratory Development among Malaysia’s Law Enforcement Agencies

Authors: Sarah K. Taylor, Miratun M. Saharuddin, Zabri A. Talib

Abstract:

Cybercrime is on the rise, and yet many Law Enforcement Agencies (LEAs) in Malaysia have no Digital Forensics Laboratory (DFL) to assist them in the acquisition and analysis of digital evidence. Of the estimated 30 LEAs in Malaysia, only eight own a DFL. All of the DFLs are concentrated in the capital of Malaysia, with none at the state level. LEAs still depend on the national DFL (CyberSecurity Malaysia) even for simple and straightforward cases. A survey was conducted among the LEAs in Malaysia that own a DFL to understand their history of establishing the DFL, the challenges that they faced, and the significance of the DFL to their case investigations. The results showed that while some LEAs faced no challenge in establishing a DFL, others took seven to 10 years to do so. The reason was the difficulty of convincing their management because of the high costs involved. The results also revealed that with the establishment of a DFL, LEAs were able to get faster forensic results and to meet their agencies' timeline expectations. LEAs were also able to get more meaningful forensic results on cases requiring niche expertise, compared to sending cases off to the national DFL. In addition, cases are getting more complex, and hence a continuous stream of budget for equipment and training is inevitable. The results derived from the study are intended to help other LEAs justify to their management the benefits of establishing an in-house DFL.

Keywords: Digital forensics, digital forensics laboratory, digital evidence, law enforcement agency.

21 Three Tier Indoor Localization System for Digital Forensics

Authors: Dennis L. Owuor, Okuthe P. Kogeda, Johnson I. Agbinya

Abstract:

Mobile localization has attracted a great deal of attention recently due to the introduction of wireless networks. Although several localization algorithms and systems have been implemented and discussed in the literature, very few researchers have exploited the gap that exists between indoor localization, tracking, external storage of location information, and outdoor localization for the purpose of digital forensics during and after a disaster. The contribution of this paper lies in the implementation of a robust system that is capable of locating and tracking mobile device users and storing the location information in the cloud for both indoor and partially outdoor environments. The system can be used during a disaster to track and locate mobile phone users. The developed system is a mobile application built with Android, Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript and MATLAB for Android mobile users. Using the Waterfall model of software development, we have implemented a three-level system that is able to track, locate and store mobile device information in a secure cloud database on an almost real-time basis. The outcome of the study showed that the developed system is efficient with regard to tracking and locating mobile devices. The system is also flexible, i.e., it can be used in any building with few adjustments. Finally, the system is accurate in locating and tracking mobile devices both indoors and outdoors.

Keywords: Indoor localization, waterfall, digital forensics, tracking, cloud.

20 Data Acquisition from Cell Phone using Logical Approach

Authors: Keonwoo Kim, Dowon Hong, Kyoil Chung, Jae-Cheol Ryou

Abstract:

Cell phone forensics, which acquires and analyzes the data in a cellular phone, is nowadays used by national investigation organizations and private companies. There are two methods for collecting cellular phone flash memory data. The first is a logical method, which acquires files and directories from the file system of the cell phone flash memory. The second obtains all data from a bit-by-bit copy of the entire physical memory using a low-level access method. In this paper, we describe a forensic tool that acquires cell phone flash memory data using a logical-level approach. With our tool, we can acquire the EFS file system and peek at memory data in an arbitrary region of a Korean CDMA cell phone.
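
For illustration of the logical method in general, the sketch below walks a mounted file system, copies each file and records metadata and hashes in a manifest. It is not the authors' tool and does not implement their low-level EFS access; the mount point and output paths are assumptions.

```python
# Generic sketch of a logical acquisition pass: walk a mounted file system,
# copy each file, and record its metadata and hash. Illustrative only.
import csv
import hashlib
import os
import shutil

def logical_acquire(mount_point, out_dir, manifest="manifest.csv"):
    os.makedirs(out_dir, exist_ok=True)
    with open(manifest, "w", newline="") as mf:
        writer = csv.writer(mf)
        writer.writerow(["path", "size", "mtime", "sha256"])
        for root, _dirs, files in os.walk(mount_point):
            for name in files:
                src = os.path.join(root, name)
                rel = os.path.relpath(src, mount_point)
                dst = os.path.join(out_dir, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)                       # preserve timestamps
                with open(src, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
                st = os.stat(src)
                writer.writerow([rel, st.st_size, st.st_mtime, digest])

if __name__ == "__main__":
    logical_acquire("/mnt/phone", "./acquired")
```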

Keywords: Forensics, logical method, acquisition, cell phone, flash memory.

19 The Use of Ontology Framework for Automation Digital Forensics Investigation

Authors: Ahmad Luthfi

Abstract:

One of the main goals of a computer forensic analyst is to determine the cause and effect of the acquisition of digital evidence in order to obtain information relevant to the case being handled. In order to get fast and accurate results, this paper discusses the approach known as the Ontology Framework. This model uses a structured hierarchy of layers that creates connectivity between the variants and the search activities of an investigation, so that computer forensic analysis activities can be carried out automatically. Two main layers are used, namely Analysis Tools and Operating System. By using the concept of an ontology, the second layer is designed to help the investigator perform the acquisition of digital evidence automatically. The automation methodology of this research utilizes forward chaining, where the system performs a search over investigative steps structured in accordance with the rules of the ontology.
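
To illustrate the forward-chaining style of inference this framework relies on, here is a toy rule loop: facts trigger rules, and the conclusions they add can trigger further rules until nothing new is derived. The facts and rules are made-up examples, not the paper's ontology.

```python
# Toy forward-chaining engine illustrating the inference style described above.
# The facts and rules are illustrative examples only.
RULES = [
    ({"evidence_type:disk_image"}, "run:file_system_parser"),
    ({"run:file_system_parser"}, "run:keyword_search"),
    ({"evidence_type:memory_dump"}, "run:volatile_analysis"),
    ({"run:keyword_search", "case:fraud"}, "run:email_carving"),
]

def forward_chain(facts):
    """Repeatedly fire rules whose conditions hold until no new facts appear."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

if __name__ == "__main__":
    print(forward_chain({"evidence_type:disk_image", "case:fraud"}))
```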

Keywords: Ontology, Framework, Automation, Forensics.

18 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crime committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in the digital environment, has become an important research topic. It is within the research scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. There are many software and hardware tools developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data in the digital evidence that match specified criteria and presenting them to the investigator (e.g., text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. In addition, because the outcome depends on the examiner's experience, relevant items may be overlooked and the overall result may differ between cases. In this study, a hash-based matching and digital evidence evaluation method is proposed, with the aim of automatically classifying the evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human error.
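
A minimal sketch of the core idea, hashing fixed-size blocks of an evidence image and flagging those found in a known hash list, is given below. The 4 KiB block size, the hash algorithm, and the hash-list file are assumptions for illustration, not the paper's exact parameters.

```python
# Sketch of hash-based block matching: hash fixed-size blocks of an evidence
# image and flag those whose digest appears in a known hash list.
import hashlib

BLOCK_SIZE = 4096  # assumed block size

def load_hash_list(path):
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def match_blocks(image_path, hash_list):
    """Yield (offset, digest) for every block whose hash is in the list."""
    with open(image_path, "rb") as img:
        offset = 0
        while True:
            block = img.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            if digest in hash_list:
                yield offset, digest
            offset += len(block)

if __name__ == "__main__":
    known = load_hash_list("known_bad_blocks.txt")
    for off, d in match_blocks("evidence.dd", known):
        print(f"hit at offset {off}: {d}")
```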

Keywords: Block matching, digital evidence, hash list.

17 Hybrid Approach for Memory Analysis in Windows System

Authors: Khairul Akram Zainol Ariffin, Ahmad Kamil Mahmood, Jafreezal Jaafar, Solahuddin Shamsuddin

Abstract:

Random Access Memory (RAM) is an important device in a computer system. It can provide a snapshot of how the computer has been used. With its growing importance, computer memory has become a widely discussed issue in digital forensics. A number of tools have been developed to retrieve information from memory. However, most of the tools are limited in their ability to retrieve the important information from computer memory. Hence, this paper discusses the limitations and setbacks of two main techniques: process signature search and process enumeration. A new hybrid approach is then presented to minimize the setbacks of both individual techniques. This new approach combines both techniques with the purpose of retrieving information from the process block and other objects in computer memory. The basic theory of address translation for x86 platforms is also demonstrated in this paper.
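
The sketch below shows the general shape of a process signature search: slide through a raw dump looking for a byte pattern that marks a candidate process block, then read a field at a fixed offset. The signature bytes and field offsets are placeholders; real process-structure layouts vary by Windows build and are not taken from this paper.

```python
# Simplified process-signature search over a raw memory dump. The signature
# and field offsets are placeholders for illustration only; real layouts vary.
SIGNATURE = b"\x03\x00\x1b\x00"   # hypothetical marker bytes
NAME_OFFSET = 0x16c               # hypothetical offset of the process name field
NAME_LENGTH = 16

def signature_search(dump_path):
    # For brevity the dump is read into memory; large dumps would be scanned in chunks.
    with open(dump_path, "rb") as f:
        data = f.read()
    pos = data.find(SIGNATURE)
    while pos != -1:
        raw_name = data[pos + NAME_OFFSET: pos + NAME_OFFSET + NAME_LENGTH]
        name = raw_name.split(b"\x00", 1)[0].decode("ascii", errors="replace")
        yield pos, name
        pos = data.find(SIGNATURE, pos + 1)

if __name__ == "__main__":
    for offset, proc in signature_search("memory.raw"):
        print(f"candidate process block at 0x{offset:x}: {proc}")
```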

Keywords: Algorithms, Digital Forensics, Memory Analysis, Signature Search.

16 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technology, and specific procedures used in the digital investigation are not keeping up with criminal developments. Therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. This research paper aims at developing a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent are dependent on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5% of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.

Keywords: Artificial intelligence, computer science, criminal investigation, digital forensics.

15 Facilitating a Cyber-Enabled Fraud Using the O.MG Cable to Incriminate the Victim

Authors: Damola O. Lawal, David W. Gresty, Diane E. Gan, Louise Hewitt

Abstract:

This paper investigates the feasibility of using a programmable USB device such as the O.MG Cable to perform a file tampering attack. Here, the O.MG Cable, an apparently harmless mobile device charger, is used in an unauthorised way to alter the content of a file (an accounts record, January_Contributions.xlsx). The aim is to determine whether a forensic analyst can reliably determine who altered the target file: the O.MG Cable or the user of the machine. This work highlights some of the traces of the O.MG Cable left behind on the target computer itself, such as the Product ID (PID) and Vendor ID (VID). Also discussed is the O.MG Cable’s behaviour during the experiments. We determine whether a forensic analyst could identify any evidence left behind by the programmable device on the target file once it has been removed from the computer, to establish whether the analyst would be able to link the traces left by the O.MG Cable to the file tampering. It was discovered that the forensic analyst might mistake the actions of the O.MG Cable for those of the computer user. Experiments carried out in this work could further the discussion as to whether an innocent user could be punished for unauthorised changes made by a programmable device.

Keywords: O.MG Cable, programmable USB, file tampering attack, digital evidence credibility, miscarriage of justice, cyber fraud.

14 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility

Authors: Alejandro Villegas, Cihan Varol

Abstract:

Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without the user names and passwords, this forensically important information can still be gathered by investigators. Although there are a few VoIP forensic investigative applications available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement with collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, as well as call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs and the user roaming profiles in Windows and Mac OSX operating systems. Subsequently, it parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, due to its intelligent data filtering capabilities and cross-platform scripting back end, it can be expanded to include other VoIP solutions as well. Overall, this paper presents the exploratory analysis performed in order to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.

Keywords: Betamax, digital forensics, report utility, VoIP, VoIP Buster, VoIPWise.

13 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute the Likelihood Ratio (LR) for quantifying the strength of evidence under two competing hypotheses, namely the prosecution and the defence hypotheses, wherein a set of assumptions and methods for a given data set will be made. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as not to give an incorrect estimate that could be used to deliver a wrong judgment in a court of law. The estimate of the LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE), in this paper. The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
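
For readers unfamiliar with the two estimators, the sketch below computes an LR from comparison scores in both ways: a KDE ratio of the same-writer and different-writer score densities, and a logistic-regression posterior-odds calibration. The scores are synthetic stand-ins, and scipy/scikit-learn are assumed; this is not the paper's experimental code.

```python
# Sketch of two likelihood-ratio estimators for comparison scores.
# The scores are synthetic stand-ins for handwriting comparison scores.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
same_writer = rng.normal(0.7, 0.10, 500)   # scores under the prosecution hypothesis
diff_writer = rng.normal(0.4, 0.12, 500)   # scores under the defence hypothesis

# KDE estimator: LR = f(score | same writer) / f(score | different writer)
kde_same, kde_diff = gaussian_kde(same_writer), gaussian_kde(diff_writer)
def lr_kde(score):
    return float(kde_same(score) / kde_diff(score))

# Logistic-regression estimator: with balanced training data the posterior
# odds p / (1 - p) equal the likelihood ratio.
X = np.concatenate([same_writer, diff_writer]).reshape(-1, 1)
y = np.concatenate([np.ones_like(same_writer), np.zeros_like(diff_writer)])
lor = LogisticRegression().fit(X, y)
def lr_logistic(score):
    p = lor.predict_proba([[score]])[0, 1]
    return p / (1 - p)

if __name__ == "__main__":
    for s in (0.45, 0.55, 0.65):
        print(f"score={s}: LR_KDE={lr_kde(s):.2f}, LR_LoR={lr_logistic(s):.2f}")
```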

Keywords: Logistic Regression (LoR), Kernel Density Estimator (KDE), handwriting, confidence interval, repeatability, reproducibility.

12 Inefficiency of Data Storing in Physical Memory

Authors: Kamaruddin Malik Mohamad, Sapiee Haji Jamel, Mustafa Mat Deris

Abstract:

Memory forensics is important in digital investigation. The forensics is based on the data stored in physical memory, which involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management and processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment was done using the Borland C compiler on Windows XP with 512 MB of physical memory.

Keywords: Digital Evidence, Memory Forensics.

11 Video Data Mining based on Information Fusion for Tamper Detection

Authors: Girija Chetty, Renuka Biswas

Abstract:

In this paper, we propose novel algorithmic models based on information fusion and feature transformation in a cross-modal subspace for different types of residue features extracted from several intra-frame and inter-frame pixel sub-blocks in video sequences, for detecting digital video tampering or forgery. An evaluation of the proposed residue features, the noise residue features and the quantization features, their transformation in the cross-modal subspace, and their multimodal fusion, for an emulated copy-move tamper scenario, shows a significant improvement in tamper detection accuracy as compared to single-mode features without transformation in the cross-modal subspace.

Keywords: Image tamper detection, digital forensics, correlation features, image fusion.

10 Identity Verification Using k-NN Classifiers and Autistic Genetic Data

Authors: Fuad M. Alkoot

Abstract:

DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality. The goal is to improve the speed of identification. We aim at using gene data that were initially used for autism detection to find whether, and how accurately, these data can be used for identification applications. Our main goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1. The classification rate remains close to optimal at higher noise standard deviations, up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
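
The experimental setup can be sketched as follows: fit a 1-NN classifier on one reference vector per subject, then test on copies corrupted by Gaussian noise of increasing standard deviation. The random feature vectors below are stand-ins for the paper's preprocessed gene data, and scikit-learn is assumed.

```python
# Sketch of the k-NN identity-verification experiment: clean training vectors,
# noisy test copies. The data are random stand-ins for preprocessed gene data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
n_subjects, n_features = 50, 100

# One reference vector per subject (hypothetical preprocessed gene features).
X_train = rng.normal(size=(n_subjects, n_features))
y_train = np.arange(n_subjects)

clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

for sigma in (0.5, 1.0, 2.0, 3.0, 5.0):
    X_test = X_train + rng.normal(0.0, sigma, size=X_train.shape)
    acc = clf.score(X_test, y_train)
    print(f"noise std={sigma}: identification rate={acc:.2f}")
```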

Keywords: Biometrics, identity verification, genetic data, k-nearest neighbor.

9 Classification of Computer Generated Images from Photographic Images Using Convolutional Neural Networks

Authors: Chaitanya Chawla, Divya Panwar, Gurneesh Singh Anand, M. P. S Bhatia

Abstract:

This paper presents a deep-learning mechanism for classifying computer-generated images and photographic images. The proposed method incorporates a convolutional layer capable of automatically learning the correlation between neighbouring pixels. In its standard form, a Convolutional Neural Network (CNN) learns features based on an image's content rather than the structural features of the image. The proposed layer is specifically designed to suppress an image's content and robustly learn the sensor pattern noise features (usually inherited from image processing in a camera) as well as the statistical properties of images. The method was assessed on recent natural and computer-generated images, and it was concluded that it performs better than current state-of-the-art methods.

Keywords: Image forensics, computer graphics, classification, deep learning, convolutional neural networks.

8 Smartphone Video Source Identification Based on Sensor Pattern Noise

Authors: Raquel Ramos López, Anissa El-Khattabi, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

An increasing number of mobile devices with integrated cameras has meant that most digital video comes from these devices. These digital videos can be made anytime, anywhere and for different purposes. They can also be shared on the Internet in a short period of time and may sometimes contain recordings of illegal acts. The need to reliably trace the origin becomes evident when these videos are used for forensic purposes. This work proposes an algorithm to identify the brand and model of the mobile device that generated the video. Its procedure is as follows: after obtaining the relevant video information, a classification algorithm based on sensor noise and the Wavelet Transform performs the aforementioned identification process. We also present experimental results that support the validity of the techniques used and show promising performance.

Keywords: Digital video, forensics analysis, key frame, mobile device, PRNU, sensor noise, source identification.

7 A Real-Time Image Change Detection System

Authors: Madina Hamiane, Amina Khunji

Abstract:

Detecting changes in multiple images of the same scene has recently seen increased interest due to many contemporary applications, including smart security systems, smart homes, remote sensing, surveillance, medical diagnosis, weather forecasting, speed and distance measurement, post-disaster forensics and much more. These applications differ in the scale, nature, and speed of change. This paper presents an application of image processing techniques to implement a real-time change detection system. Change is identified by comparing the RGB representations of two consecutive frames captured in real time. The detection threshold can be controlled to account for various luminance levels. The comparison result is passed through a filter before decision making to reduce false positives, especially in lower luminance conditions. The system is implemented with a MATLAB Graphical User Interface with several controls to manage its operation and performance.
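
A minimal sketch of the same frame-differencing pipeline is shown below, using OpenCV in place of the paper's MATLAB GUI: difference consecutive frames, threshold, filter the mask, and decide. The threshold and pixel-count values are illustrative assumptions.

```python
# Sketch of frame-differencing change detection: per-pixel RGB difference,
# threshold, median filter, then a simple decision rule. Values are illustrative.
import cv2

THRESHOLD = 30            # per-pixel intensity difference; tune for luminance level
MIN_CHANGED_PIXELS = 500  # minimum changed pixels before declaring a change

cap = cv2.VideoCapture(0)            # default camera
ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    diff = cv2.absdiff(frame, prev)                      # per-channel difference
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, THRESHOLD, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 5)                       # suppress isolated false positives
    if cv2.countNonZero(mask) > MIN_CHANGED_PIXELS:
        print("change detected")
    prev = frame
cap.release()
```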

Keywords: Image change detection, Image processing, image filtering, thresholding, B/W quantization.

6 High Speed Bitwise Search for Digital Forensic System

Authors: Hyungkeun Jee, Jooyoung Lee, Dowon Hong

Abstract:

The most common forensic activity is searching a hard disk for strings of data. Nowadays, investigators and analysts are increasingly encountering large, even terabyte-sized data sets when conducting digital investigations. Therefore, sequential searching can take weeks to complete. There are two primary search methods: index-based search and bitwise search. Index-based searching is very fast after the initial indexing, but the initial indexing takes a long time. In this paper, we discuss a high speed bitwise search model for large-scale digital forensic investigations. We used a pattern matching board, which is generally used for network security, to search for strings and complex regular expressions. Our results indicate that in many cases, the use of a pattern matching board can substantially increase the performance of digital forensic search tools.
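
As a software analogue of the bitwise search (the paper accelerates this step with a hardware pattern-matching board), the sketch below scans a raw disk image in overlapping chunks and applies a compiled regular expression directly to the bytes, independent of any index. Chunk and overlap sizes are assumptions.

```python
# Software sketch of bitwise (index-free) searching over a raw disk image.
# The hardware acceleration described in the paper is not modelled here.
import re

CHUNK = 64 * 1024 * 1024   # 64 MiB read size
OVERLAP = 1024             # tail kept so matches spanning chunk borders are found

def bitwise_search(image_path, pattern):
    regex = re.compile(pattern)
    with open(image_path, "rb") as img:
        offset = 0            # bytes consumed so far
        tail = b""
        while True:
            chunk = img.read(CHUNK)
            if not chunk:
                break
            buf = tail + chunk
            for m in regex.finditer(buf):
                if tail and m.end() <= len(tail):
                    continue  # already reported in the previous buffer
                yield offset - len(tail) + m.start(), m.group()
            tail = buf[-OVERLAP:]
            offset += len(chunk)

if __name__ == "__main__":
    # Example: e-mail addresses anywhere on the raw image.
    for off, hit in bitwise_search("disk.dd", rb"[\w.+-]+@[\w.-]+\.[a-zA-Z]{2,}"):
        print(f"0x{off:x}: {hit.decode(errors='replace')}")
```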

Keywords: Digital forensics, search, regular expression.

5 Efficient Copy-Move Forgery Detection for Digital Images

Authors: Somayeh Sadeghi, Hamid A. Jalab, Sajjad Dadkhah

Abstract:

Due to the availability of powerful image processing software and improvements in users' computer skills, it has become easy to tamper with images. Manipulation of digital images in fields such as the court of law and medical imaging creates a serious problem nowadays. Copy-move forgery is one of the most common types of forgery; it copies part of an image and pastes it into another part of the same image to cover an important scene. In this paper, a copy-move forgery detection method based on the Fourier transform is proposed. First, the image is divided into blocks of the same size, and the Fourier transform is applied to each block. Similarity of the Fourier transforms of different blocks provides an indication of the copy-move operation. The experimental results show that the proposed method runs in reasonable time and works well for grayscale and colour images. The computational complexity is reduced by using the Fourier transform in this method.
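
A minimal sketch of the block-spectrum comparison is given below: split a grayscale image into fixed blocks, take the normalized magnitude spectrum of each, and report pairs of blocks with nearly identical spectra. The block size, distance threshold, and naive pairwise comparison are illustrative assumptions, not the paper's exact parameters.

```python
# Sketch of block-based copy-move detection using the Fourier transform.
# Block size and threshold are illustrative; the pairwise loop is naive O(n^2).
import numpy as np
from PIL import Image

BLOCK = 16
THRESH = 0.1   # maximum spectral distance for two blocks to count as a match

def block_spectra(gray):
    h, w = gray.shape
    for y in range(0, h - BLOCK + 1, BLOCK):
        for x in range(0, w - BLOCK + 1, BLOCK):
            block = gray[y:y + BLOCK, x:x + BLOCK]
            spec = np.abs(np.fft.fft2(block)).ravel()
            yield (y, x), spec / (np.linalg.norm(spec) + 1e-9)

def detect_copy_move(path):
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    items = list(block_spectra(gray))
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            (p1, s1), (p2, s2) = items[i], items[j]
            if np.linalg.norm(s1 - s2) < THRESH:
                yield p1, p2

if __name__ == "__main__":
    for a, b in detect_copy_move("suspect.png"):
        print(f"possible duplicated blocks at {a} and {b}")
```

Flat, featureless regions (e.g., sky) will also match each other under this crude distance, so a real detector would add a content check or sort the feature vectors rather than compare all pairs.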

Keywords: Copy-Move forgery, Digital Forensics, Image Forgery.

4 Fingerprint Compression Using Multiwavelets

Authors: Sudhakar.R, Jayaraman.S

Abstract:

Large volumes of fingerprints are collected and stored every day in a wide range of applications, including forensics, access control, etc. This is evident from the database of the Federal Bureau of Investigation (FBI), which contains more than 70 million fingerprints. Compression of this database is very important because of this high volume. The performance of existing image coding standards generally degrades at low bit-rates because of the underlying block-based Discrete Cosine Transform (DCT) scheme. Over the past decade, the success of wavelets in solving many different problems has contributed to their unprecedented popularity. Due to implementation constraints, scalar wavelets do not possess all the properties needed for better performance in compression. A new class of wavelets called 'multiwavelets', which possess more than one scaling filter, overcomes this problem. The objective of this paper is to develop an efficient compression scheme and to obtain better quality and a higher compression ratio through the multiwavelet transform and embedded coding of multiwavelet coefficients with the Set Partitioning In Hierarchical Trees (SPIHT) algorithm. A comparison of the best known multiwavelets is made with the best known scalar wavelets. Both quantitative and qualitative measures of performance are examined for fingerprints.

Keywords: Multiwavelet, modified SPIHT algorithm, SPIHT, wavelet.

3 Searching for Forensic Evidence in a Compromised Virtual Web Server against SQL Injection Attacks and PHP Web Shell

Authors: Gigih Supriyatno

Abstract:

SQL injection is one of the most common types of attack and has a very critical impact on web servers. In the worst case, an attacker can perform post-exploitation after a successful SQL injection attack. In web server forensics, analysis is closely related to log file analysis. However, large file sizes and different log types sometimes make it difficult for investigators to look for traces of attackers on the server. The purpose of this paper is to help investigators take appropriate steps when a web server is attacked. We use attack scenarios involving SQL injection attacks with PHP backdoor injection as post-exploitation. We perform post-mortem analysis of web server logs based on the Hypertext Transfer Protocol (HTTP) POST and HTTP GET method approaches that are characteristic of SQL injection attacks. In addition, we propose a structured analysis method covering the web server application log file, the database application log, and other additional logs that exist on the web server. This method gives the investigator a more structured way to analyze the log files so as to produce evidence of the attack in acceptable time. Other attack techniques may also be detectable with this method. On the other hand, it can help web administrators prepare their systems for forensic readiness.
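
The log-scanning step can be sketched as follows: scan an access log for request URIs containing patterns characteristic of SQL injection or web-shell use. The patterns and the assumed Apache-style log format are illustrative only; POST bodies are normally not in access logs and would need application or module logs, as the structured method above implies.

```python
# Sketch of post-mortem access-log analysis for SQL injection traces.
# Patterns and log format are illustrative assumptions.
import re

SQLI_PATTERNS = [
    r"union(\s|%20|\+)+select",
    r"or(\s|%20|\+)+1=1",
    r"information_schema",
    r"sleep\(\d+\)",
    r"\.php\?cmd=",          # common web-shell command parameter
]
SIGNATURE = re.compile("|".join(SQLI_PATTERNS), re.IGNORECASE)
REQUEST = re.compile(r'"(GET|POST) (\S+) HTTP/[\d.]+"')

def scan_access_log(path):
    """Yield (line number, method, URI) for suspicious request lines."""
    with open(path, errors="replace") as log:
        for lineno, line in enumerate(log, 1):
            m = REQUEST.search(line)
            if m and SIGNATURE.search(m.group(2)):
                yield lineno, m.group(1), m.group(2)

if __name__ == "__main__":
    for lineno, method, uri in scan_access_log("access.log"):
        print(f"line {lineno}: suspicious {method} {uri}")
```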

Keywords: Web forensic, SQL injection, web shell, investigation.

2 Detecting Email Forgery using Random Forests and Naïve Bayes Classifiers

Authors: Emad E. Abdallah, A. F. Otoom, Arwa Saqer, Ola Abu-Aisheh, Diana Omari, Ghadeer Salem

Abstract:

As email communications have no consistent authentication procedure to ensure authenticity, we present an investigative analysis approach for detecting forged emails based on Random Forests and Naïve Bayes classifiers. Instead of investigating the email headers, we use the body content to extract a unique writing style for each of the possible suspects. Our approach consists of four main steps: (1) the cybercrime investigator extracts different effective features, including structural, lexical, linguistic, and syntactic evidence, from previous emails of all the possible suspects; (2) the extracted feature vectors are normalized to increase the accuracy rate; (3) the normalized features are then used to train the learning engine; (4) upon receiving the anonymous email (M), we apply the feature extraction process to produce a feature vector. Finally, using the machine learning classifiers, the email is assigned to the suspect whose writing style most closely matches M. Experimental results on real data sets show the improved performance of the proposed method and its ability to identify the authors with a very limited number of features.
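
The classification step (steps 3 and 4 above) can be sketched as follows with scikit-learn. Character n-gram TF-IDF features stand in for the paper's structural, lexical, linguistic and syntactic features, and the emails are toy examples; this is not the authors' feature set or data.

```python
# Sketch of authorship attribution with Random Forest and Naive Bayes.
# Character n-gram TF-IDF is a stand-in for the paper's stylometric features.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

emails = [
    ("alice", "Hi team, please find the quarterly figures attached. Regards, A."),
    ("alice", "Thanks all - figures look fine, will circulate the summary tomorrow."),
    ("bob",   "yo, can u resend that file?? my inbox ate it lol"),
    ("bob",   "cheers mate, got it now. talk later"),
]
texts = [t for _, t in emails]
authors = [a for a, _ in emails]

# Character n-grams crudely capture punctuation and spelling habits.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
X = vectorizer.fit_transform(texts)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, authors)
nb = MultinomialNB().fit(X, authors)

anonymous = "hey, u around? need that report resent asap"
M = vectorizer.transform([anonymous])
print("Random Forest attribution:", rf.predict(M)[0])
print("Naive Bayes attribution:  ", nb.predict(M)[0])
```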

Keywords: Digital investigation, cybercrimes, email forensics, anonymous emails, writing style, authorship analysis.

1 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability reached by DNA tests, since the 1980s this kind of test has allowed the resolution of a growing number of criminal cases, including old unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies capable of organizing the workflow and minimizing the time spent on both biological sample processing and genetic profile analysis, using software tools. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows sample, criminal case and local database management, minimizing the time spent in the workflow and helping to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, workflows and requirements incorporated into the system have been considered. The system uses the following software languages: HTML, CSS, and JavaScript as Web technologies, with the NodeJS platform as the server, which has great efficiency in the input and output of data. In addition, the data are stored in a relational database (MySQL), which is free, allowing better acceptance by users. The software system developed here brings more agility to the workflow and the analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to increasing the resolution of crimes. The next step of this research is its validation, in order to operate in accordance with current Brazilian national legislation.

Keywords: Database, forensic genetics, genetic analysis, sample management, software solution.
