Search results for: forensic autopsy data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25103

24893 Design and Development of an Application for the Evaluation of Personal Injury and Disability in Occupational and Forensic Medicine

Authors: Daniel Suárez, Jesús Tomas, Sandra Sendra, Sandra Viciano-Tudela, Luis Felipe Calle, Javier Urios, Jaime Lloret

Abstract:

Our study aims to develop a mobile phone tool for the assessment of bodily damage and the determination of the degree of disability. This is a field of action of legal medicine and insurance with obvious economic implications. People who have suffered an accident or bodily harm demand a quantification of it. The assessment of bodily harm or disability by the expert medical professional is not exempt from complexity. Sometimes it is difficult to quantify pain; other times, the doctor faces simulators or exaggerators, and on many occasions it is difficult to remember the extensive rating tables, whose details are complex to recall and apply. We present a tool, in the form of a mobile application, that allows entering the sociodemographic data of the patient as well as the characteristics of the accident suffered by the person. With these preliminary data, and after introducing the bodily damage, an approximate calculation of the compensation that the injured party should receive can be made. One of the results of this study is that the application allows calculating joint mobility angles without the need to use a goniometer.
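
The abstract does not describe the angle-calculation method in detail; as a minimal sketch of how a joint mobility angle could be derived from two-dimensional landmark coordinates (for example, points picked on a photograph taken with the phone), the following Python function computes the angle at a joint from three points. The landmark names and pixel values are illustrative assumptions, not data from the study.

```python
import math

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` formed by the segments joint->proximal and joint->distal."""
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Illustrative landmarks (pixels): shoulder, elbow, wrist picked on a photograph.
print(round(joint_angle((120, 80), (200, 150), (310, 140)), 1))  # elbow flexion angle
```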

Keywords: mobile tool, body damage, personal injury and disability, telemedicine

Procedia PDF Downloads 85
24892 Evaluation of the Relations between Childhood Trauma and Dissociative Experiences, Self-Perception, and Early Maladaptive Schemes in Sexual Assault Convicts

Authors: Safak Akdemir

Abstract:

The main purpose of this research is to evaluate the relationships between childhood traumas and dissociative experiences, self-perceptions and early maladaptive schemas in male convicts imprisoned for sexual assault crimes. In our study, male convicts in prison for the crime of sexual assault constitute the experimental group, and participants matched with this group in terms of education, age and gender constitute the control group. The experimental group consists of 189 male convicts held in the Ministry of Justice, General Directorate of Prisons, Istanbul/Maltepe L Type Closed Prison. The control group consists of 147 adult males matched with the experimental group in terms of age, gender and education. A total of 336 adult males are included in the sample of this study. 46% of the experimental group were convicted only of sexual assault; 54% were convicted of sexual assault together with murder, injury or drug crimes. Five data collection tools were administered: a Personal Information Form created by S. A. & E. O., the Childhood Trauma Questionnaire (CTQ), the Dissociative Experiences Scale (DES), the Rosenberg Self-Esteem Scale (RSES), and the Young Schema Questionnaire-Short Form (YSQ-SF3). A DES score of 30 and above, indicating the presence of pathological dissociative experiences, was found in 99 (52.39%) of the 189 convicts in the experimental group and in 12 (8.17%) of the 147 people in the control group. Of the sexual assault convicts in the experimental group, 180 (95.23%) had at least one childhood trauma: 154 (81.48%) reported emotional neglect, 140 (74.07%) emotional abuse, 121 (64.02%) physical neglect, 91 (48.14%) physical abuse and 70 (37.03%) sexual abuse. 168 (88.88%) of the experimental group reported multiple types of trauma and 12 (6.34%) reported a single type. Convicts with a DES score of 30 and above had higher childhood trauma, isolation, abandonment and emotional deprivation schema levels than convicts with a DES score below 30, and their self-esteem was lower than that of this group. The experimental group scored higher than the control group on childhood traumas, dissociative experiences and early maladaptive schemas, and lower on self-esteem. Dissociative experiences and the abandonment and emotional deprivation early maladaptive schemas were more common in convicts aged 18-30 than in convicts aged 31 and over. In addition, male convicts who reported physical and sexual abuse showed higher dissociative experiences and early maladaptive schemas, and lower self-esteem, than those who did not report such abuse. In conclusion, from the perspective of psychotraumatology and clinical forensic psychology, it is of fundamental importance that dissociative disorders developed under the influence of chronic childhood traumas be evaluated, through clinical interviews and psychometric measurements in forensic psychiatry, with respect to the neurosis-psychosis distinction, disability retirement, custody, malpractice, and criminal and legal capacity criteria.

Keywords: crime, sexual assault, criminology, rape crimes, dissociative disorders, maladaptive schemas

Procedia PDF Downloads 67
24891 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes some recent advances in the quickly developing area of data storage and processing based on Data Warehouse and Data Mining techniques, covering the software, hardware, data mining algorithms and visualisation techniques that share common features across the specific problems and tasks of their implementation.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 398
24890 How to Use Big Data in Logistics Issues

Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy

Abstract:

Big Data stands for today’s cutting-edge technology. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Among the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what big data is and how it is used in both military and commercial logistics.

Keywords: big data, logistics, operational efficiency, risk management

Procedia PDF Downloads 638
24889 Modelling for Roof Failure Analysis in an Underground Cave

Authors: M. Belén Prendes-Gero, Celestino González-Nicieza, M. Inmaculada Alvarez-Fernández

Abstract:

Roof collapse is one of the most frequent problems in mines of all countries, even now. There are many reasons that may cause the roof to collapse, namely the stress generated by mining activities, lack of vigilance and carelessness, or the complexity of the geological structure and irregular operations. This work is the result of the analysis of an accident produced in the “Mary” coal exploitation located in northern Spain. In this accident, the roof of a crossroad of galleries excavated to exploit the “Morena” layer, 700 m deep, collapsed. The paper collects the work done by the forensic team to determine the causes of the incident, its conclusions and recommendations. Initially, the available documentation (geology, geotechnics, mining, etc.) and the accident area were reviewed. After that, laboratory and on-site tests were carried out to characterize the behaviour of the rock materials and the support used (metal frames and shotcrete). With this information, different failure hypotheses were simulated to find the one that best fits reality. For this work, the three-dimensional finite difference software FLAC 3D was employed. The results of the study confirmed that the detachment originated as a consequence of a slide in the layer wall, due to the large roof span present at the place of the accident, and was probably triggered by an insufficient protection pillar. The results allowed some corrective measures to be established to avoid future risks, for example, the dimensions of the protection zones that must remain unexploited and their interaction with the crossing areas between galleries, or the use of supports more adequate for these conditions, in which the significant deformations may discourage the use of rigid supports such as shotcrete. Finally, a seismic monitoring grid was proposed as a predictive system. Its efficiency was tested throughout the investigation period using three monitoring units that detected new, smaller incidents in other similar areas of the mine. These new incidents show that the use of explosives produces vibrations, which constitute a new risk factor to analyse in the near future.

Keywords: forensic analysis, hypothesis modelling, roof failure, seismic monitoring

Procedia PDF Downloads 111
24888 Effects of Post-sampling Conditions on Ethanol and Ethyl Glucuronide Formation in the Urine of Diabetes Patients

Authors: Hussam Ashwi, Magbool Oraiby, Ali Muyidi, Hamad Al-Oufi, Mohammed Al-Oufi, Adel Al-Juhani, Salman Al-Zemaa, Saeed Al-Shahrani, Amal Abuallah, Wedad Sherwani, Mohammed Alattas, Ibraheem Attafi

Abstract:

Ethanol must be accurately identified and quantified to establish its use and contribution in criminal cases and forensic medicine. In some situations, it may be necessary to reanalyze an old specimen; therefore, it is essential to understand the effect of storage conditions and how long the result of a reanalyzed specimen remains reliable and reproducible. Additionally, ethanol can be produced via multiple in vivo and in vitro processes, particularly in diabetic patients, and the results can be affected by storage conditions and time. In order to distinguish between in vivo and in vitro alcohol generation in the urine samples of diabetic patients, various factors should be considered. This study identifies and quantifies ethanol and ethyl glucuronide (EtG) in diabetic patients' urine samples stored under two different conditions over time. Ethanol levels were determined using headspace gas chromatography (GC-HS), and EtG levels were determined using an immunoassay (RANDOX) technique. Ten urine specimens were collected, and each specimen was divided into two containers. The specimens were split into two groups: those kept at room temperature (15-25 °C) and those kept refrigerated (2-8 °C). Ethanol and EtG levels were determined serially over a two-week period. Initial results showed that none of the specimens tested positive for ethanol or EtG. At room temperature, 7 and 14 days after collection, the average ethanol concentration increased from 1.7 mg/dL to 2 mg/dL, and the average EtG concentration increased from 108 ng/mL to 186 ng/mL. At 2-8 °C, the average ethanol concentration was 0.4 and 0.5 mg/dL, and the average EtG concentration was 138 and 124 ng/mL, seven and fourteen days after collection, respectively. Ethanol and EtG levels determined 14 days post collection in refrigerated specimens were considerably lower than in those stored at room temperature. Room-temperature storage produced a considerable increase in EtG concentrations (14-day range 0-186 ng/mL) despite negative initial results for all specimens. Because EtG can be produced after sample collection, it is not a reliable indicator of recent alcohol consumption, given the possibility of misleading EtG results due to in vitro EtG production in the urine of diabetic patients.

Keywords: ethyl glucuronide, ethanol, forensic toxicology, diabetic

Procedia PDF Downloads 121
24887 Effect of Tissue Preservation Chemicals on Decomposition in Different Soil Types

Authors: Onyekachi Ogbonnaya Iroanya, Taiye Abdullahi Gegele, Frank Tochukwu Egwuatu

Abstract:

Introduction: Forensic taphonomy is a multifaceted area that incorporates decomposition and the chemical and biological exposure of the cadaver into post-mortem event chronology and reconstruction to predict the Post Mortem Interval (PMI). The aim of this study was to evaluate the integrity of DNA extracted from the remains of embalmed, decomposed Sus domesticus tissues buried in different soil types. Method: A total of 12 limbs of Sus domesticus weighing between 0.7-1.4 kg were used. Each of the samples across the groups was treated with 10% formaldehyde, absolute methanol or 50% pine oil for 24 hours before burial, except the control samples, which were buried immediately. All samples were buried in shallow simulated clay, sandy and loamy soil graves for 12 months. The DNA of each sample was extracted and quantified with a Nanodrop spectrophotometer (JENWAY 6305). The rate of decomposition was examined through modified qualitative decomposition analysis. Extracted DNA was amplified by PCR, and bands were visualized via gel electrophoresis. A biochemical enzyme assay was performed for the soil of each burial grave. Result: The limbs in all burial groups lost weight over the burial period. There was a significant increase in soil urease levels for the samples preserved in formaldehyde across the three soil type groups (p≤0.01). Also, the control grave soils recorded significantly higher alkaline phosphatase, dehydrogenase and calcium carbonate values compared to the experimental grave soils (p≤0.01). The experimental samples showed a significant decrease in DNA concentration and purity when compared to the control groups (p≤0.01). The soil biochemical analysis showed that the embalming treatment altered the relationship between organic matter decomposition and soil biochemical properties, as observed in the fluctuations recorded in the soil biochemical parameters. The PCR-amplified DNA showed no bands on the gel electrophoresis plates. Conclusion: In criminal investigations, factors such as the burial grave soil, the grave soil's biochemical properties, and antemortem exposure to embalming chemicals should be considered in post-mortem interval (PMI) determination.

Keywords: forensic taphonomy, post-mortem interval (PMI), embalmment, decomposition, grave soil

Procedia PDF Downloads 162
24886 Environmental Forensic Analysis of the Shoreline Microplastics Debris on the Limbe Coastline, Cameroon

Authors: Ndumbe Eric Esongami, Manga Veronica Ebot, Foba Josepha Tendo, Yengong Fabrice Lamfu, Tiku David Tambe

Abstract:

The prevalence and unpleasant nature of plastic pollution constantly observed on beach shores after stormy events has prompted researchers worldwide to work on sustainable economic and environmental designs for plastics, especially in Cameroon, a major touristic destination in the Central Africa Region. The inconsistent protocols developed by researchers have added to this burden; thus, the morphological characterization of microplastics for remediation is a call for concern. The prime aim of the study is to morphologically identify and quantify shoreline microplastics and to forensically understand the distribution of each plastic polymer composition. Duplicate 2×2 m (4 m2) quadrats were sampled on each beach per month over an 8-month period across five purposively selected beaches along the Limbe-Idenau coastline, Cameroon. Collected plastic samples were thoroughly washed and separated using a 2 mm sieve. Only particles of size < 2 mm were considered and carried forward to the microplastics laboratory analytical process. Established step-by-step methodological procedures of particle filtration, organic matter digestion, density separation, particle extraction and polymer identification, including microscopy, were applied to the beach microplastics samples. Microplastics (MPs) were observed in every sample, beach and month, with an overall abundance of 241 particles weighing 89.15 g in total, a mean abundance of 2 particles/m2 (0.69 g/m2) and 6 particles/month (2.0 g/m2). The accumulation of beach shoreline MPs rose dramatically with decreasing size, with microbeads and fibres found only in the < 1 mm size fraction. Approximately 75% of beach MPs contamination, by average particle number, was found on the LDB 2, LDB 1 and IDN beaches, while the most dominant polymer types observed were PP, PE and PS across all morphological parameters analysed. Beach MPs accumulation varied significantly temporally and spatially at p = 0.05. ANOVA and Spearman's rank correlation showed linear relationships between the size categories considered in this study. In terms of polymer analysis, the colour class showed that white-coloured MPs were dominant, with 50 particles (22.25 g), distributed among PP (25), PE (15) and PS (5). The shape class revealed that irregularly shaped MPs were dominant, with 98 particles (30.5 g) and higher abundance in PP (39), PE (33) and PS (11). Similarly, the type class showed that fragmented MPs were dominant, with 80 particles (25.25 g) and higher abundance in PP (30), PE (28) and PS (15). The size class further revealed that MPs in the 1.5-1.99 mm range had the highest abundance, with 102 particles (51.77 g), concentrated in PP (47), PE (41) and PS (7), and finally, the weight class showed that MPs weighing 0.01 g were dominant, with 98 particles (56.57 g), distributed among PP (49), PE (29) and PS (13). The forensic investigation of the pollution indicated that the majority of the beach microplastics are sourced from the site or nearby area. The investigation could draw useful conclusions regarding the pathways of pollution. Fragmented microplastics, a significant component of the samples, were found to be sourced from recreational activities and partly from fishing boat installation and repair activities carried out close to the shore.

Keywords: forensic analysis, beach MPs, particle number, polymer composition, Cameroon

Procedia PDF Downloads 73
24885 Forensic Detection of Errors Permitted by the Witnesses in Their Testimony

Authors: Lev Bertovsky

Abstract:

The purpose of this study was to determine the reasons for the formation of false testimony by witnesses and to make recommendations on the recognition of such cases. The studies, which were based on the achievements of professionals in the field of psychology as well as personal investigative practice, examined the stages of perception of information, the process of its retrieval from memory, and its transmission to the communicator upon request. Based on the principles of how the human brain works, the kinds of honest mistakes made by witnesses were systematized. Proposals were formulated for the optimization of investigative actions in cases where witnesses make an honest mistake with respect to the events previously observed by them.

Keywords: criminology, eyewitness testimony, honest mistake, information, investigator, investigation, questioning

Procedia PDF Downloads 183
24884 Detection of Cyberattacks on the Metaverse Based on First-Order Logic

Authors: Sulaiman Al Amro

Abstract:

There are currently considerable challenges concerning data security and privacy, particularly in relation to modern technologies. This includes the virtual world known as the Metaverse, which consists of a virtual space that integrates various technologies and is therefore susceptible to cyber threats such as malware, phishing, and identity theft. This has led recent studies to propose the development of Metaverse forensic frameworks and the integration of advanced technologies, including machine learning for intrusion detection and security. In this context, the application of first-order logic offers a formal and systematic approach to defining the conditions of cyberattacks, thereby contributing to the development of effective detection mechanisms. In addition, formalizing the rules and patterns of cyber threats has the potential to enhance the overall security posture of the Metaverse and, thus, the integrity and safety of this virtual environment. The current paper focuses on the primary actions employed by avatars in potential attacks, using Interval Temporal Logic (ITL) and behavior-based detection to identify an avatar’s abnormal activities within the Metaverse. The research established that the proposed framework attained an accuracy of 92.307%, with the experimental results demonstrating the efficacy of ITL, including its superior performance in addressing the threats posed by avatars within the Metaverse domain.
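
The detection rules themselves are not reproduced in the abstract; the sketch below shows, in Python, one way a behaviour-based rule in the spirit of first-order/interval temporal logic might be encoded: a predicate over an avatar's timestamped actions flags the account whenever a sensitive action is repeated beyond a threshold within a time window. The action names, threshold and window are assumptions for illustration only, not the published framework.

```python
from datetime import datetime, timedelta

# Illustrative action log: (avatar_id, action, timestamp)
log = [
    ("avatarA", "asset_transfer", datetime(2024, 5, 1, 10, 0, 0)),
    ("avatarA", "asset_transfer", datetime(2024, 5, 1, 10, 0, 20)),
    ("avatarA", "asset_transfer", datetime(2024, 5, 1, 10, 0, 40)),
    ("avatarB", "teleport", datetime(2024, 5, 1, 10, 5, 0)),
]

def abnormal(avatar, action, window=timedelta(minutes=1), threshold=3):
    """Holds if `avatar` performed `action` at least `threshold` times within `window`
    (a simple interval predicate; real rules would combine several such formulas)."""
    times = sorted(t for a, act, t in log if a == avatar and act == action)
    for i in range(len(times)):
        in_window = [t for t in times[i:] if t - times[i] <= window]
        if len(in_window) >= threshold:
            return True
    return False

print(abnormal("avatarA", "asset_transfer"))  # True -> flag for review
print(abnormal("avatarB", "teleport"))        # False
```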

Keywords: security, privacy, metaverse, cyberattacks, detection, first-order logic

Procedia PDF Downloads 37
24883 Estimating the Ladder Angle and the Camera Position From a 2D Photograph Based on Applications of Projective Geometry and Matrix Analysis

Authors: Inigo Beckett

Abstract:

In forensic investigations, it is often the case that the most potentially useful recorded evidence derives from coincidental imagery, recorded immediately before or during an incident, and that during the incident (e.g. a ‘failure’ or fire event) the evidence is changed or destroyed. To an image analysis expert involved in photogrammetric analysis for civil or criminal proceedings, traditional computer vision methods involving calibrated cameras are often not appropriate because image metadata cannot be relied upon. This paper presents an approach for resolving this problem, considering in particular, and by way of a case study, the angle of a simple ladder shown in a photograph. The UK Health and Safety Executive (HSE) guidance document published in 2014 (INDG455) advises that a leaning ladder should be erected at 75 degrees to the horizontal axis. Personal injury cases can arise in the construction industry because a ladder is too steep or too shallow. Ad-hoc photographs of such ladders in their incident position provide a basis for analysis of their angle. This paper presents a direct approach for ascertaining the position of the camera and the angle of the ladder simultaneously from the photograph(s) by way of a workflow that encompasses a novel application of projective geometry and matrix analysis. Mathematical analysis shows that, for a given pixel ratio of directly measured collinear points (i.e. features that lie on the same line segment) in the 2D digital photograph with respect to a given viewing point, we can constrain the 3D camera position to the surface of a sphere in the scene. Depending on what we know about the ladder, we can enforce another independent constraint on the possible camera positions, which enables us to narrow the possible positions even further. Experiments were conducted using synthetic and real-world data. The synthetic data modeled a vertical plane with a ladder on a horizontally flat plane resting against a vertical wall. The real-world data were captured using an Apple iPhone 13 Pro and 3D laser scan survey data, whereby a ladder was placed in a known location and at a known angle to the vertical axis. For each case, we calculated the camera positions and the ladder angles using this method and cross-compared them against their respective ‘true’ values.
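
As an illustration of the final angle computation only (the paper's full workflow, which constrains the camera position to a sphere and applies an independent scene constraint, is considerably more involved), the Python sketch below assumes the image coordinates of the ladder's foot and top contact points have already been rectified into metric coordinates on the vertical scene plane via a plane-to-plane homography, and then measures the lean angle against the HSE 75-degree guidance. The homography matrix and pixel coordinates are invented for illustration.

```python
import numpy as np

def apply_homography(H, pt):
    """Map an image point (u, v) into the rectified vertical scene plane (metres)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Illustrative 3x3 image-to-scene-plane homography and measured pixel coordinates.
H = np.array([[0.004, 0.000, -1.2],
              [0.000, -0.004, 3.1],
              [0.000, 0.000, 1.0]])
foot_px, top_px = (410.0, 950.0), (615.0, 120.0)

foot = apply_homography(H, foot_px)   # (horizontal, vertical) position of the ladder foot
top = apply_homography(H, top_px)     # (horizontal, vertical) position of the top contact point
rise = top[1] - foot[1]
run = abs(top[0] - foot[0])
angle = np.degrees(np.arctan2(rise, run))
print(f"Ladder angle: {angle:.1f} deg (HSE INDG455 recommends about 75 deg)")
```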

Keywords: image analysis, projective geometry, homography, photogrammetry, ladders, forensics, mathematical modeling, planar geometry, matrix analysis, collinear points, cameras, photographs

Procedia PDF Downloads 45
24882 Implementation of an IoT Sensor Data Collection and Analysis Library

Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee

Abstract:

Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to the users themselves. However, when the data are accumulated and analyzed, more diverse information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and it is possible to collect sensor data directly by using database tools such as MySQL. These directly collected data can be used for various research purposes and can be useful as data for data mining. However, there are many difficulties in using such boards to collect data, especially when the user is not a computer programmer or is using them for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.
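
The keywords list DBSCAN, k-means and k-medoids among the analysis routines such a library could wrap. As a minimal sketch (not the authors' library), the snippet below clusters sensor rows that have already been collected into a table using DBSCAN from scikit-learn; the feature columns and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Illustrative readings as they might arrive from an Arduino/Raspberry Pi logger:
# columns are temperature (deg C) and humidity (%).
readings = np.array([
    [21.5, 40.2], [21.7, 41.0], [21.6, 40.7],   # normal indoor cluster
    [35.2, 18.5], [34.8, 19.1],                 # hot/dry cluster
    [21.4, 90.0],                               # outlier (possible sensor fault)
])

X = StandardScaler().fit_transform(readings)     # put both features on the same scale
labels = DBSCAN(eps=0.7, min_samples=2).fit_predict(X)
print(labels)  # cluster ids; -1 marks noise points such as the faulty reading
```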

Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data

Procedia PDF Downloads 372
24881 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity and veracity and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services, such as data storage and hosting, to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We showcase a graphical view of actors, roles, and their relationships in the government (big) data ecosystem. We also discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 157
24880 Exploiting the Potential of Fabric Phase Sorptive Extraction for Forensic Food Safety: Analysis of Food Samples in Cases of Drug Facilitated Crimes

Authors: Bharti Jain, Rajeev Jain, Abuzar Kabir, Torki Zughaibi, Shweta Sharma

Abstract:

Drug-facilitated crimes (DFCs) entail the use of a single drug or a mixture of drugs to render a victim incapacitated. Traditionally, biological samples have been gathered from victims and analyzed to establish evidence of drug administration. Nevertheless, the rapid metabolism of various drugs and delays in analysis can impede the identification of such substances. For this reason, the present article describes a rapid, sustainable, highly efficient and miniaturized protocol for the identification and quantification of three sedative-hypnotic drugs, namely diazepam, chlordiazepoxide and ketamine, in alcoholic beverages and complex food samples (biscuit cream, flavored milk, juice, cake, tea, sweets and chocolate). The methodology involves utilizing fabric phase sorptive extraction (FPSE) to extract diazepam (DZ), chlordiazepoxide (CDP), and ketamine (KET). Subsequently, the extracted samples are subjected to analysis using gas chromatography-mass spectrometry (GC-MS). Several parameters, including the type of membrane, pH, agitation time and speed, ionic strength, sample volume, elution volume and time, and type of elution solvent, were screened and thoroughly optimized. Sol-gel Carbowax 20M (CW-20M) demonstrated the most effective extraction efficiency for the target analytes among all evaluated membranes. Under optimal conditions, the method displayed linearity within the range of 0.3–10 µg mL–¹ (or µg g–¹), with a coefficient of determination (R2) ranging from 0.996–0.999. The limits of detection (LODs) and limits of quantification (LOQs) for liquid samples ranged between 0.020-0.069 µg mL-¹ and 0.066-0.22 µg mL-¹, respectively. Correspondingly, the LODs for solid samples ranged from 0.056-0.090 µg g-¹, while the LOQs ranged from 0.18-0.29 µg g-¹. Notably, the method showed good precision, with repeatability and reproducibility below 5% and 10%, respectively. Furthermore, the FPSE-GC-MS method proved effective in determining diazepam (DZ) in forensic food samples connected to drug-facilitated crimes (DFCs). Additionally, the proposed method underwent evaluation of its whiteness using the RGB12 algorithm.
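
The abstract does not state which convention was used to derive these limits; a common (ICH-style) approach estimates LOD = 3.3·σ/S and LOQ = 10·σ/S from the residual standard deviation σ and the slope S of the calibration curve. The sketch below applies that convention to an invented calibration series and is illustrative only, not the study's data.

```python
import numpy as np

# Hypothetical calibration data for one analyte (concentration in ug/mL vs. detector response).
conc = np.array([0.3, 1.0, 2.5, 5.0, 7.5, 10.0])
resp = np.array([0.031, 0.102, 0.248, 0.507, 0.745, 1.010])

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # residual standard deviation of the linear fit

lod = 3.3 * sigma / slope              # limit of detection
loq = 10.0 * sigma / slope             # limit of quantification
r2 = np.corrcoef(conc, resp)[0, 1] ** 2
print(f"R2={r2:.4f}  LOD={lod:.3f} ug/mL  LOQ={loq:.3f} ug/mL")
```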

Keywords: drug facilitated crime, fabric phase sorptive extraction, food forensics, white analytical chemistry

Procedia PDF Downloads 63
24879 Diagnostic Value of Different Noninvasive Criteria of Latent Myocarditis in Comparison with Myocardial Biopsy

Authors: Olga Blagova, Yuliya Osipova, Evgeniya Kogan, Alexander Nedostup

Abstract:

Purpose: To quantify the value of various clinical, laboratory and instrumental signs in the diagnosis of myocarditis in comparison with morphological studies of the myocardium. Methods: In 100 patients (65 men, 44.7±12.5 years) with 'idiopathic' arrhythmias (n = 20) and dilated cardiomyopathy (DCM, n = 80), 71 endomyocardial biopsies (EMB), 13 intraoperative biopsies, 5 studies of explanted hearts and 11 autopsies were performed, with viral investigation (real-time PCR) of the blood and myocardium. Anti-heart antibodies (AHA) were also measured, as well as cardiac CT (n = 45), MRI (n = 25), and coronary angiography (n = 47). The comparison group included 50 patients (25 men, 53.7±11.7 years) with non-inflammatory heart diseases who underwent open heart surgery. Results: Active/borderline myocarditis was diagnosed in 76.0% of the study group and in 21.6% of patients in the comparison group (p < 0.001). The myocardial viral genome was observed more frequently in patients of the comparison group than in the study group (65.0% vs. 40.2%; p < 0.01). The diagnostic value of noninvasive markers of myocarditis was evaluated. The panel of anti-heart antibodies was the most informative for identifying myocarditis: sensitivity was 81.5%, and the positive and negative predictive values were 75.0% and 60.5%. The diagnostic value of non-invasive markers of myocarditis was defined, and a diagnostic algorithm providing an individual assessment of the likelihood of myocarditis was developed. Conclusion: AHA have the greatest significance in the diagnosis of latent myocarditis in patients with 'idiopathic' arrhythmias and DCM. The use of a complex of noninvasive criteria allows the probability of myocarditis to be estimated and the indications for EMB to be determined.
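
For readers unfamiliar with these figures, sensitivity and the predictive values follow directly from a 2x2 table of marker results against the biopsy reference; the counts in the Python sketch below are invented for illustration and are not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values from a 2x2 table
    (marker result vs. morphological reference standard)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)   # positive predictive value
    npv = tn / (tn + fn)   # negative predictive value
    return sensitivity, specificity, ppv, npv

# Purely illustrative counts: AHA-positive/negative vs. biopsy-proven myocarditis yes/no.
se, sp, ppv, npv = diagnostic_metrics(tp=62, fp=21, fn=14, tn=23)
print(f"sensitivity={se:.1%} specificity={sp:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```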

Keywords: myocarditis, "idiopathic" arrhythmias, dilated cardiomyopathy, endomyocardial biopsy, viral genome, anti-heart antibodies

Procedia PDF Downloads 172
24878 Government Big Data Ecosystem: A Systematic Literature Review

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Data that are high in volume, velocity and veracity and come from a variety of sources are generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data on public administration and gaps in the government data ecosystem literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas, such as humanitarian data, open government data, scientific research data, and industry data.

Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, egovernment, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review

Procedia PDF Downloads 224
24877 A Machine Learning Decision Support Framework for Industrial Engineering Purposes

Authors: Anli Du Preez, James Bekker

Abstract:

Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap where data is not explored due to the lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen’s framework development methodology. The study focused on machine learning algorithms, a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application.
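
The framework's actual decision rules are not reproduced in the abstract; as a sketch of the kind of guidance such a decision support tool might encode, the mapping below pairs a stated application purpose with a commonly suggested family of machine learning algorithms. The categories and suggestions are illustrative assumptions, not the framework developed in the study.

```python
def suggest_algorithm(purpose: str, labelled_data: bool) -> str:
    """Very small rule set mapping an analysis purpose to a candidate algorithm family."""
    if purpose == "predict a category" and labelled_data:
        return "classification (e.g. decision trees, logistic regression)"
    if purpose == "predict a quantity" and labelled_data:
        return "regression (e.g. linear regression, gradient boosting)"
    if purpose == "group similar records" and not labelled_data:
        return "clustering (e.g. k-means, DBSCAN)"
    if purpose == "find unusual records":
        return "anomaly detection (e.g. isolation forest)"
    return "re-examine the purpose and the available data"

print(suggest_algorithm("group similar records", labelled_data=False))
```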

Keywords: data analytics, industrial engineering, machine learning, value creation

Procedia PDF Downloads 165
24876 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification

Authors: Meimei Shi

Abstract:

Objectives: The identification of endangered species, especially the simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and in preventing illegal trade. This study aimed to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed according to conserved mitochondrial gene regions. Experimental mixtures were artificially prepared by mixing well-defined species, including endangered species, e.g., forest musk deer, bear, tiger, pangolin, and sika deer. The artificial samples were prepared with 1-16 well-characterized species at DNA concentrations of 1% to 100%. After multiplex PCR amplification and parameter modification, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. DNA metabarcoding was carried out based on Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, read filtering, and OTU clustering; representative sequences were searched using BLASTn. Results: According to the parameter modification and multiplex PCR amplification results, five primer sets targeting COI, Cytb, 12S, and 16S were selected as the NGS library amplification primer panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and could accurately identify all species in the artificial mixtures, including the endangered animal species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla, and Cervus nippon at 1% DNA concentration. In conclusion, the established species identification method provides technical support for customs and forensic scientists to prevent the illegal trade of endangered animals and their products.

Keywords: DNA metabarcoding, endangered animal species, mitochondria nucleic acid, multi-locus

Procedia PDF Downloads 134
24875 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm

Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima

Abstract:

In our present world, we are generating a lot of data, and we need specific devices to store all these data. Generally, we store data on pen drives, hard drives, etc. Sometimes we may lose the data due to the corruption of these devices. To overcome these issues, we implemented a cloud space for storing the data, which provides more security to the data. We can access the data using just the internet from anywhere in the world. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, he does not have any rights to change the data. Users' uploaded files are stored in the cloud with the file name set to the system time, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB.
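
The system described was built in Java with the NetBeans IDE; as a language-neutral illustration of the same idea (encrypt a file with AES before it is stored, name it by system time, and reject files of 2 MB or more), the Python sketch below uses the `cryptography` package's AES-GCM mode. It is a sketch of the concept under those assumptions, not the authors' implementation, and the key handling shown is deliberately simplified.

```python
import os
import time
from pathlib import Path
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MAX_SIZE = 2 * 1024 * 1024  # files of 2 MB or more are rejected, as in the described system

def store_encrypted(src: str, key: bytes, cloud_dir: str = "cloud_store") -> Path:
    data = Path(src).read_bytes()
    if len(data) >= MAX_SIZE:
        raise ValueError("file must be smaller than 2 MB")
    nonce = os.urandom(12)                            # unique nonce per file
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    out_dir = Path(cloud_dir)
    out_dir.mkdir(exist_ok=True)
    out_path = out_dir / f"{int(time.time())}.bin"    # file named by system time
    out_path.write_bytes(nonce + ciphertext)          # store nonce alongside the ciphertext
    return out_path

key = AESGCM.generate_key(bit_length=256)             # in practice, held by the cloud service
# store_encrypted("report.pdf", key)
```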

Keywords: cloud space, AES, FTP, NetBeans IDE

Procedia PDF Downloads 203
24874 Burden of Severe COVID-19 in Center of Iran: Results of Disability-Adjusted Life Years (DALYs)

Authors: Moslem Taheri Soodejani, Mohammad Hassan Lotfi

Abstract:

Introduction: The outbreak of Covid-19 disease is an international public health concern. Therefore, the analysis of information related to mortality and disability due to COVID-19 is important, so the present study was designed and conducted with the aim of assessing COVID-19 Disability-Adjusted Life Years (DALYs) in Yazd. Methods: In Yazd province, all suspected cases of Covid-19 referred to central hospitals and confirmed through PCR or CT scan tests were recruited to our study. The fatality data for Covid-19 were gathered from the forensic medicine organization. The Disability-Adjusted Life Year (DALY) combines in one measure the years of life lost (YLL), the loss of healthy life due to premature mortality, and the years of life lived with disability (YLD), the loss of healthy life because of disease and disability. Results: The total burden of COVID-19 was 23,472 years. The number of years lost due to premature death was 23,385, and the number of years of life lived with disability due to COVID-19 was estimated at 87 years. The disease burden was 12,992 years for men and 10,480 years for women. The overall incidence of COVID-19 was 1,411 per 100,000: 1,419 in men and 1,402 in women per 100,000. Conclusion: The COVID-19 pandemic affected a large population, and the residents of Yazd Province lost many years of their lives due to this disease.
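
The burden figures combine through the standard relation DALY = YLL + YLD, which reproduces the reported total; the small check below simply restates the abstract's own numbers.

```python
yll = 23385   # years of life lost to premature mortality (reported)
yld = 87      # years lived with disability (reported)
daly = yll + yld
print(daly)   # 23472, matching the reported total burden
```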

Keywords: DALY, COVID-19, Yazd, Iran

Procedia PDF Downloads 186
24873 Business Intelligence for Profiling of Telecommunication Customer

Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro

Abstract:

Business Intelligence is a methodology that exploits data to produce information and knowledge systematically; business intelligence can support the decision-making process. Among the methods in business intelligence are the data warehouse and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply dimensional modelling by Kimball. Data mining is used to extract patterns from the data and gain insight from it. Data mining has many techniques, one of which is segmentation. For the profiling of telecommunication customers, we use customer segmentation according to the customer’s usage of services, customer invoice and customer payment. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation. As input variables for that algorithm, we use the RFM (Recency, Frequency and Monetary) model. For all processes in data mining, we use the IBM SPSS Modeler tool.
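
The study performed the segmentation in IBM SPSS Modeler; a minimal open-source sketch of the same idea (standardize the RFM variables, then run K-Means) is shown below with scikit-learn. The sample values and the number of clusters are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative RFM table: recency (days since last use), frequency (transactions), monetary (invoice total).
rfm = np.array([
    [5, 60, 420.0],
    [7, 55, 380.0],
    [40, 10, 60.0],
    [45, 8, 55.0],
    [90, 2, 15.0],
    [85, 3, 20.0],
])

X = StandardScaler().fit_transform(rfm)
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(model.labels_)            # segment id per customer
print(model.cluster_centers_)   # segment profiles in standardized RFM space
```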

Keywords: business intelligence, customer segmentation, data warehouse, data mining

Procedia PDF Downloads 477
24872 Succession and Rural vs. Urban Habitat Differences of Coleoptera Species Attracted to Pig Carrions in Eskişehir Province, Turkey

Authors: Cansu Kılıç, Ferhat Altunsoy

Abstract:

In this study, a total of 82 species belonging to the families Staphylinidae, Histeridae, Dermestidae, Silphidae and Cleridae within Coleoptera were detected, collected from 24 pig carrions over a one-year period. Twelve of the carrions were placed in rural areas and the other twelve in urban areas of Eskişehir province. The distribution of these species across the months and the periods during which they occurred on the different stages of decomposition were determined. Furthermore, the Coleoptera species attracted to the pig carrions in both rural and urban areas were identified, and their similarities and differences were presented.

Keywords: forensic entomology, Coleoptera, succession, Turkey, rural, urban

Procedia PDF Downloads 307
24871 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge facing bioinformaticians due to the complications of using statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly because these techniques cannot deal with missing data. One of the most important data analysis processes on a microarray data set is feature selection. This process finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
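
The imputation technique itself is not detailed in the abstract; as a sketch of the general impute-then-select workflow on expression data, the snippet below fills missing values with a k-nearest-neighbours imputer and then ranks genes with a univariate test using scikit-learn. The toy matrix and parameters are assumptions for illustration, not the paper's method.

```python
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

# Toy expression matrix: rows are samples, columns are genes; np.nan marks missing spots.
X = np.array([
    [2.1, np.nan, 5.5, 0.3],
    [2.0, 1.2,    5.6, 0.4],
    [7.9, 1.1,    np.nan, 0.2],
    [8.1, 1.3,    2.0, 0.5],
])
y = np.array([0, 0, 1, 1])          # disease status per sample

X_filled = KNNImputer(n_neighbors=2).fit_transform(X)
selector = SelectKBest(score_func=f_classif, k=2).fit(X_filled, y)
print(selector.get_support(indices=True))   # indices of the top-ranked genes
```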

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 569
24870 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework

Authors: Lutful Karim, Mohammed S. Al-kahtani

Abstract:

Sensors are being used in various applications, such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data are significantly important to designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation and filtering framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
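
The abstract does not give the PDDA rules themselves; the sketch below illustrates one plausible reading of priority-based, dynamic aggregation at the sensor layer: high-priority readings are forwarded immediately, while low-priority readings are buffered and reduced to a single aggregate per window. The threshold and window values are assumptions, not the published scheme.

```python
from statistics import mean

def aggregate(readings, high_priority_threshold=60.0, window=5):
    """Forward urgent readings at once; average the rest over fixed-size windows."""
    forwarded, buffer = [], []
    for value in readings:
        if value >= high_priority_threshold:          # e.g. an alarm-level measurement
            forwarded.append(("urgent", value))
        else:
            buffer.append(value)
            if len(buffer) == window:                 # one aggregate packet per window
                forwarded.append(("aggregate", round(mean(buffer), 2)))
                buffer.clear()
    if buffer:                                        # flush the final partial window
        forwarded.append(("aggregate", round(mean(buffer), 2)))
    return forwarded

print(aggregate([20.1, 21.0, 20.7, 64.2, 20.9, 21.3, 20.8, 21.1]))
```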

Keywords: big data, clustering, tree topology, data aggregation, sensor networks

Procedia PDF Downloads 338
24869 Intersubjectivity of Forensic Handwriting Analysis

Authors: Marta Nawrocka

Abstract:

In each legal proceeding in which expert evidence is presented, a major concern is the assessment of the evidential value of expert reports. Judicial institutions, while making decisions, rely heavily on expert reports because they usually do not possess the 'special knowledge' of certain fields of science, which makes it impossible for them to verify the results presented in the proceedings. In handwriting studies, standards of analysis have been developed. They unify the procedures used by experts in comparing signs and in constructing expert reports. However, the methods used by experts are usually of a qualitative nature. They rely on the application of the knowledge and experience of the expert and, in effect, leave a significant margin in the assessment. Moreover, the standards used by experts are still not very precise, and the process of reaching conclusions is poorly understood. The above-mentioned circumstances indicate that expert opinions in the field of handwriting analysis may, for many reasons, not be sufficiently reliable. It is assumed that this state of affairs has its source in the very low level of intersubjectivity of the measuring scales and analysis procedures that constitute this kind of analysis. Intersubjectivity is a feature of cognition which (in relation to methods) indicates the degree of consistency of the results that different people obtain using the same method. The higher the level of intersubjectivity, the more reliable and credible the method can be considered. The aim of the conducted research was to determine the degree of intersubjectivity of the methods used by experts in handwriting analysis. Thirty experts took part in the study, and each of them received two signatures, with varying degrees of readability, for analysis. Their task was to distinguish the graphic characteristics of the signature, estimate the evidential value of the found characteristics, and estimate the evidential value of the signature. The obtained results were compared with each other using Krippendorff's alpha statistic, which numerically determines the degree of agreement of the results (assessments) that different people obtain under the same conditions using the same method. The estimation of the degree of agreement of the experts' results for each of these tasks allowed the degree of intersubjectivity of the studied method to be determined. The study showed that during the analysis, the experts identified different signature characteristics and attributed different evidential value to them. In this scope, intersubjectivity turned out to be low. In addition, it turned out that the experts named and described the same characteristics in various ways, and the language used was often inconsistent and imprecise. Thus, significant differences were noted in the language and nomenclature applied. On the other hand, the experts attributed a similar evidential value to the entire signature (the set of characteristics), which indicates that in this range they were relatively consistent.
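
Krippendorff's alpha can be reproduced outside specialist software; a minimal sketch, assuming the third-party `krippendorff` Python package is available, is shown below for an invented matrix of ordinal evidential-value ratings (rows are experts, columns are assessed characteristics, NaN marks items an expert did not rate). The ratings are illustrative, not the study's data.

```python
import numpy as np
import krippendorff  # third-party package: pip install krippendorff

# Invented ratings: 4 experts x 5 signature characteristics, ordinal scale 1-5.
ratings = np.array([
    [3, 4, 2, 5, np.nan],
    [3, 4, 1, 5, 2],
    [2, 5, 2, 4, 2],
    [3, 3, 2, 5, np.nan],
], dtype=float)

alpha = krippendorff.alpha(reliability_data=ratings, level_of_measurement="ordinal")
print(round(alpha, 3))   # values near 1 indicate high inter-rater agreement
```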

Keywords: forensic sciences experts, handwriting analysis, inter-rater reliability, reliability of methods

Procedia PDF Downloads 148
24868 The Effect of Experimentally Induced Stress on Facial Recognition Ability of Security Personnel

Authors: Zunjarrao Kadam, Vikas Minchekar

Abstract:

Facial recognition is an important task in criminal investigation procedures. Security guards, who constantly watch people, can help to identify suspected accused persons. Forensic psychologists handle such cases in the criminal justice system. Security personnel may lose their ability to correctly identify persons due to the constant stress of performing their duty. The present study aimed to identify the effect of experimentally induced stress on the facial recognition ability of security personnel. For this study, 50 security guards from the Sangli, Miraj and Jaysingpur cities of Maharashtra State, India, were recruited for the experiment. A randomized two-group design was employed to carry out the research. In the initial condition, twenty identity-card-size photographs were shown to both groups. Afterward, artificial stress was induced in the experimental group through a difficult puzzle-solving task in a limited period. In the second condition, both groups were presented with the earlier photographs along with thirty additional new photographs. The subjects were asked to recognize the photographs that had been shown earlier. The analyzed data revealed that the control group had a higher mean facial recognition score than the experimental group. The results are discussed in the present research.

Keywords: experimentally induced stress, facial recognition, cognition, security personnel

Procedia PDF Downloads 256
24867 Breath Ethanol Imaging System Using Real Time Biochemical Luminescence for Evaluation of Alcohol Metabolic Capacity

Authors: Xin Wang, Munkbayar Munkhjargal, Kumiko Miyajima, Takahiro Arakawa, Kohji Mitsubayashi

Abstract:

The measurement of gaseous ethanol plays an important role in the evaluation of alcohol metabolic capacity in clinical and forensic analysis. A two-dimensional visualization system for gaseous ethanol was constructed and tested for the visualization of breath and transdermal alcohol. We demonstrated breath ethanol measurement using the developed high-sensitivity visualization system. The breath ethanol concentration calculated from the imaging signal was significantly different between volunteer subjects with ALDH2 (+) and ALDH2 (-).

Keywords: breath ethanol, ethanol imaging, biochemical luminescence, alcohol metabolism

Procedia PDF Downloads 346
24866 Control the Flow of Big Data

Authors: Shizra Waris, Saleem Akhtar

Abstract:

Big data is a research area receiving attention from academia and the IT community. In the digital world, the amounts of data produced and stored have increased rapidly within a short period of time. Consequently, this fast-increasing rate of data has created many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. This paper presents a complete discussion of state-of-the-art big data technologies based on group and stream data processing. Moreover, the strengths and weaknesses of these technologies are analyzed. This study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies, based on important limitations, are also investigated. Emerging technologies are suggested as a solution for big data problems.

Keywords: computer, IT community, industry, big data

Procedia PDF Downloads 188
24865 High Performance Computing and Big Data Analytics

Authors: Branci Sarra, Branci Saadia

Abstract:

Because of the multiplied growth of data, many computer science tools have been developed to process and analyze this Big Data. High-performance computing architectures have been designed to meet the processing needs of Big Data (from the transaction processing, strategic, and tactical analytics standpoints). The purpose of this article is to provide a historical and global perspective on the recent trend of high-performance computing architectures, especially in relation to analytics and data mining.

Keywords: high performance computing, HPC, big data, data analysis

Procedia PDF Downloads 515
24864 A Landscape of Research Data Repositories in Re3data.org Registry: A Case Study of Indian Repositories

Authors: Prashant Shrivastava

Abstract:

The purpose of this study is to explore the re3data.org registry to identify the research data repository registration workflow. A further objective is to depict a graph of the present development of research data repositories in India. The study begins with an approach to understanding the re3data.org registry framework and schema design and then proceeds to explore the status of Indian research data repositories in the re3data.org registry. Research data repositories are gaining wider relevance due to e-research concepts. The now-available registry, re3data.org, is a good tool for users and researchers to identify research data repositories appropriate to their research requirements. In the Indian environment, a compatible National Research Data Policy is the need of the hour to boost the management of research data. A registry of research data repositories is a crucial tool for discovering specific information in a specific domain. However, research data repositories in India have not been studied. Both the re3data.org registry and the status of Indian research data repositories are discussed in this study.
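
As a sketch of how the graph of present development could be produced once the Indian records have been collected from the registry (for instance via the re3data.org interface), the snippet below counts repositories by their registry entry year from a hypothetical CSV export; the file name and column names are assumptions for illustration, not part of re3data's actual schema.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export of Indian records harvested from the re3data.org registry.
df = pd.read_csv("re3data_india.csv")            # assumed columns: repository_name, entry_year

counts = df.groupby("entry_year").size().sort_index()
counts.cumsum().plot(marker="o")                 # cumulative growth curve
plt.xlabel("Year of registry entry")
plt.ylabel("Cumulative Indian repositories in re3data.org")
plt.title("Growth of Indian research data repositories (illustrative)")
plt.tight_layout()
plt.savefig("india_repositories_growth.png")
```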

Keywords: research data, research data repositories, research data registry, re3data.org

Procedia PDF Downloads 318