Search results for: digital evidence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1456


1006 Diagnosing Dangerous Arrhythmia of Patients by Automatic Detecting of QRS Complexes in ECG

Authors: Jia-Rong Yeh, Ai-Hsien Li, Jiann-Shing Shieh, Yen-An Su, Chi-Yu Yang

Abstract:

In this paper, an automatic QRS-complex detection algorithm was applied to analyze ECG recordings, and five clinical criteria for dangerous arrhythmia were combined into a protocol-type automatic arrhythmia diagnosis system. The detection algorithm locates the distribution of QRS complexes in ECG recordings and derives related information, such as heart rate and RR interval. In this investigation, twenty ECG recordings sampled from patients with different pathologic conditions were collected for off-line analysis. As pre-processing, a combination of four digital filters was proposed to improve the ECG signals and raise the QRS detection rate; both hardware filters and digital filters were applied to eliminate the different types of noise mixed with the ECG recordings. The automatic detection algorithm was then applied to verify the distribution of QRS complexes. Finally, quantitative clinical criteria for diagnosing arrhythmia were programmed into a practical application as a post-processor for automatic arrhythmia diagnosis. The diagnoses produced by the automatic dangerous-arrhythmia system were compared with off-line diagnoses by experienced clinical physicians, and the comparison showed a matching rate of 95% against the experienced physician's diagnoses.
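
A minimal sketch of this processing chain, assuming a Pan-Tompkins-style band-pass stage and simple thresholded peak picking; the 360 Hz sampling rate, pass band, and threshold are illustrative values, not the paper's:

import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs=360.0):
    # Band-pass 5-15 Hz as digital pre-processing: suppresses baseline
    # wander and high-frequency noise before peak picking.
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    energy = filtered ** 2                       # emphasise QRS energy
    # Peaks must be at least 200 ms apart (physiological refractory period).
    peaks, _ = find_peaks(energy, height=2 * energy.mean(),
                          distance=int(0.2 * fs))
    rr = np.diff(peaks) / fs                     # RR intervals (s)
    heart_rate = 60.0 / rr.mean() if rr.size else float("nan")
    return peaks, rr, heart_rate

The returned RR intervals and heart rate are exactly the quantities the diagnostic criteria would then test.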

Keywords: Signal processing, electrocardiography (ECG), QRS complex, arrhythmia.

1005 An E-Maintenance IoT Sensor Node Designed for Fleets of Diverse Heavy-Duty Vehicles

Authors: George Charkoftakis, Panagiotis Liosatos, Nicolas-Alexander Tatlas, Dimitrios Goustouridis, Stelios M. Potirakis

Abstract:

E-maintenance is a relatively recent concept, generally referring to maintenance management by monitoring assets over the Internet. One of the key links in the chain of an e-maintenance system is data acquisition and transmission. Specifically for a fleet of heavy-duty vehicles, where the main challenge is the diversity of the vehicles and of the vehicle-embedded self-diagnostic/reporting technologies, the design of the data acquisition and transmission unit is a demanding task. This is clear if one takes into account that a heavy-vehicle fleet may range from vehicles with only a limited number of analog sensors monitored by dashboard light indicators and gauges to vehicles with a plethora of sensors monitored by a vehicle computer producing digital reporting. The present work proposes an adaptable Internet of Things (IoT) sensor node that is capable of addressing this challenge. The proposed sensor node architecture is based on the increasingly popular single-board computer plus expansion boards approach. In the proposed solution, the expansion boards undertake the tasks of position identification, cellular connectivity, connectivity to the vehicle computer, and connectivity to analog and digital sensors by means of a specially designed expansion board. The latter offers a number of adaptability features to cope with the diverse sensor types employed in different vehicles. In standard mode, the IoT sensor node communicates to the data center through the cellular network, transmitting all digital/digitized sensor data together with the IoT device identity and position. Moreover, the proposed IoT sensor node offers connectivity, through WiFi and an appropriate application, to smartphones or tablets, allowing the registration of additional vehicle- and driver-specific information; these data are also forwarded to the data center. All control and communication tasks of the IoT sensor node are performed by dedicated firmware.
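
As a rough illustration of the data-acquisition role described above, a telemetry record such a node might forward to the data center could look like the following; all field names are assumptions for illustration, not the firmware's actual format:

import json, time

def build_record(device_id, lat, lon, sensors):
    # Bundle IoT device identity, position and digitised sensor readings.
    return json.dumps({
        "device_id": device_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},
        "sensors": sensors,
    })

print(build_record("node-001", 37.97, 23.72, {"coolant_temp_c": 86.5}))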

Keywords: IoT sensor nodes, e-maintenance, single-board computers, sensor expansion boards, on-board diagnostics

1004 Ethereum Based Smart Contracts for Trade and Finance

Authors: Rishabh Garg

Abstract:

Traditionally, business parties build trust with a centralized operating mechanism, such as payment by letter of credit. However, the increase in cyber-attacks and malicious hacking has jeopardized business operations and finance practices. Emerging markets, with their high banking risks and large presence of digital financing, are looking for technology that enables transparency and traceability of any transaction in trade, finance, or supply chain management. Blockchain systems, in the absence of any central authority, enable transactions across the globe with the help of decentralized applications (DApps). DApps consist of a front-end, a blockchain back-end, and middleware, that is, the code that connects the two. The front-end can be a sophisticated web or mobile app, which is used to invoke the functions/methods of the smart contract; web apps can employ technologies such as HTML, CSS, React, and Express. In this context, fintech and blockchain products are appearing in brokerages, digital wallets, exchanges, post-trade clearance, settlement, middleware, infrastructure, and base protocols. The present paper provides a technology-driven solution, financial inclusion, and an innovative working paradigm for business and finance.
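
As a hedged sketch of how DApp middleware invokes a smart-contract method, the following uses web3.py; the RPC endpoint, contract address, ABI, and the settleTrade function are placeholders invented for illustration, not artifacts of the paper:

from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # hypothetical node

LC_ABI = [{  # minimal ABI for an illustrative letter-of-credit contract
    "name": "settleTrade", "type": "function",
    "inputs": [{"name": "tradeId", "type": "uint256"}],
    "outputs": [{"name": "ok", "type": "bool"}],
    "stateMutability": "view",
}]

contract = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder
    abi=LC_ABI,
)
# A read-only call executes on the node without a transaction or gas.
result = contract.functions.settleTrade(42).call()
print(result)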

Keywords: Authentication, blockchain, channel, cryptography, DApps, data portability, Decentralized Public Key Infrastructure, Ethereum, hash function, Hashgraph, Privilege creep, Proof of Work algorithm, revocation, storage variables, Zero Knowledge Proof.

1003 Mathematical Analysis of EEG of Patients with Non-fatal Nonspecific Diffuse Encephalitis

Authors: Mukesh Doble, Sunil K Narayan

Abstract:

Diffuse viral encephalitis may lack fever and other cardinal signs of infection, and hence its distinction from other acute encephalopathic illnesses is challenging. Often, the EEG changes seen routinely are nonspecific and reflect diffuse encephalopathic changes only. The aim of this study was to use nonlinear dynamic mathematical techniques to analyze the EEG data and look for characteristic diagnostic patterns in diffuse forms of encephalitis. Encephalitis was diagnosed on clinical, imaging, and cerebrospinal fluid criteria in three young male patients; metabolic and toxic encephalopathies were ruled out through appropriate investigations. Digital EEGs were recorded on the 3rd to 5th day of onset. The digital EEGs of 5 male and 5 female age- and sex-matched healthy volunteers served as controls. A two-sample t-test indicated no statistically significant difference in average amplitude between the two groups. However, the standard deviation (or variance) of the EEG signals at FP1-F7 and FP2-F8 was significantly higher for the patients than for the normal subjects. The regularisation dimension was significantly lower for the patients (average between 1.24 and 1.43) than for the normal subjects (average between 1.41 and 1.63) for the EEG signals from all locations except the Fz-Cz signal. Similarly, the wavelet dimension was significantly lower (P = 0.05) for the patients (1.122) than for the normal subjects (1.458). The patients' EEGs are subdued, with uniform patterns manifested in the values of the regularisation and wavelet dimensions, indicating a decrease in chaotic nature compared with normal subjects.
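
The regularisation and wavelet dimensions themselves are not reproduced here; as a hedged stand-in, Higuchi's fractal dimension is a common EEG complexity estimator in the same spirit, where lower values likewise indicate a less chaotic signal:

import numpy as np

def higuchi_fd(x, kmax=10):
    # Estimate fractal dimension from the scaling of curve length L(k).
    x = np.asarray(x, dtype=float)
    n = x.size
    log_len, log_scale = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # Normalised curve length at scale k, starting offset m.
            lm = (np.abs(np.diff(x[idx])).sum()
                  * (n - 1) / ((idx.size - 1) * k)) / k
            lengths.append(lm)
        log_len.append(np.log(np.mean(lengths)))
        log_scale.append(np.log(1.0 / k))
    slope, _ = np.polyfit(log_scale, log_len, 1)
    return slope   # near 1 for regular signals, toward 2 for irregular ones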

Keywords: Chaos, Diffuse encephalitis, Electroencephalogram, Fractal dimension, Fourier spectrum.

1002 An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference

Authors: Ayman A. Aly, Abdallah A. Alshnnaway

Abstract:

The general idea behind the filter is to average a pixel using other pixel values from its neighborhood, while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations of the captured digital image due to noise and those due to image structure. Edges give the image its appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks: since most noise filtering behaves like a low-pass filter, blurring of edges and loss of detail seem a natural consequence, and techniques to remedy this inherent conflict often generate new noise during enhancement. In this work, a new fuzzy filter is presented for the reduction of additive noise in images. The filter consists of three stages: (1) fuzzy sets are defined in the input space to compute a fuzzy derivative for eight different directions; (2) a set of IF-THEN rules is constructed to perform fuzzy smoothing according to the contributions of neighboring pixel values; and (3) fuzzy sets are defined in the output space to obtain the filtered, edge-preserved image. Experimental results show the feasibility of the proposed approach on two-dimensional objects.
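
A heavily compressed sketch of the edge-preserving idea: each of the eight directional differences is given a fuzzy-like membership that is large only when the difference is small, so smoothing acts within flat regions but not across edges. The Gaussian membership below is an assumption standing in for the paper's IF-THEN rule base:

import numpy as np

def fuzzy_like_filter(img, sigma=20.0):
    img = img.astype(float)
    out = img.copy()
    h, w = img.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            vals = np.array([img[i + di, j + dj] for di, dj in offsets])
            # Membership near 1 for small differences, near 0 across edges.
            mu = np.exp(-((vals - img[i, j]) / sigma) ** 2)
            # Weighted average: the centre pixel keeps weight 1.
            out[i, j] = (img[i, j] + (mu * vals).sum()) / (1.0 + mu.sum())
    return out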

Keywords: Additive noise, edge preserving filtering, fuzzy image filtering, noise reduction, two dimensional mechanical images.

1001 Visitors’ Attitude towards the Service Marketing Mix and Frequency of Visits to Bangpu Recreation Centre, Thailand

Authors: Siri-Orn Champatong

Abstract:

This research paper aimed to examine the relationship between visitors' attitude towards the service marketing mix and visitors' frequency of visit to Bangpu Recreation Centre. Because the population is large and uncounted, the sample size was calculated according to the formula to obtain a total of 385 samples. Systematic random sampling was applied, and a five-point Likert-scale questionnaire was used over a total of 21 days to collect the needed information. Mean, standard deviation, and Pearson's correlation were utilized in analyzing the data. This study discovered a high level of visitors' attitude towards the product and service of Bangpu Recreation Centre, as well as towards price, place, promotional activities, the people who provided service, and the physical evidence of the centre; attitude towards the process of service was at a medium level. Additionally, the examination of the relationship between visitors' attitude towards the service marketing mix and frequency of visit revealed that product and service, people, physical evidence, and process of service provision were related to the visitors' frequency of visit to the centre per year.

Keywords: Frequency of Visit, Visitor, Service Marketing Mix, Bangpu Recreation Centre.

1000 An Evaluation of Digital Elevation Models to Short-Term Monitoring of a High Energy Barrier Island, Northeast Brazil

Authors: Venerando E. Amaro, Francisco Gabriel F. de Lima, Marcelo S.T. Santos

Abstract:

The short-term morphological evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning with Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system in a high-energy northeastern Brazilian coastal environment and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period, from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEMs) derived from different interpolation methods in representing morphologic features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed surfaces and volumes to be quantified in detail, the segments of PTI most vulnerable to erosion and/or accumulation of sediments to be identified, and the alterations to be related to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem, as well as oil and gas facilities installed in the vicinity, from the damaging effects of strong ocean waves and currents. The maintenance of PTI is therefore extremely important, but the prediction of its longevity is uncertain, because results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
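
A minimal sketch of the TIN-based comparison, assuming two GNSS surveys as (N, 3) point arrays: Delaunay-based linear interpolation (the basis of a TIN) puts both surveys on a common grid, and the elevation difference integrates to a volumetric change. The grid spacing is an illustrative parameter:

import numpy as np
from scipy.interpolate import LinearNDInterpolator

def dem_volume_change(xyz_t1, xyz_t2, cell=1.0):
    f1 = LinearNDInterpolator(xyz_t1[:, :2], xyz_t1[:, 2])
    f2 = LinearNDInterpolator(xyz_t2[:, :2], xyz_t2[:, 2])
    xmin, ymin = xyz_t1[:, :2].min(axis=0)
    xmax, ymax = xyz_t1[:, :2].max(axis=0)
    gx, gy = np.meshgrid(np.arange(xmin, xmax, cell),
                         np.arange(ymin, ymax, cell))
    dz = f2(gx, gy) - f1(gx, gy)        # elevation change per grid cell
    return np.nansum(dz) * cell * cell  # net volume change (m^3)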

Keywords: DEM, GNSS, short-term monitoring, Brazil.

999 Hiding Data in Images Using PCP

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media need to be transmitted conveniently over networks. Attacks, misuse, and unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority problem and urges the devising of new data-hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is the process of hiding a message in an appropriate carrier such as an image or audio file; the term is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver such that no one except the authenticated receiver knows of the existence of the hidden information. A considerable amount of work has been carried out on steganography by different researchers. In this work, the authors propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach selects the embedding pixels using a mathematical function, finds the 8-neighborhood of each selected pixel, and maps each bit of the secret message to a neighboring pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
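
A hedged sketch of the neighbourhood idea follows; the deterministic pixel-selection function and bit layout below are assumptions for illustration, not the paper's exact PCP mapping:

import numpy as np

def embed(cover, bits, step=97):
    # cover: 2-D uint8 grayscale array; bits: sequence of 0/1 message bits.
    stego = cover.copy()
    h, w = stego.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    # Interior pixels only, so no selected neighbourhood touches the border.
    candidates = [(i, j) for i in range(1, h - 1) for j in range(1, w - 1)]
    k = 0
    for n in range(0, len(candidates), step):   # deterministic selection
        if k >= len(bits):
            break
        ci, cj = candidates[n]
        for di, dj in offsets:                  # map bits onto 8-neighbours
            if k >= len(bits):
                break
            stego[ci + di, cj + dj] = (stego[ci + di, cj + dj] & 0xFE) | bits[k]
            k += 1
    return stego

Because only least-significant bits change, the stego image differs from the cover by at most one grey level per touched pixel, which is the minimum-degradation property claimed above.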

Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.

998 Investigating Polynomial Interpolation Functions for Zooming Low Resolution Digital Medical Images

Authors: Maninder Pal

Abstract:

Medical digital images usually have low resolution because of the nature of their acquisition. This paper therefore focuses on zooming these images to obtain a better level of information for the purpose of medical diagnosis. A strategy for selecting pixels in the zooming operation is proposed, based on the principle of an analog clock, which utilizes a combination of point and neighborhood image processing. In this approach, the hour hand of the clock covers the portion of the image to be processed; for alignment, the center of the clock points at the middle pixel of the selected portion. The minute hand, which is longer, gathers information about the pixels of the surrounding area, called the neighborhood pixel region, and this information is used to zoom the selected portion of the image. The proposed algorithm was implemented and its performance evaluated on many medical images obtained from various sources, such as X-ray, Computerized Tomography (CT) scan, and Magnetic Resonance Imaging (MRI). For illustration and simplicity, the results obtained from a CT scan image of a head are presented. The performance of the algorithm is evaluated against various traditional algorithms in terms of peak signal-to-noise ratio (PSNR), maximum error, SSIM index, mutual information, and processing time. The results show that the proposed algorithm gives better performance than the traditional algorithms.
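
The clock-hand pixel selection itself is not reproduced here; the sketch below only illustrates the evaluation protocol, scoring a conventionally re-zoomed image against the original with PSNR. The zoom factor and cubic interpolation order are illustrative:

import numpy as np
from scipy import ndimage

def psnr(reference, test, peak=255.0):
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def zoom_and_score(img, factor=2):
    small = ndimage.zoom(img, 1.0 / factor, order=3)   # downsample
    big = ndimage.zoom(small, factor, order=3)         # cubic re-zoom
    h, w = min(img.shape[0], big.shape[0]), min(img.shape[1], big.shape[1])
    return psnr(img[:h, :w], big[:h, :w])              # guard size drift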

Keywords: Zooming, interpolation, medical images, resolution.

997 Elections, Checks and Balances, and Government Expenditures: Empirical Evidence for Japan, South Korea, and Taiwan

Authors: Yuan-Hong Ho, Chiung-Ju Huang

Abstract:

Previous studies on political budget cycles (PBCs) implicitly assume that the executive has full discretionary power over fiscal policy, neglecting the checks-and-balances role of the legislature. This paper goes beyond traditional PBC models and sheds light on the cases of Japan, South Korea, and Taiwan over the 1988-2007 period. Based on the results, we find no evidence of electoral impacts on public expenditures around South Korean and Taiwanese congressional elections, although PBCs are found in Taiwan's government expenditures during our sample period. Furthermore, the results show that Japan's legislature exercises significant checks and balances on government expenditures. By contrast, the legislative veto player in Taiwan neither reduces public expenditures nor moderates Taiwan's political budget cycles, the estimates being statistically insignificant. We suggest that the existence of PBCs in Taiwan is due to a weaker system of checks and balances: our conjecture is that Taiwan either has no effective legislative veto player or exhibited low compliance with the law during the period examined in our study.

Keywords: Checks and balances, compliance to the law, political budget cycles, veto player.

996 Monte Carlo and Biophysics Analysis in a Criminal Trial

Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano

Abstract:

In this paper, a real court case, heard in Italy at the Court of Nola, is considered, in which a correct physical description, conducted with both Monte Carlo and biophysical analyses, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him damages. The damages would have been caused by a piece of external plaster that allegedly detached from the neighbor's property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never produced the plaster that had hit him, nor was he able to tell from where it had arrived. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images from Mr. OP's security cameras show no movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor any people entering or leaving the house of Mr. DS in the same interval. Biophysical analysis shows that both the diagnosis on the medical certificate and the wound declared by the defendant, already in conflict with each other, are incompatible with the fall of external plaster pieces too small to be found. The wind was at level 1 of the Beaufort scale, that is, unable to raise even dust (level 4 of the Beaufort scale). The motion of the plaster pieces can therefore be described as projectile motion, and collisions with the building cornice can be treated using Newton's law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not have reached even the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (the images from Mr. OP's security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP's property.
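
A minimal Monte Carlo sketch of this reasoning, sampling uncertain detachment parameters and applying projectile motion with one restitution bounce; all parameter ranges are illustrative assumptions, not the trial data:

import numpy as np

G, N = 9.81, 100_000
rng = np.random.default_rng(0)
hits = 0
for _ in range(N):
    h = rng.uniform(4.0, 8.0)     # detachment height (m), assumed
    v0 = rng.uniform(0.0, 0.5)    # horizontal speed at detachment (m/s)
    e = rng.uniform(0.1, 0.4)     # coefficient of restitution, assumed
    t1 = np.sqrt(2 * h / G)       # first flight: free fall from height h
    x = v0 * t1
    vz = e * G * t1               # rebound speed, Newton's restitution law
    x += v0 * (2 * vz / G)        # horizontal travel during the bounce arc
    if x >= 1.30:                 # the distance claimed by Mr. DS
        hits += 1
print(f"fraction reaching 1.30 m: {hits / N:.4%}")

Under such low initial horizontal speeds the fragments land well short of 1.30 m, in line with the paper's conclusion.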

Keywords: Biophysical analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion.

995 Swarm Intelligence based Optimal Linear Phase FIR High Pass Filter Design using Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach

Authors: Sangeeta Mandal, Rajib Kar, Durbadal Mandal, Sakti Prasad Ghoshal

Abstract:

In this paper, an optimal design of a linear-phase digital high-pass finite impulse response (FIR) filter using Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach (PSO-CFIWA) is presented. In the design process, the filter length, pass band and stop band frequencies, and feasible pass band and stop band ripple sizes are specified. FIR filter design is a multi-modal optimization problem, and conventional gradient-based optimization techniques are not efficient for digital filter design. Given the filter specifications to be realized, the PSO-CFIWA algorithm generates a set of optimal filter coefficients that tries to meet the ideal frequency response characteristic. In this paper, optimal FIR high-pass filters of different orders have been designed. The simulation results have been compared with those obtained by well-accepted algorithms such as the Parks-McClellan algorithm (PM) and the genetic algorithm (GA). The results show that the proposed optimal filter design approach using PSO-CFIWA outperforms PM and GA, not only in the accuracy of the designed filter but also in convergence speed and solution quality.
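
A compressed sketch of such a design loop, combining the constriction factor with a decaying inertia weight as the PSO-CFIWA name suggests; the filter length, cut-off, swarm size, and iteration count are illustrative:

import numpy as np
from scipy.signal import freqz

N_TAPS, WC = 21, 0.5 * np.pi            # filter length, cut-off (rad/sample)
w = np.linspace(0, np.pi, 128)
ideal = (w >= WC).astype(float)         # ideal high-pass magnitude

def fitness(h):
    _, resp = freqz(h, worN=w)
    return np.sum((np.abs(resp) - ideal) ** 2)

c1 = c2 = 2.05
phi = c1 + c2
chi = 2.0 / abs(2.0 - phi - np.sqrt(phi ** 2 - 4.0 * phi))   # ~0.729

rng = np.random.default_rng(1)
pos = rng.uniform(-0.5, 0.5, (30, N_TAPS))                   # 30 particles
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for it in range(200):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    inertia = 0.9 - 0.5 * it / 200      # linearly decaying inertia weight
    vel = chi * (inertia * vel + c1 * r1 * (pbest - pos)
                 + c2 * r2 * (gbest - pos))
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()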

Keywords: FIR filter, PSO-CFIWA, PSO, Parks-McClellan algorithm, evolutionary optimization technique, magnitude response, convergence, high pass filter.

994 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a Likelihood Ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, for which a set of assumptions and methods is made for a given data set. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as not to obtain an incorrect estimate that would be used to deliver a wrong judgment in a court of law. The LR estimate is fundamentally a Bayesian concept, and two LR estimators are used in this paper, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months, to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Although LoR performed better than KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of handwriting evidence in forensics.
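
A hedged sketch of the two estimators named above: a similarity score is modelled under the prosecution (same writer) and defense (different writer) hypotheses, and the LR is either a ratio of kernel density estimates or derived from logistic-regression posterior odds. The synthetic scores are placeholders for real handwriting comparisons:

import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
same = rng.normal(0.8, 0.10, 300)   # scores under H_prosecution (synthetic)
diff = rng.normal(0.4, 0.15, 300)   # scores under H_defense (synthetic)

kde_same, kde_diff = gaussian_kde(same), gaussian_kde(diff)

def lr_kde(score):
    return float(kde_same(score) / kde_diff(score))

X = np.concatenate([same, diff]).reshape(-1, 1)
y = np.concatenate([np.ones(300), np.zeros(300)])
model = LogisticRegression().fit(X, y)

def lr_logistic(score):
    p = model.predict_proba([[score]])[0, 1]
    return p / (1 - p)   # posterior odds = LR here, since priors are equal

print(lr_kde(0.7), lr_logistic(0.7))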

Keywords: Logistic Regression (LoR), Kernel Density Estimator (KDE), handwriting, confidence interval, repeatability, reproducibility.

993 Sperm Identification Using Elliptic Model and Tail Detection

Authors: Vahid Reza Nafisi, Mohammad Hasan Moradi, Mohammad Hosain Nasr-Esfahani

Abstract:

The conventional assessment of human semen is highly subjective, with considerable intra- and inter-laboratory variability. Computer-Assisted Sperm Analysis (CASA) systems provide a rapid and automated assessment of sperm characteristics, together with improved standardization and quality control. However, the outcome of CASA systems is sensitive to the method of experimentation. While conventional CASA systems use digital microscopes with phase-contrast accessories to produce higher-contrast images, we have used raw semen samples (no staining materials) and a regular light microscope with a digital camera attached directly to its eyepiece, to ensure cost benefits and simple assembly of the system. Since accurately finding the sperms in the semen image is the first step in the examination and analysis of the semen, any error in this step can affect the outcome of the analysis. This article introduces and explains an algorithm for finding sperms in low-contrast images. First, an image enhancement algorithm is applied to remove extra particles from the image. Then, the foreground particles (including sperms and round cells) are segmented from the background. Finally, based on certain features and criteria, sperms are separated from the other cells.

Keywords: Computer-Assisted Sperm Analysis (CASA), Sperm identification, Tail detection, Elliptic shape model.

992 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes cycle-accurate simulation results for the weight values learned by an auto-encoder behavior model in pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, when handling such huge amounts of data, the computational complexity and run time of the learning methods have limited advanced research, largely because these algorithms were computed using only single-core CPUs. For this reason, parallel hardware, such as FPGAs, is seen as a possible solution to overcome these limitations. We adopted a ready-made auto-encoder and simulated it to design a behavior model in Verilog HDL before designing the hardware. With pre-route simulation of the auto-encoder behavior model, we obtained cycle-accurate results for the parameters of each hidden layer using ModelSim. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper demonstrates the correct operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
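
For orientation, a tiny numpy auto-encoder of the kind whose learned first-layer weights such a behaviour model would reproduce; the layer sizes, tied weights, and sigmoid activation are assumptions, not the paper's configuration:

import numpy as np

rng = np.random.default_rng(2)
n_in, n_hid = 64, 25                 # e.g. 8x8 image patches, 25 hidden units
W = rng.normal(0, 0.1, (n_hid, n_in))
b1, b2 = np.zeros(n_hid), np.zeros(n_in)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def train_step(x, lr=0.1):
    global W, b1, b2
    h = sig(W @ x + b1)              # encode
    y = sig(W.T @ h + b2)            # decode with tied weights
    d_y = (y - x) * y * (1 - y)      # squared-error gradient at the output
    d_h = (W @ d_y) * h * (1 - h)
    W -= lr * (np.outer(d_h, x) + np.outer(h, d_y))   # both weight paths
    b1 -= lr * d_h
    b2 -= lr * d_y
    return np.mean((y - x) ** 2)     # reconstruction error

After training on image patches, the rows of W are the first-layer representations one would visualize, as done above with the Kyoto dataset.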

Keywords: Auto-encoder, Behavior model simulation, Digital hardware design, Pre-route simulation, Unsupervised feature learning.

991 Forensic Science in Dr. Jekyll and Mr. Hyde: Trails of Utterson's Quest

Authors: Kyu-Jeoung Lee, Jae-Uk Choo

Abstract:

This paper investigates The Strange Case of Dr Jekyll and Mr Hyde from the point of view of Gabriel John Utterson, a central character in the book. Utterson is no different from a forensic investigator, as he tries to collect evidence on the mysterious Mr. Hyde's relationship to Dr. Jekyll. From Utterson's perspective, Jekyll is the 'victim' of a potential scandal and blackmail, and Hyde is the 'suspect' of a possible 'crime'. Utterson intends to figure out Hyde's identity, connect his motive with his actions, and gather witness accounts. During Utterson's quest, the outside materials available to him and the social backgrounds of Hyde and Jekyll are analyzed, and the archives left in Jekyll's chamber also play a part in providing evidence. Utterson investigates based on what he has known about Jekyll throughout his life and how Jekyll acted in his eyes until he was gone, seeking possible explanations for Jekyll's actions. The relationship between Jekyll and Hyde becomes the major question, as the social background offers clues pointing in the direction of illegitimacy and prostitution; there remains a possibility that Jekyll and Hyde were, in fact, completely different people. Utterson receives a full statement and confession from Jekyll himself at the end of the story, which gives the reader a possible truth about what happened. Stevenson's Dr. Jekyll and Mr. Hyde leads readers, as it did Utterson, to find the connection between Hyde and Jekyll using methods of history, culture, and science. Utterson's quest to uncover Hyde, including his attempt to see whether Hyde's inheritance was legal, could technically be considered forensic investigation.

Keywords: Dr. Jekyll and Mr. Hyde, forensic investigation, illegitimacy, prostitution, Robert Louis Stevenson.

990 The Impact of Size of the Regional Economic Blocs to the Country’s Flows of Trade: Evidence from COMESA, EAC and Tanzania

Authors: Mosses E. Lufuke, Lorna M. Kamau

Abstract:

This paper assesses whether the size of a regional economic bloc has an impact on the flow of trade to a particular country. Two blocs of different sizes (COMESA and EAC) and one country (Tanzania) are used as points of reference. Using the results of the analyses, the paper also seeks to establish whether it was rational for Tanzania to withdraw its membership from COMESA (the larger bloc) to join the EAC (the smaller one). A gravity model is used to estimate the relationship between the variables, employing the bilateral trade flows between Tanzania and the eighteen member countries of the two blocs (COMESA and EAC) between 2000 and 2013. A dummy variable for the regional bloc to which each of Tanzania's trade partner countries belongs is added to the model to identify which trade bloc exhibits higher trade flows with Tanzania. The findings show that over the study period (2000-2013) Tanzania recorded more than 257% higher trade volume in the EAC than in COMESA. In conclusion, the flow of trade is explained by many variables apart from the size of the regional bloc, and size by itself offers insufficient evidence of a causal relationship. The paper therefore remains neutral on the staggered switching decision, since more analyses would be required to establish the country's trade flows, especially had it held multiple memberships in both COMESA and EAC.
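
A hedged sketch of the log-linearised gravity equation with a bloc dummy, in the spirit of the design described above; the synthetic regressors are placeholders for the 2000-2013 bilateral flows:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 252                                    # 18 partners x 14 years
X = np.column_stack([
    rng.normal(24, 1, n),                  # log GDP, home country
    rng.normal(23, 1, n),                  # log GDP, partner
    np.log(rng.uniform(300, 3000, n)),     # log distance (km)
    rng.integers(0, 2, n),                 # bloc dummy: 1 = EAC, 0 = COMESA
])
beta = np.array([0.9, 0.8, -1.1, 1.3])     # assumed true coefficients
y = 1.0 + X @ beta + rng.normal(0, 0.5, n) # log bilateral trade flow

res = sm.OLS(y, sm.add_constant(X)).fit()
print(res.params)   # a positive bloc-dummy estimate = higher flows with EAC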

Keywords: Economic Bloc, Flow of Trade, Size of Bloc, Switching.

989 Saving Lives: Alternative Approaches to Reducing Gun Violence

Authors: Angie M. Wolf, Angie Del Prado Lippman, DeVone Boggan, Caroline Glesmann, Estivaliz Castro

Abstract:

This paper highlights an innovative and nontraditional violence prevention program that is making a noticeable impact in what was once one of the country’s most violent communities. With unique and tailored strategies, the Operation Peacemaker Fellowship, established in Richmond, California, combines components of evidence-based practices with a community-oriented focus on relationships and mentoring to fill a gap in services and increase community safety. In an effort to highlight these unique strategies and provide a blueprint for other communities with violent crime problems, the authors of this paper hope to clearly delineate how one community is moving forward with vanguard approaches to invest in the lives of young men who once were labeled their community’s most violent, even most deadly, youth. The impact of this program is evidenced through the fellows’ own voices as they illuminate the experience of being in the Fellowship. In interviews, fellows describe how participating in this program has transformed their lives and the lives of those they love. The authors of this article spent more than two years researching this Fellowship program in order to conduct an evaluation of it and, ultimately, to demonstrate how this program is a testament to the power of relationships and love combined with evidence-based practices, consequently enriching the lives of youth and the community that embraces them.

Keywords: Community violence, firearm violence, interventions for violent crime, violence prevention.

988 Educational use of Interactive Multimedia based on Museum Collection

Authors: Ji-Hye Lee, Jongdeok Kim

Abstract:

This research investigates the use of digital technology, namely interactive multimedia, in effective museum-based art education. Several multimedia experiences created for art education are examined as case studies of how interactive multimedia assists audiences' learning, within the context of existing theory in the field.

Keywords: E-learning, Fine Arts, Interactivity, Multimedia

987 Effect of Peak-to-Average Power Ratio Reduction on the Multicarrier Communication System Performance Parameters

Authors: Sanjay Singh, M Sathish Kumar, H. S Mruthyunjaya

Abstract:

Multicarrier transmission systems such as Orthogonal Frequency Division Multiplexing (OFDM) are a promising technique for high-bit-rate transmission in wireless communication systems. OFDM is a spectrally efficient modulation technique that can achieve high-speed data transmission over multipath fading channels without the need for powerful equalization techniques. However, the price paid for this high spectral efficiency and less intensive equalization is low power efficiency: OFDM signals are very sensitive to nonlinear effects due to their high Peak-to-Average Power Ratio (PAPR), which leads to power inefficiency in the RF section of the transmitter. This paper investigates the effect of PAPR reduction on the performance parameters of a multicarrier communication system, namely the power consumption of the Power Amplifier (PA) and the Digital-to-Analog Converter (DAC), the power amplifier efficiency, the SNR of the DAC, and the BER performance of the system. Our analysis shows that, irrespective of the PAPR reduction technique employed, the power consumption of the PA and DAC decreases and the power amplifier efficiency increases as PAPR is reduced. Moreover, for a given BER performance, the required Input Back-Off (IBO) decreases with decreasing PAPR.
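
A minimal numerical sketch of the quantity at stake: generate one OFDM symbol, measure its PAPR, then clip it (a simple stand-in for a PAPR-reduction technique) and measure again. The subcarrier count and clipping ratio are illustrative:

import numpy as np

rng = np.random.default_rng(5)
N = 256                                        # subcarriers
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
x = np.fft.ifft(qpsk) * np.sqrt(N)             # time-domain OFDM symbol

def papr_db(sig):
    p = np.abs(sig) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

clip = 1.4 * np.sqrt(np.mean(np.abs(x) ** 2))  # clip at 1.4x RMS amplitude
x_clip = np.where(np.abs(x) > clip, clip * x / np.abs(x), x)
print(papr_db(x), papr_db(x_clip))             # clipping lowers the PAPR

A lower PAPR lets the amplifier operate closer to saturation, which is why PA power consumption falls and efficiency rises in the analysis above.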

Keywords: BER, Crest Factor (CF), Digital-to-Analog Converter (DAC), Input Back-Off (IBO), Orthogonal Frequency Division Multiplexing (OFDM), Peak-to-Average Power Ratio (PAPR), power amplifier efficiency, SNR.

986 A Methodology to Virtualize Technical Engineering Laboratories: MastrLAB-VR

Authors: Ivana Scidà, Francesco Alotto, Anna Osello

Abstract:

Given the importance attached to innovation today, the education sector is evolving thanks to digital technologies. Virtual Reality (VR) is a potential teaching tool offering many advantages in the field of training and education, as it allows theoretical knowledge and practical skills to be acquired through an immersive experience in less time than the traditional educational process. These assumptions lay the foundations for a new educational environment that is engaging and stimulating for students. Starting from the objective of strengthening the innovative teaching offer and the learning processes, the case study of this research concerns the digitalization of MastrLAB, a High Quality Laboratory (HQL) belonging to the Department of Structural, Building and Geotechnical Engineering (DISEG) of the Polytechnic of Turin, a center specialized in experimental mechanical tests on traditional and innovative building materials and on the structures made with them. MastrLAB-VR has been developed, an innovative training tool designed with the aim of educating students, in total safety, in the techniques of using the machinery, thus reducing the dangers arising from potentially hazardous activities. The virtual laboratory, dedicated to the students of the Building and Civil Engineering Courses of the Polytechnic of Turin, has been designed to simulate realistically the experimental approach to the structural tests foreseen in their courses of study: from tensile tests to relaxation tests, from steel qualification tests to resilience tests on elements at ambient or characterizing temperatures. The research proposes a methodology for the virtualization of technical laboratories through the application of Building Information Modelling (BIM), starting from the creation of a digital model. The process includes the creation of a standalone application which, with Oculus Rift technology, allows the user to explore the environment and interact with objects through the use of joypads. The application was tested as a prototype on volunteers, whose acquisition of the educational notions presented in the experience was measured through a virtual multiple-choice quiz, yielding an overall evaluation report. The results show that MastrLAB-VR is suitable for both beginners and experts and will be adopted experimentally in other laboratories of the University departments.

Keywords: Building Information Modelling, digital learning, education, virtual laboratory, virtual reality.

985 Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes

Authors: Muhammad Sajjad, Naveed Khattak, Noman Jafri

Abstract:

The world has entered the 21st century: the technology of computer graphics and digital cameras is prevalent, and high-resolution displays and printers are available. High-resolution images are therefore needed in order to produce high-quality displayed images and high-quality prints. However, since high-resolution images are not usually available, there is a need to magnify the original images. One common difficulty in previous magnification techniques is preserving details, i.e., edges, while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution to this is still an open issue. In this paper, an image magnification method using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed, which takes into account information about edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region into geometrical shapes, and then assigns suitable values to the undefined pixels inside the interpolation region, while preserving the sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms Nearest Neighbour (NN), bilinear (BL), and bicubic (BC) interpolation; the quantitative results are competitive and consistent with NN, BL, BC, and the others.

Keywords: Adaptive, digital image processing, image magnification, interpolation, geometrical shapes, qualitative & quantitative analysis.

984 Adapting Cities Name with ICT and Countries Interested in the Smart City

Authors: Qasim Hamakhurshid Hamamurad, Normal Mat Jusoh, Uznir Ujang

Abstract:

The concept of a city with an infrastructure of Information and Communication Technology (ICT) embraces several definitions depending on the meanings of the word "smart": intelligent city, smart city, knowledge city, ubiquitous city, sustainable city, and digital city. Many definitions of the city exist, but this study explores which one has been universally acknowledged. From the literature analysis, it emerges that the term smart city is the one most used in the articles to denote the smartness of a city. This paper explores research from the seven main digital databases and journals focusing on the smart city from January 2015 to February 2020 in order to: (a) examine, over time, the causes of the smart city phenomenon and related concepts in the literature of the last five years; (b) review the terminology, to see how and where the smart city specification and the relations among the different definitions are implemented; (c) investigate geographically where the greatest concentrations of smart cities are in the world and determine whether Malaysians are engaging with the smart city; and (d) count how many papers about smart cities were published in Malaysia from 2015 to 2020. Three steps were followed to accomplish the aim of this study: (1) a systematic literature review search strategy to gather a representative subset of papers on the smart city and other definitions, utilizing Google Scholar, Elsevier, Scopus, ScienceDirect, IEEE Xplore, Web of Science, and Springer between January 2015 and February 2020; (2) the formation of a bibliometric map based on the bibliometric evaluation, using the VOSviewer mapping technique to visualize differences; and (3) the VOSviewer application program to build the initial clusters. The bibliometric analytical findings focused on the harmonization of terms.

Keywords: Bibliometric research, smart city, ICT, VOSviewer, urban modernization.

983 Reconsidering the Palaeo-Environmental Reconstruction of the Wet Zone of Sri Lanka: A Zooarchaeological Perspective

Authors: Kalangi Rodrigo, Kelum Manamendra-Arachchi

Abstract:

Bones, teeth, and shells have been acknowledged over the last two centuries as evidence of chronology, palaeo-environment, and human activity. Faunal traces are valid evidence of past situations because they have properties that have not changed over long periods. Sri Lanka is known as an island with a diverse variety of prehistoric occupation across ecological zones. Defining the palaeoecology of past societies is an archaeological approach developed in the 1960s; it is mainly concerned with the reconstruction, from available geological and biological evidence, of past biota, populations, communities, landscapes, environments, and ecosystems. This early and persistent florescence of human fossils, technology, and culture, together with a collection of well-preserved tropical-forest rock shelters with associated 'on-site' palaeoenvironmental records, makes Sri Lanka a central and unusual case study for determining the extent and strength of early human tropical forest encounters. Excavations carried out in prehistoric caves in the low-country wet zone have shown that over the last 50,000 years the temperature in the lowland rainforests has not changed by more than 5 degrees. Based on Semnopithecus priam (gray langur) remains unearthed from wet-zone prehistoric caves, periods of momentous climate change have been argued for during the Last Glacial Maximum (LGM) and at the Terminal Pleistocene/Early Holocene boundary, with a recognizable preference for semi-open 'intermediate' rainforest or forest edges. The continuous occupation of the genera Acavus and Oligospira, along with the uninterrupted, pervasive horizontal presence of Canarium sp. ('kekuna' nut), proves that temperatures in the lowland rain forests have not changed by more than 5 °C over the last 50,000 years. Site catchment or territorial analysis is no longer defensible on time-distance grounds, and optimal foraging theory fails because prehistoric people were aware of the decreasing cost-benefit ratio when locating sites, and generally played out a settlement strategy that minimized the ratio of energy expended to energy produced.

Keywords: Palaeo-environment, palaeo-ecology, palaeo-climate, prehistory, zooarchaeology.

982 Constitutive Equations for Human Saphenous Vein Coronary Artery Bypass Graft

Authors: Hynek Chlup, Lukas Horny, Rudolf Zitny, Svatava Konvickova, Tomas Adamek

Abstract:

Coronary artery bypass grafts (CABG) are widely studied with respect to the hemodynamic conditions which play an important role in the presence of restenosis. However, papers concerned with the constitutive modeling of CABG are lacking in the literature. The purpose of this study is to find a constitutive model for CABG tissue. A sample of CABG obtained during an autopsy underwent an inflation-extension test. Displacements were recorded by CCD cameras and subsequently evaluated by digital image correlation. Pressure-radius and axial force-elongation data were used to fit the material model. The tissue was modeled as a one-layered composite reinforced by two families of helical fibers. The material is assumed to be locally orthotropic, nonlinear, incompressible, and hyperelastic. Material parameters are estimated for two strain energy functions (SEFs): the first is the classical exponential; the second is logarithmic, which allows interpretation by means of limiting (finite) strain extensibility. The material parameters presented are estimated by optimization based on the radial and axial equilibrium equations for a thick-walled tube. Both material models fit the experimental data successfully, with the exponential model fitting the relationship between axial force and axial strain significantly better than the logarithmic one.
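
The abstract does not state the exact SEFs; for orientation, a standard Fung-type exponential form and the Gent logarithmic form with limiting extensibility are representative of the two families named, written in LaTeX notation:

% Assumed representative forms, not necessarily the authors' exact models.
W_{\exp} = \frac{c}{2}\left(e^{Q}-1\right), \qquad
Q = b_1 E_{\theta\theta}^2 + b_2 E_{zz}^2 + 2 b_3 E_{\theta\theta} E_{zz}

W_{\log} = -\frac{\mu J_m}{2}\,\ln\!\left(1-\frac{I_1-3}{J_m}\right)

Here E_{\theta\theta} and E_{zz} are the circumferential and axial Green-Lagrange strains, I_1 is the first strain invariant, and J_m caps I_1 - 3, encoding limiting (finite) strain extensibility; c, b_1 to b_3, and \mu are material parameters fitted to the inflation-extension data.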

Keywords: Constitutive model, coronary artery bypass graft, digital image correlation, fiber reinforced composite, inflation test, saphenous vein.

981 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet

Authors: Amir Moslemi, Amir Movafeghi, Shahab Moradi

Abstract:

One of the most important challenges in medical imaging is noise. Image denoising refers to the improvement of a digital medical image that has been corrupted by noise, such as Additive White Gaussian Noise (AWGN). A digital medical image or video can be affected by several types of noise: impulse noise, Poisson noise, and AWGN. Computed tomography (CT) images are subject to low quality due to noise. The quality of CT images depends directly on the dose absorbed by the patient, in the sense that increasing the radiation, and consequently the absorbed dose to the patient (ADP), enhances CT image quality. Accordingly, noise reduction techniques that enhance image quality without exposing patients to excess radiation are one of the challenging problems in CT image processing. In this work, noise reduction in CT images was performed using two different directional two-dimensional (2D) transformations, Curvelet and Contourlet, and the Discrete Wavelet Transform (DWT) thresholding methods BayesShrink and AdaptShrink, compared against each other. We also propose a new threshold in the wavelet domain for both noise reduction and edge retention, so that the proposed method retains the significant modified coefficients and yields good visual quality. Evaluations were carried out using two criteria: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
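
As a hedged reference point, the BayesShrink baseline that the proposed threshold is compared against can be sketched with PyWavelets; the wavelet choice and decomposition level are illustrative:

import numpy as np
import pywt

def bayes_shrink(img, wavelet="db8", level=3):
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    # Robust noise estimate from the finest diagonal subband.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]
    for cH, cV, cD in coeffs[1:]:
        bands = []
        for band in (cH, cV, cD):
            sigma_x = np.sqrt(max(band.var() - sigma ** 2, 1e-12))
            t = sigma ** 2 / sigma_x          # BayesShrink threshold
            bands.append(pywt.threshold(band, t, mode="soft"))
        out.append(tuple(bands))
    return pywt.waverec2(out, wavelet)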

Keywords: Computed Tomography (CT), noise reduction, curvelet, contourlet, Peak Signal-to-Noise Ratio (PSNR), Structural Similarity (SSIM), Absorbed Dose to Patient (ADP).

980 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, and spatial analysis. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data comprise ground points (which represent the bare earth) and non-ground points (which represent buildings, trees, cars, etc.); removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data, and the presented filter utilizes a weight function to allocate weights to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples, with an average root mean square error (RMSE) of 0.35 m.
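
A hedged sketch of the iterative spline-plus-weights idea: fit a smoothing surface, down-weight returns that sit well above it (likely canopy), and refit so the surface settles onto the terrain. The residual cut-off and weight values are assumptions, not the paper's weight function:

import numpy as np
from scipy.interpolate import SmoothBivariateSpline

def ground_mask(x, y, z, iterations=5, cutoff=0.5):
    w = np.ones_like(z)
    for _ in range(iterations):
        spline = SmoothBivariateSpline(x, y, z, w=w)
        resid = z - spline.ev(x, y)
        # Points far above the surface are treated as canopy hits.
        w = np.where(resid > cutoff, 0.01, 1.0)
    return resid <= cutoff            # True for likely ground returns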

Keywords: Airborne laser scanning, digital terrain models, filtering, forested areas.

979 Values as a Predictor of Cyber-bullying Among Secondary School Students

Authors: Bülent Dilmaç, Didem Aydoğan

Abstract:

The use of new technologies such as the internet (e-mail, chat rooms) and cell phones has increased steeply in recent years, and the use of technological tools and equipment is especially widespread among children and young people. Although many teachers and administrators now recognize the problem of school bullying, few are aware that students are being harassed through electronic communication. Referred to as electronic bullying, cyber bullying, or online social cruelty, this phenomenon includes bullying through e-mail, instant messaging, chat rooms, websites, or digital messages or images sent to a cell phone. Cyber bullying is defined as causing deliberate/intentional harm to others using the internet or other digital technologies. This study has a quantitative research design and uses a relational survey as its method. The participants consisted of 300 secondary school students in the city of Konya, Turkey: 195 (64.8%) were female and 105 (35.2%) were male, with 39 (13%) students in grade 1, 187 (62.1%) in grade 2, and 74 (24.6%) in grade 3. The "Cyber Bullying Question List" developed by Arıcak (2009) was given to the students; following questions about demographics, a functional definition of cyber bullying was provided. In order to assess the students' human values, the "Human Values Scale (HVS)" developed by Dilmaç (2007) for secondary school students was administered; the scale consists of 42 items in six dimensions. Data analysis was conducted by the primary investigator using SPSS 14.00 statistical analysis software. Descriptive statistics were calculated for the analysis of students' cyber bullying behaviour, and simple regression analysis was conducted to test whether each value in the scale could explain cyber bullying behaviour.

Keywords: Cyber bullying, values, secondary school students.

978 Identification of Seat Belt Wearing Compliance Associate Factors in Malaysia: Evidence-based Approach

Authors: L. Fauziana, M. F. Siti Atiqah, Z. A. Ahmad Noor Syukri

Abstract:

The aim of the study was to identify factors associated with seat belt wearing among road users in Malaysia. An evidence-based approach through in-depth crash investigation was utilized to meet this objective, scoped to crashes involving passenger vehicles investigated by the Malaysian Institute of Road Safety Research (MIROS) between 2007 and 2010. Crash information for a total of 99 crash cases involving 240 vehicles and 864 occupants was obtained during the study period. Statistical tests and logistic regression analysis were performed. The results of the analysis revealed that gender, seat position, and age were associated with seat belt wearing compliance in Malaysia. Males are 97.6% more likely to wear a seat belt compared to females (95% CI 1.317 to 2.964). By seat position, front occupants were 82 times more likely to be wearing a seat belt (95% CI 30.199 to 225.342) than rear occupants. It is also important to note that the odds of seat belt wearing increased by about 2.64% (95% CI 1.0176 to 1.0353) for every one-year increase in age. This study is essential to understanding the Malaysian tendency to belt up while occupying a vehicle. The factors highlighted in this study should be emphasized in road safety education in order to increase the seat belt wearing rate in this country and, ultimately, to prevent deaths due to road crashes.
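
A hedged sketch of the reported model: logistic regression of belt use on gender, seat position, and age, whose exponentiated coefficients are the odds ratios quoted above. The synthetic occupant table is a placeholder for the 864-occupant MIROS data:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 864
X = np.column_stack([
    rng.integers(0, 2, n),            # 1 = male
    rng.integers(0, 2, n),            # 1 = front seat
    rng.uniform(5, 70, n),            # age in years
])
# Assumed coefficients roughly matching the reported odds ratios.
logit_p = -1.0 + 0.68 * X[:, 0] + 4.4 * X[:, 1] + 0.026 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(np.exp(res.params))             # odds ratios: gender, seat, per year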

Keywords: crash investigation, risk compensation, road safety, seat belt wearing, statistical analysis.

977 Spatial Query Localization Method in Limited Reference Point Environment

Authors: Victor Krebss

Abstract:

Object localization is one of the major challenges in creating intelligent transportation. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces a large error or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of wireless ad-hoc networks. Such a network allows the potential distances between objects to be estimated by measuring received signal levels, and a distance graph to be constructed in which the nodes are the objects to be localized and the edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the location of all (or part) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite an abundance of well-known algorithms for solving the localization problem and significant research efforts, many issues are currently addressed only partially. In this paper, we propose a localization approach based on mapping graph distances onto digital road map data. In effect, the problem is reduced to embedding the distance graph into the graph representing the area's geo-location data, which makes it possible to localize objects in some cases even if only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, effectively employing spatial indexing, optimized spatial search routines, and geometry functions.
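
A minimal sketch of the embedding step under stated assumptions: pairwise distance estimates plus one anchor are fed to nonlinear least squares; the map-matching step that would resolve the remaining rotation/reflection ambiguity is omitted, and the toy distances are placeholders for RSSI-derived estimates:

import numpy as np
from scipy.optimize import least_squares

anchor = np.array([0.0, 0.0])                      # node 0, known coordinates
dist = {(0, 1): 10.0, (0, 2): 14.1, (1, 2): 10.0}  # estimated edge lengths

def residuals(flat):
    pos = {0: anchor, 1: flat[0:2], 2: flat[2:4]}
    return [np.linalg.norm(pos[i] - pos[j]) - d
            for (i, j), d in dist.items()]

sol = least_squares(residuals, x0=np.ones(4))
print(sol.x.reshape(2, 2))    # nodes 1 and 2, up to rotation/reflection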

Keywords: Intelligent Transportation System, Sensor Network, Localization, Spatial Query, GIS, Graph Embedding.
