Search results for: radiation processing
2282 Design and Implementation of an Effective Machine Learning Approach to Crime Prediction and Prevention
Authors: Ashish Kumar, Kaptan Singh, Amit Saxena
Abstract:
Today, crime is believed to have the greatest impact on a person's financial and personal progress. Identifying places that individuals should avoid is crucial for preventing crime and is one of the key considerations. As society and technology have advanced significantly, so have crimes and the harm they cause. Prevention is even harder where people are concentrated in one place and changes happen quickly. For this reason, many crime prevention strategies have been embraced as a component of smart-city development in numerous cities. However, crimes can occur anywhere; all that is required is to identify the pattern of their occurrence, which helps to lower the crime rate. In this paper, a crime-related analysis has been carried out; information on crimes has been collected from all over India and can be accessed from anywhere. The purpose of this paper is to investigate the relationship between several factors and India's crime rate. The review covers data for every state of India and its associated regions for the period 2001-2014. Various classes of violations, however, show slightly different trends over the years.
Keywords: K-nearest neighbor, random forest, decision tree, pre-processing
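As a minimal illustration of the k-nearest-neighbor approach named in the keywords, the sketch below classifies a district as high- or low-risk from a toy feature table (the feature names, values, and labels are invented for illustration, not taken from the paper's data):

```python
import numpy as np

# Hypothetical per-district features: [population density (k/km^2),
# unemployment rate (%), police stations per 100k] -> risk label (0/1).
X_train = np.array([[8.2, 11.0, 1.2], [2.1, 4.0, 3.5],
                    [7.5, 9.5, 1.0], [1.8, 3.2, 4.0],
                    [6.9, 10.1, 1.5], [2.5, 5.0, 3.0]])
y_train = np.array([1, 0, 1, 0, 1, 0])   # 1 = high crime risk

def knn_predict(x, X, y, k=3):
    """Classify x by majority vote among its k nearest neighbours."""
    d = np.linalg.norm(X - x, axis=1)    # Euclidean distance to each sample
    nearest = np.argsort(d)[:k]          # indices of the k closest samples
    return int(np.argmax(np.bincount(y[nearest])))

print(knn_predict(np.array([7.0, 10.0, 1.3]), X_train, y_train))  # -> 1
```

The same training table could feed a decision tree or random forest; KNN is shown only because it needs no external library.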
Procedia PDF Downloads 95
2281 High-Resolution Facial Electromyography in Freely Behaving Humans
Authors: Lilah Inzelberg, David Rand, Stanislav Steinberg, Moshe David Pur, Yael Hanein
Abstract:
Human facial expressions carry important psychological and neurological information. Facial expressions involve the co-activation of diverse muscles. They depend strongly on personal affective interpretation and on social context, and vary between spontaneous and voluntary activations. Smiling, as a special case, is among the most complex facial emotional expressions, involving no fewer than 7 different unilateral muscles. Despite their ubiquitous nature, smiles remain an elusive and debated topic. Smiles are associated with happiness and greeting on one hand and anger or disgust-masking on the other. Accordingly, while high-resolution recording of muscle activation patterns in a non-interfering setting offers exciting opportunities, it remains an unmet challenge, as contemporary surface facial electromyography (EMG) methodologies are cumbersome, restricted to laboratory settings, and limited in time and resolution. Here we present a wearable and non-invasive method for the objective mapping of facial muscle activation and demonstrate its application in a natural setting. The technology is based on a recently developed dry and soft electrode array, specially designed for the surface facial EMG technique. Eighteen healthy volunteers (31.58 ± 3.41 years, 13 females) participated in the study. Surface EMG arrays were adhered to the participants' left and right cheeks. Participants were instructed to imitate three facial expressions: closing the eyes, wrinkling the nose, and smiling voluntarily, and then to watch a funny video while their EMG signals were recorded. We focused on muscles associated with 'enjoyment', 'social' and 'masked' smiles; three categories with distinct social meanings. We developed a customized independent component analysis algorithm to construct the desired facial musculature mapping. First, identification of the Orbicularis oculi and the Levator labii superioris muscles was demonstrated from voluntary expressions.
Second, recordings of voluntary and spontaneous smiles were used to locate the Zygomaticus major muscle activated in Duchenne and non-Duchenne smiles. Finally, recording with a wireless device in an unmodified natural work setting revealed expressions of neutral, positive and negative emotions in face-to-face interaction. The algorithm outlined here identifies the activation sources in a subject-specific manner, insensitive to electrode placement and anatomical diversity. Our high-resolution, cross-talk-free mapping performance, along with excellent user convenience, opens new opportunities for affective processing and objective evaluation of facial expressivity, objective psychological and neurological assessment, as well as gaming, virtual reality, bio-feedback and brain-machine interface applications.
Keywords: affective expressions, affective processing, facial EMG, high-resolution electromyography, independent component analysis, wireless electrodes
Procedia PDF Downloads 248
2280 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images
Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar
Abstract:
Diabetic Retinopathy (DR) is a severe retinal disease caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages, and hard exudates. In this paper, an automatic algorithm for the detection of DR has been proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter covering non-proliferative and proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method detects DR effectively: the sensitivity, specificity, and accuracy of this approach are 90%, 87.5%, and 91.4%, respectively.
Keywords: diabetic retinopathy, fundus images, STARE, Gabor filter, support vector machine
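The pipeline above combines several stages; the sketch below illustrates only the contrast-enhancement and bright-lesion thresholding idea, with a plain global histogram equalization standing in for CLAHE (the toy patch, threshold, and all simplifications are assumptions, not the paper's implementation):

```python
import numpy as np

def equalize(img):
    """Global histogram equalization (a crude stand-in for CLAHE)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]            # first non-zero cdf entry
    lut = np.clip((cdf - cdf_min) / max(img.size - cdf_min, 1) * 255, 0, 255)
    return lut[img].astype(np.uint8)

def bright_lesion_mask(img, thresh=200):
    """Flag unusually bright pixels, e.g. candidate hard exudates."""
    return equalize(img) >= thresh

patch = np.full((8, 8), 40, dtype=np.uint8)   # dark toy 'fundus' patch
patch[3:5, 3:5] = 230                         # one bright spot
mask = bright_lesion_mask(patch)
print(int(mask.sum()))                        # -> 4
```

A real implementation would operate tile-wise with clipping (true CLAHE) and feed extracted features into the SVM rather than thresholding directly.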
Procedia PDF Downloads 294
2279 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements
Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath
Abstract:
While human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimal vocabulary, with its pronunciation variations stored at the front of their memory for ready reference, while machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries take large amounts of manual effort, cost, and time, and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. Performance of the system is measured using an adaptation model, and the precision is found to be better than 86 percent.
Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing
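The abstract does not give the details of Dynamic Phone Warping, but a dynamic-programming alignment over phone sequences with feature-based substitution costs, in the spirit described, might be sketched like this (the phone feature table and gap cost are invented for illustration):

```python
# Toy articulatory feature vectors per phone (invented values).
FEATURES = {
    "t": (0.0, 0.0), "d": (0.1, 0.0),    # a close consonant pair
    "ah": (0.9, 0.8), "ey": (0.8, 0.9),  # a close vowel pair
}

def phone_dist(a, b):
    """Euclidean distance between two phones in feature space."""
    fa, fb = FEATURES[a], FEATURES[b]
    return sum((x - y) ** 2 for x, y in zip(fa, fb)) ** 0.5

def dpw(seq1, seq2, gap=1.0):
    """Dynamic-programming alignment cost between two phone sequences,
    analogous to edit distance with feature-based substitution costs."""
    n, m = len(seq1), len(seq2)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * gap
    for j in range(1, m + 1):
        D[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = D[i - 1][j - 1] + phone_dist(seq1[i - 1], seq2[j - 1])
            D[i][j] = min(sub, D[i - 1][j] + gap, D[i][j - 1] + gap)
    return D[n][m]

# Two pronunciations of 'data' differ only in the first vowel.
print(round(dpw(["d", "ey", "t", "ah"], ["d", "ah", "t", "ah"]), 3))  # -> 0.141
```

A low alignment cost between a heard pronunciation and a dictionary entry would signal a variant of the same word rather than a new word.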
Procedia PDF Downloads 177
2278 Nonparametric Copula Approximations
Authors: Serge Provost, Yishan Zang
Abstract:
Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by making use of a variety of techniques, we address the problem of securing the distribution of the copula. This will be done by using several approaches. For example, we will obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations will be presented. The proposed methodologies will also be applied to a sample generated from a known copula distribution in order to validate their effectiveness.
Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation
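As a small illustration of working with empirical copulas, the sketch below evaluates the empirical copula of a simulated positively dependent pair at (0.5, 0.5), where the independence copula would give 0.25 (the data-generating model is an assumption for the demo, not one of the paper's examples):

```python
import numpy as np

def empirical_copula(u, v, x, y):
    """Empirical copula C_n(u, v): fraction of observations whose
    normalized ranks in both coordinates fall at or below (u, v)."""
    n = len(x)
    rx = np.argsort(np.argsort(x)) + 1   # ranks 1..n of x
    ry = np.argsort(np.argsort(y)) + 1   # ranks 1..n of y
    return float(np.mean((rx / n <= u) & (ry / n <= v)))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.8 * x + 0.6 * rng.normal(size=2000)   # positively dependent pair
# Under independence C(0.5, 0.5) = 0.25; positive dependence pushes it up.
print(empirical_copula(0.5, 0.5, x, y))
```

The approximation methods in the abstract (least-squares, kernel, Bernstein) would then smooth this step-function estimate.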
Procedia PDF Downloads 75
2277 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission
Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong
Abstract:
Medical images are an integral part of e-healthcare and e-diagnosis systems. Medical image watermarking is widely used to protect patients' information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address the issue, this paper incorporates an error correction code (ECC), the (8, 4) Hamming code, into an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and support real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation, while maintaining 100% ECC efficiency.
Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU
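A CPU-side sketch of the (8, 4) extended Hamming code (a (7, 4) Hamming code plus an overall parity bit) may clarify the ECC stage; this is the generic textbook construction, not the paper's GPU implementation:

```python
def encode84(d):
    """Encode 4 data bits into an extended (8,4) Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code = [p1, p2, d1, p3, d2, d3, d4]
    code.append(sum(code) % 2)          # overall parity bit for (8,4)
    return code

def decode84(c):
    """Correct any single-bit error and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3          # 1-based error position, 0 = none
    if pos:
        c[pos - 1] ^= 1                 # flip the erroneous bit
    return [c[2], c[4], c[5], c[6]]

word = encode84([1, 0, 1, 1])
word[5] ^= 1                            # inject a single-bit channel error
print(decode84(word))                   # -> [1, 0, 1, 1]
```

On a GPU, many such codewords would be decoded in parallel, one per thread, which is what makes the scheme attractive for real-time image streams.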
Procedia PDF Downloads 292
2276 Feasibility of Voluntary Deep Inspiration Breath-Hold Radiotherapy Technique Implementation without Deep Inspiration Breath-Hold-Assisting Device
Authors: Auwal Abubakar, Shazril Imran Shaukat, Noor Khairiah A. Karim, Mohammed Zakir Kassim, Gokula Kumar Appalanaido, Hafiz Mohd Zin
Abstract:
Background: Voluntary deep inspiration breath-hold radiotherapy (vDIBH-RT) is an effective cardiac dose reduction technique during left breast radiotherapy. This study aimed to assess the accuracy of the implementation of the vDIBH technique among left breast cancer patients without the use of a special device such as a surface-guided imaging system. Methods: The vDIBH-RT technique was implemented among thirteen (13) left breast cancer patients at the Advanced Medical and Dental Institute (AMDI), Universiti Sains Malaysia. Breath-hold monitoring was performed based on breath-hold skin marks and laser light congruence observed on zoomed CCTV images from the control console during each delivery. The initial setup was verified using cone beam computed tomography (CBCT) during breath-hold. Each field was delivered using multiple beam segments to keep the delivery time to 20 seconds, which patients can tolerate in breath-hold. The data were analysed using an in-house developed MATLAB algorithm. The planning target volume (PTV) margin was computed based on van Herk's margin recipe. Results: The setup error analysed from CBCT showed that the population systematic error in the lateral (x), longitudinal (y), and vertical (z) axes was 2.28 mm, 3.35 mm, and 3.10 mm, respectively. Based on the CBCT image guidance, the PTV margin that would be required for vDIBH-RT using the CCTV/laser monitoring technique is 7.77 mm, 10.85 mm, and 10.93 mm in the x, y, and z axes, respectively. Conclusion: It is feasible to safely implement vDIBH-RT among left breast cancer patients without special equipment. The breath-hold monitoring technique is cost-effective, radiation-free, easy to implement, and allows real-time breath-hold monitoring.
Keywords: vDIBH, cone beam computed tomography, radiotherapy, left breast cancer
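Van Herk's margin recipe, as commonly stated, is margin = 2.5 Σ + 0.7 σ, with Σ the systematic and σ the random setup error. A minimal sketch, using the abstract's lateral systematic error together with an illustrative random error value (assumed; the random errors are not quoted above):

```python
def van_herk_margin(big_sigma, small_sigma):
    """PTV margin (mm) from van Herk's recipe: 2.5*Sigma + 0.7*sigma,
    with Sigma the systematic and sigma the random setup error (mm)."""
    return 2.5 * big_sigma + 0.7 * small_sigma

# Lateral systematic error from the abstract (2.28 mm) combined with an
# illustrative random error of 2.96 mm (an assumed value, not reported above).
print(round(van_herk_margin(2.28, 2.96), 2))  # -> 7.77
```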
Procedia PDF Downloads 58
2275 A Robust Digital Image Watermarking Against Geometrical Attack Based on Hybrid Scheme
Authors: M. Samadzadeh Mahabadi, J. Shanbehzadeh
Abstract:
This paper presents a hybrid digital image watermarking scheme that is robust against a variety of attacks and geometric distortions. The image content is represented by important feature points obtained by an image-texture-based adaptive Harris corner detector. These feature points are extracted from the LL2 subband of the 2-D discrete wavelet transform using the Harris-Laplacian detector. We calculate the Fourier transform of circular regions around these points; the amplitude of this transform is rotation invariant. The experimental results demonstrate the robustness of the proposed method against geometric distortions and various common image processing operations such as JPEG compression, colour reduction, Gaussian filtering, median filtering, and rotation.
Keywords: digital watermarking, geometric distortions, geometrical attack, Harris Laplace, important feature points, rotation, scale invariant feature
Procedia PDF Downloads 502
2274 Accumulation of Heavy Metals in Safflower (Carthamus tinctorius L.)
Authors: Violina R. Angelova, Mariana N. Perifanova-Nemska, Galina P. Uzunova, Elitsa N. Kolentsova
Abstract:
Comparative research has been conducted to determine the accumulation of heavy metals (Pb, Zn, and Cd) in the vegetative and reproductive organs of safflower, and to assess the possibility of growing it on soils contaminated by heavy metals and its efficacy for phytoremediation. The experiment was performed on an agricultural field contaminated by the Non-Ferrous-Metal Works (MFMW) near Plovdiv, Bulgaria. The experimental plots were situated at different distances (0.1, 0.5, 2.0, and 15 km) from the source of pollution. The contents of heavy metals in plant materials (roots, stems, leaves, seeds) were determined. The quality of the safflower oils (heavy metals and fatty acid composition) was also determined. The quantitative measurements were carried out with inductively coupled plasma (ICP) spectrometry. Safflower is tolerant to heavy metals and can be classed among the hyperaccumulators of lead and cadmium and the accumulators of zinc. The plant can be successfully used in the phytoremediation of heavy metal contaminated soils. Processing the safflower seeds into oil, and using the obtained oil, will greatly reduce the cost of phytoremediation.
Keywords: heavy metals, accumulation, safflower, polluted soils, phytoremediation
Procedia PDF Downloads 264
2273 ROCK Signaling and Radio Resistance: The Association and the Effect
Authors: P. Annapurna, Cecil Ross, Sudhir Krishna, Sweta Srivastava
Abstract:
Irradiation plays a pivotal role in cervical cancer treatment; however, some tumors resist therapy and others relapse, owing to better repair and enhanced resistance mechanisms operational in their cells. The present study aims to understand the signaling operational in the resistance phenotype, and here we report the role of Rho-associated protein kinase (ROCK) signaling in cervical carcinoma radio-resistance. ROCK signaling has been implicated in the progression of several tumors and is important for DNA repair. Irradiation of spheroid cultures of the SiHa cervical carcinoma derived cell line at 6 Gy generated resistant cells in vitro, which had better clonogenic abilities and formed more and larger colonies in the soft agar colony formation assay compared to the non-irradiated cells. These cells also exhibited an enhanced motility phenotype. Cell cycle profiling showed the cells to be blocked in the G2/M phase with enhanced pCDC2 levels, indicating the onset of a possible DNA repair mechanism. Notably, 3 days post-irradiation, irradiated cells showed increased ROCK2 translocation to the nucleus with enhanced protein expression compared to the non-irradiated cells. Radio-sensitization of the resistant cells was enhanced using Y27632, an inhibitor of ROCK signaling. The treatment of resistant cells with Y27632 resulted in increased cell death upon further irradiation. This observation was confirmed using inhibitory antibodies to ROCK1/2. Results show that both ROCK1 and ROCK2 contribute functionally to the radiation resistance of cervical cancer cells derived from cell lines. Interestingly, enrichment of stem-like cells (Hoechst-negative cells) was also observed upon irradiation, and these cells were markedly sensitive to Y27632 treatment. Our results thus suggest a role for ROCK signaling in radio-resistance in cervical carcinoma.
Further studies with human biopsies, mouse models, and the mechanistics of ROCK signaling in the context of radio-resistance will clarify the role of this molecule further and allow for therapeutic development.
Keywords: cervical carcinoma, radio-resistance, ROCK signaling, cancer treatment
Procedia PDF Downloads 333
2272 Antioxidant Activities, Chemical Components, Physicochemical, and Sensory Characteristics of Kecombrang Tea (Etlingera elatior)
Authors: Rifda Naufalin, Nurul Latifasari, Siti Nuryanti, Muna Ridha Hanifah
Abstract:
Kecombrang is a Zingiberaceae plant with antioxidant properties. The high antioxidant content of kecombrang flowers gives them potential as a raw material for functional beverages, so they can be used as an ingredient in herbal teas. The purpose of this study was to determine the chemical components, physicochemistry, antioxidant activity, and sensory characteristics of kecombrang tea. The research used a completely randomized design with the following processing factors: blanching versus non-blanching, fermentation versus non-fermentation, and the optimal drying time for kecombrang tea. The best treatment combination based on the effective index method was blanching followed by drying at 50°C to a moisture content of 2%, which produced kecombrang tea with a total phenol content of 5.95 mg Tannic Acid Equivalent (TAE)/gram db, total flavonoids of 3%, pH 4.5, and antioxidant activity of 82.95%, with a red color, the distinctive aroma of tea, and a fresh taste preferred by panelists.
Keywords: kecombrang tea, blanching, fermentation, total phenol, antioxidant activity
Procedia PDF Downloads 149
2271 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization
Authors: Agria Rhamdhan
Abstract:
WhatsApp, the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding this important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages. Analyzing large sets of mobile messaging data this way takes a great deal of time and effort. Our work offers methodologies based on forensic triage to reduce large data sets to manageable ones, making detailed reviews easier, and then presents the results through interactive visualization, showing important terms, entities, and relationships through intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
Keywords: forensics, triage, visualization, WhatsApp
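A minimal sketch of the TF-IDF term-ranking step (the toy messages and stop-word list are invented; the actual pipeline described above also applies LDA topic modeling, which is omitted here):

```python
import math
from collections import Counter

STOP = {"the", "at", "to", "we"}        # minimal stop-word list for the demo

def tfidf_rank(docs, top=3):
    """Rank terms across a small corpus by summed TF-IDF score."""
    n = len(docs)
    tokenized = [d.lower().split() for d in docs]
    df = Counter(t for doc in tokenized for t in set(doc))
    scores = Counter()
    for doc in tokenized:
        for term, count in Counter(doc).items():
            if term in STOP:
                continue
            idf = math.log(n / df[term]) + 1.0   # smoothed IDF
            scores[term] += (count / len(doc)) * idf
    return [t for t, _ in scores.most_common(top)]

messages = [
    "meet at the warehouse tonight",
    "bring the package to the warehouse",
    "tonight we move the package",
]
print(tfidf_rank(messages))   # terms shared by two messages rank highest
```

In a triage setting, the top-ranked terms would seed the interactive visualization, letting the investigator drill into only the messages containing them.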
Procedia PDF Downloads 172
2270 Donoho-Stark’s and Hardy’s Uncertainty Principles for the Short-Time Quaternion Offset Linear Canonical Transform
Authors: Mohammad Younus Bhat
Abstract:
The quaternion offset linear canonical transform (QOLCT), which is a time-shifted and frequency-modulated version of the quaternion linear canonical transform (QLCT), provides a more general framework for most existing signal processing tools. For the generalized QOLCT, the classical Heisenberg and Lieb uncertainty principles have been studied recently. In this paper, we first define the short-time quaternion offset linear canonical transform (ST-QOLCT) and derive its relationship with the quaternion Fourier transform (QFT). The crux of the paper lies in the generalization of several well-known uncertainty principles for the ST-QOLCT, including Donoho-Stark's uncertainty principle, Hardy's uncertainty principle, Beurling's uncertainty principle, and the logarithmic uncertainty principle.
Keywords: quaternion Fourier transform, quaternion offset linear canonical transform, short-time quaternion offset linear canonical transform, uncertainty principle
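For orientation, the classical prototypes of two of these principles, stated for the ordinary Fourier transform with the normalization $\hat{f}(\xi) = \int f(x)\, e^{-2\pi i x \xi}\, dx$ (the paper generalizes statements of this kind to the ST-QOLCT setting):

```latex
% Donoho--Stark: if f is \epsilon_T-concentrated on a set T and
% \hat{f} is \epsilon_\Omega-concentrated on a set \Omega, then
|T|\,|\Omega| \;\ge\; \left(1 - \epsilon_T - \epsilon_\Omega\right)^2 .
% Hardy: if |f(x)| \le C e^{-\pi x^2} and |\hat{f}(\xi)| \le C e^{-\pi \xi^2}
% for all x and \xi, then f(x) = c\, e^{-\pi x^2} for some constant c.
```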
Procedia PDF Downloads 212
2269 Preparation of Melt Electrospun Polylactic Acid Nanofibers with Optimum Conditions
Authors: Amir Doustgani
Abstract:
Melt electrospinning is a safe and simple technique for the production of micro- and nanofibers and can be an alternative to conventional solvent electrospinning. The effects of various melt-electrospinning parameters, including molecular weight, electric field strength, flow rate, and temperature, on the morphology and fiber diameter of polylactic acid (PLA) were studied. Molecular weight was shown to be the predominant factor in determining the obtainable diameter of the collected fibers. An orthogonal design was used to examine the process parameters. Results showed that molecular weight is the most influential parameter on the mean fiber diameter (MFD) of melt-electrospun PLA nanofibers, while flow rate has the least important impact. MFD increased with increasing molecular weight and flow rate, but decreased with increasing electric field strength and temperature. The MFD of the optimized fibers was below 100 nm, and the software prediction was in good agreement with the experimental results.
Keywords: fiber formation, processing, spinning, melt blowing
Procedia PDF Downloads 440
2268 Atmospheric Full Scale Testing of a Morphing Trailing Edge Flap System for Wind Turbine Blades
Authors: Thanasis K. Barlas, Helge A. Madsen
Abstract:
A novel Active Flap System (AFS) has been developed at DTU Wind Energy as the result of a 3-year R&D project following almost 10 years of innovative research in this field. The full-scale AFS, comprising an actively deformable trailing edge, has been tested at the unique rotating test facility at the Risoe Campus of DTU Wind Energy in Denmark. The design and instrumentation of the wing section and the AFS are described. The general description and objectives of the rotating test rig at the Risoe campus are presented, as used for the aeroelastic testing of the AFS in the recently finalized INDUFLAP project, along with an overview of the sensors on the setup and the test cases. The post-processing of data is discussed, and results of steady flap step and azimuth control flap cases are presented.
Keywords: morphing, adaptive, flap, smart blade, wind turbine
Procedia PDF Downloads 399
2267 Applying the Eye Tracking Technique for the Evaluation of Oculomotor System in Patients Survived after Cerebellar Tumors
Authors: Marina Shurupova, Victor Anisimov, Alexander Latanov
Abstract:
Background: Cerebellar lesions inevitably provoke oculomotor impairments in patients of different ages. Symptoms of subtentorial tumors, particularly medulloblastomas, include static and dynamic coordination disorders (ataxia, asynergia, imbalance), muscle hypotonia, disruption of the cranial nerves, and, within the oculomotor system, nystagmus (fine or gross). Subtentorial tumors can also affect the areas of the cerebellum that control the oculomotor system. Noninvasive eye-tracking technology allows obtaining multiple oculomotor characteristics such as the number of fixations and their duration, the amplitude, latency, and velocity of saccades, and the trajectory and scan path of gaze during navigation of the visual field. Eye tracking could be very useful in clinical studies, serving as a convenient and effective diagnostic tool. Aim: We studied the dynamics of oculomotor system functioning in patients in remission after cerebellar tumor removal surgeries and following neurocognitive rehabilitation. Methods: 38 children (23 boys, 15 girls, 9-17 years old) who had recovered from cerebellar tumor-removal surgeries, radiation therapy, and chemotherapy, and were undergoing a course of neurocognitive rehabilitation, participated in the study. Two tests were carried out to evaluate oculomotor performance: a gaze stability test and a counting test. Monocular eye movements were recorded with an Arrington Research eye tracker (60 Hz). Two experimental sessions with both tests were conducted, before and after the rehabilitation courses. Results: In the final session of both tests we observed remarkable improvement in oculomotor performance: 1) in the gaze stability test, the spread of gaze positions significantly declined compared to the first session, and 2) in the counting test, the visual path significantly shortened compared to the first session.
Thus, neurocognitive rehabilitation improved the functioning of the oculomotor system in patients after cerebellar tumor removal surgeries and subsequent therapy. Conclusions: The experimental data support the effectiveness of the eye-tracking technique as a diagnostic tool in the field of neuro-oncology.
Keywords: eye tracking, rehabilitation, cerebellar tumors, oculomotor system
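One simple way to quantify the "spread of gaze positions" in a gaze stability test is the RMS deviation of samples from their centroid; the sketch below uses invented gaze coordinates purely for illustration (the study's actual metric is not specified above):

```python
import numpy as np

def gaze_dispersion(points):
    """RMS distance of gaze samples from their centroid, one simple
    measure of the spread of gaze positions in a stability test."""
    pts = np.asarray(points, dtype=float)
    d2 = np.sum((pts - pts.mean(axis=0)) ** 2, axis=1)
    return float(np.sqrt(d2.mean()))

# Hypothetical gaze samples (degrees of visual angle) before/after rehab.
before = [(0.0, 0.0), (1.2, -0.8), (-1.0, 1.1), (0.9, 0.7)]
after = [(0.0, 0.0), (0.3, -0.2), (-0.2, 0.3), (0.2, 0.1)]
print(gaze_dispersion(before) > gaze_dispersion(after))  # -> True
```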
Procedia PDF Downloads 161
2266 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions
Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren
Abstract:
Based on multivariate statistical analysis theory, this paper uses the principal component analysis method, the Mahalanobis distance analysis method, and fitting methods to establish a photovoltaic health model for evaluating the health of photovoltaic panels. First of all, according to weather conditions, the photovoltaic panel variable data are classified into five categories: sunny, cloudy, rainy, foggy, and overcast. The health of photovoltaic panels in these five types of weather is studied. Secondly, scatterplots of the relationship between the amount of electricity produced in each kind of weather and the other variables were plotted. It was found that the amount of electricity generated by photovoltaic panels has a significant nonlinear relationship with time. The fitting method was used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, the principal component analysis method was used to analyze the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather (overcast, foggy, and sunny) meet the conditions for factor analysis, while cloudy and rainy weather do not. Through principal component analysis, the main components of overcast weather are temperature, AQI, and PM2.5; the main component of foggy weather is temperature; and the main components of sunny weather are temperature, AQI, and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity, and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast, and rainy weather as sample data, the Mahalanobis distances between the observed values and these sample values were obtained.
A comparative analysis was carried out to compare the degree of deviation of the Mahalanobis distance in order to assess the health of the photovoltaic panels under different weather conditions. It was found that, ordered from small to large Mahalanobis distance fluctuations, the weather conditions were: foggy, cloudy, overcast, and rainy.
Keywords: fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB
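The Mahalanobis distance underlying this comparison is d(x) = sqrt((x − μ)ᵀ S⁻¹ (x − μ)), where μ and S are the sample mean and covariance; a small sketch with invented weather features (the variable names follow the abstract, the numbers are made up):

```python
import numpy as np

def mahalanobis(x, sample):
    """Mahalanobis distance of observation x from a sample cloud."""
    mu = sample.mean(axis=0)
    cov = np.cov(sample, rowvar=False)
    diff = x - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

rng = np.random.default_rng(1)
# Hypothetical sunny-weather features: [temperature, AQI, PM2.5]
sunny = rng.normal([28.0, 60.0, 35.0], [2.0, 8.0, 5.0], size=(200, 3))
observed_ok = np.array([28.5, 62.0, 36.0])    # close to sunny conditions
observed_far = np.array([10.0, 150.0, 90.0])  # strongly deviating day
print(mahalanobis(observed_ok, sunny) < mahalanobis(observed_far, sunny))  # -> True
```

Unlike the Euclidean distance, this accounts for the different scales and correlations of temperature, AQI, and PM2.5.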
Procedia PDF Downloads 148
2265 A Stable Method for Determination of the Number of Independent Components
Authors: Yuyan Yi, Jingyi Zheng, Nedret Billor
Abstract:
Independent component analysis (ICA) is one of the most commonly used blind source separation (BSS) techniques for signal pre-processing, such as noise reduction and feature extraction. The main parameter in the ICA method is the number of independent components (ICs). Although there have been several methods for determining the number of ICs, this important parameter has not been given sufficient attention. In this study, we review the most used methods for determining the number of ICs and provide their advantages and disadvantages. Further, we propose an improved version of the column-wise ICAByBlock method for determining the number of ICs. To assess the performance of the proposed method, we compare the column-wise ICAByBlock with several existing methods across different ICA methods, using simulated and real signal data. Results show that the proposed column-wise ICAByBlock is an effective and stable method for determining the optimal number of components in ICA. This method is simple, and its results can be demonstrated intuitively with good visualizations.
Keywords: independent component analysis, optimal number, column-wise, correlation coefficient, cross-validation, ICAByBlock
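The abstract does not spell out the ICAByBlock criterion, but the block-reproducibility idea can be sketched as follows, with SVD used as a simple deterministic stand-in for the ICA decomposition (the sources, mixing matrix, and threshold are assumptions for the demo, not the paper's method):

```python
import numpy as np

def reproducible_components(X, max_k, thresh=0.9):
    """Split the samples into two blocks, decompose each (SVD here as a
    deterministic stand-in for ICA), and count components whose best
    absolute cosine similarity across blocks exceeds `thresh`."""
    n = X.shape[0] // 2
    A, B = X[:n], X[n:2 * n]
    _, _, Va = np.linalg.svd(A - A.mean(0), full_matrices=False)
    _, _, Vb = np.linalg.svd(B - B.mean(0), full_matrices=False)
    k = min(max_k, Va.shape[0], Vb.shape[0])
    sim = np.abs(Va[:k] @ Vb[:k].T)          # cross-block similarities
    return int(np.sum(sim.max(axis=1) > thresh))

rng = np.random.default_rng(2)
t = np.linspace(0, 8 * np.pi, 1000)
S = np.c_[2.0 * np.sin(t), np.sign(np.sin(3 * t))]   # two latent sources
M = np.array([[1.0, 0.5, 0.0, 0.2, 0.1],             # assumed mixing matrix
              [0.0, 0.3, 1.0, -0.4, 0.2]])
X = S @ M + 0.01 * rng.normal(size=(1000, 5))        # 5 observed channels
print(reproducible_components(X, max_k=2))           # -> 2
```

Components that reappear in both halves of the data are likely genuine sources, while noise components fail to reproduce, which is the intuition behind choosing the number of ICs this way.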
Procedia PDF Downloads 100
2264 Possibilities of Postmortem CT to Detection of Gas Accumulations in the Vessels of Dead Newborns with Congenital Sepsis
Authors: Uliana N. Tumanova, Viacheslav M. Lyapin, Vladimir G. Bychenko, Alexandr I. Shchegolev, Gennady T. Sukhikh
Abstract:
It is well known that gas formed as a result of the postmortem decomposition of tissues can be detected as early as 24-48 hours after death. In addition, the conditions of keeping and storing the corpse (temperature and humidity of the environment) significantly determine the rate of onset and development of postmortem changes. The presence of sepsis is accompanied by faster postmortem decomposition and decay of the organs and tissues of the body. The presence of gas in the vessels and cavities can be fully revealed by postmortem CT. Radiologists must report the detection of intraorganic or intravascular gas at postmortem CT to forensic experts or pathologists before the autopsy; this gas cannot be detected during autopsy, but it can be very important for establishing a diagnosis. Objective: to explore the possibilities of postmortem CT for the evaluation of gas accumulations in the vessels of newborns who died from congenital sepsis. We examined the bodies of 44 newborns (25 male and 19 female, aged from 6 hours to 27 days) 6-12 hours after death. The bodies were stored in a refrigerator at a temperature of +4°C in the supine position. The study group comprised 12 bodies of newborns that died from congenital sepsis; the control group consisted of 32 bodies of newborns that died without signs of sepsis. Postmortem CT examination was performed on a GEMINI TF TOF16 device before the autopsy. The localizations of gas accumulations in the vessels were determined on the CT tomograms. The diagnosis of sepsis was made on the basis of clinical and laboratory data and autopsy results. Gas in the vessels was detected in 33.3% of cases in the sepsis group and in 34.4% in the control group. In the sepsis group, the gas was most often localized in the vessels of the heart and liver (50% each, of the observations with gas detected in the vessels), and in the heart cavities, aorta, and mesenteric vessels (25% each).
In the control group, gas was most often detected in the vessels of the liver (63.6%) and abdominal cavity (54.5%). In 45.5% of cases the gas was localized in the cavities, and in 36.4% in the vessels, of the heart. In the cerebral vessels and in the aorta, gas was detected in 27.3% and 9.1% of cases, respectively. Postmortem CT has a high diagnostic capability to detect free gas in vessels. Postmortem changes in newborns that died from sepsis do not affect intravascular gas production within 6-12 hours of death. Radiation methods should be used as a supplement to the autopsy, including as a kind of 'guide' that points the forensic medical expert to specific changes identified during CT studies, for better delineation of pathological processes during the autopsy. Postmortem CT can be recommended as a first stage of the autopsy.
Keywords: congenital sepsis, gas, newborn, postmortem CT
Procedia PDF Downloads 147
2263 A LED Warning Vest as Safety Smart Textile and Active Cooperation in a Working Group for Building a Normative Standard
Authors: Werner Grommes
Abstract:
The Institute of Occupational Safety and Health participates in a working group drafting a normative standard for illuminated warning vests and has carried out extensive experiments and measurements as groundwork for this cooperation. Intelligent car headlamps are able to suppress conventional warning vests with retro-reflective stripes as disturbing light, so illuminated warning vests are required for occupational safety. However, they must not pose any danger to the wearer or to other persons: the risks of the batteries (lithium types), the maximum brightness (glare), and possible interference radiation from the electronics affecting implant wearers must be taken into account, and all-round visibility as well as the required viewing range play an important role. For the study, numerous luminance measurements of commercially available LED and electroluminescent warning vests were made, and their electromagnetic interference fields and aspects of electrical safety were examined. The results showed that the LED lighting was far too bright and caused strong glare, that the integrated controls with pulse modulation and switching regulators produced electromagnetic interference fields, that rechargeable lithium batteries can explode depending on the temperature range, and that electroluminescence brings further hazards. A test method was developed for evaluating visibility at distances of 50, 100, and 150 m, including interviews with test persons, and a measuring method was developed for detecting glare effects at close range, with assignment of the maximum permissible luminance. The electromagnetic interference fields were tested in the time and frequency domains, and a risk and hazard analysis was prepared for the use of lithium batteries. The value ranges for luminance and the risk analysis for lithium batteries were discussed in the standards working group and will be integrated into the standard.
This paper gives a brief overview of the topic of illuminated warning vests, taking into account the risks and hazards for the vest wearer and others. Keywords: illuminated warning vest, optical tests and measurements, risks, hazards, optical glare effects, LED, E-light, electric luminescent
Procedia PDF Downloads 114
2262 Evaluation of Longitudinal Relaxation Time (T1) of Bone Marrow in Lumbar Vertebrae of Leukaemia Patients Undergoing Magnetic Resonance Imaging
Authors: M. G. R. S. Perera, B. S. Weerakoon, L. P. G. Sherminie, M. L. Jayatilake, R. D. Jayasinghe, W. Huang
Abstract:
The aim of this study was to measure and evaluate the longitudinal relaxation times (T1) in the bone marrow of an acute myeloid leukaemia (AML) patient in order to explore their potential as a prognostic biomarker using magnetic resonance imaging (MRI), which would offer a non-invasive prognostic approach to AML. MR image data were collected in DICOM format, and MATLAB/Simulink software was used for the image processing and data analysis. For the quantitative MRI data analysis, regions of interest (ROIs) were drawn on multiple image slices encompassing the vertebral bodies of L3, L4, and L5, and T1 was evaluated from the resulting T1 maps. The estimated mean bone marrow T1 value was 790.1 ms at 3 T, whereas the T1 value reported for healthy subjects (946.0 ms) is significantly higher than the present finding. This suggests that bone marrow T1 can be considered a potential prognostic biomarker for AML patients. Keywords: acute myeloid leukaemia, longitudinal relaxation time, magnetic resonance imaging, prognostic biomarker
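A T1 map is typically computed by fitting a relaxation model to signals acquired at several timing parameters. The sketch below recovers T1 from noise-free synthetic saturation-recovery data via a log-linear least-squares fit; the model, the repetition times, and the equilibrium signal S0 are assumptions for illustration (the study's actual acquisition scheme is not stated in the abstract).

```python
import numpy as np

T1_TRUE = 790.1   # ms, the mean bone marrow value reported in the abstract
S0 = 1000.0       # arbitrary equilibrium signal (assumption for the sketch)

# Hypothetical saturation-recovery acquisitions at several repetition times (ms):
#   S(TR) = S0 * (1 - exp(-TR / T1))
TR = np.array([200.0, 500.0, 1000.0, 2000.0, 4000.0])
signal = S0 * (1.0 - np.exp(-TR / T1_TRUE))  # noise-free synthetic data

# Log-linearise: ln(1 - S/S0) = -TR/T1, then solve for the slope through the
# origin by least squares and invert it to obtain T1.
y = np.log(1.0 - signal / S0)
slope = np.sum(TR * y) / np.sum(TR * TR)
t1_est = -1.0 / slope

print(round(t1_est, 1))  # 790.1
```

Applied voxel-wise over the ROI, the same fit yields the T1 map from which the mean marrow value is read.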
Procedia PDF Downloads 533
2261 Smart Surveillance with 5G: A Performance Study in Adama City
Authors: Shenko Chura Aredo, Hailu Belay, Kevin T. Kornegay
Abstract:
In light of Adama City’s smart-city development vision, this study investigates the performance of smart security systems with fifth-generation (5G) network capabilities. Installing extensive cabling can be logistically difficult, particularly in large or dynamic settings, and latency issues can prevent connected systems from monitoring in real time. Through a focused analysis that employs Adama City as a case study, performance was evaluated in terms of spectral and energy efficiency using empirical data and basic signal-processing formulations at different frequency resources. The findings demonstrate that cameras operating at higher 5G frequencies have more capacity than those operating below 6 GHz, notwithstanding frequency-related issues. It was also observed that when the beams of such cameras are adaptively focused on the distance of the last cell-edge user rather than on the maximum cell radius, less energy is required than with conventional fixed power ramping. Keywords: 5G, energy efficiency, safety, smart security, spectral efficiency
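The spectral- and energy-efficiency comparison can be sketched with the standard Shannon-bound formulations. The bandwidths, SNR, and transmit power below are illustrative assumptions (the abstract does not give its numbers); the point is only that a wider mmWave channel raises capacity, and hence throughput per watt, at equal SNR.

```python
import math

def spectral_efficiency(snr_db: float) -> float:
    """Shannon spectral efficiency in bit/s/Hz for a given SNR in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return math.log2(1 + snr_linear)

def energy_efficiency(bandwidth_hz: float, snr_db: float, power_w: float) -> float:
    """Throughput per watt (bit/s/W) under the Shannon bound."""
    return bandwidth_hz * spectral_efficiency(snr_db) / power_w

# Illustrative comparison: a sub-6 GHz camera link (100 MHz channel) vs. a
# higher-frequency 5G link (400 MHz channel) at the same SNR and power.
se = spectral_efficiency(20.0)               # 20 dB SNR
sub6 = energy_efficiency(100e6, 20.0, 1.0)   # bit/s per watt
mmwave = energy_efficiency(400e6, 20.0, 1.0)

print(round(se, 3))     # 6.658 bit/s/Hz
print(mmwave > sub6)    # True: wider channel -> more throughput per watt here
```

Real links differ in SNR and path loss across bands, so a faithful comparison would feed measured per-band SNRs into the same formulas.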
Procedia PDF Downloads 24
2260 Study of Biodegradable Composite Materials Based on Polylactic Acid and Vegetal Reinforcements
Authors: Manel Hannachi, Mustapha Nechiche, Said Azem
Abstract:
This study focuses on biodegradable materials made from poly(lactic acid) (PLA) and vegetal reinforcements. Three materials were developed with PLA as the matrix and, as reinforcements: (i) olive kernels (OK); (ii) short alfa (α) fibers; and (iii) an OK + α mixture. After processing the PLA pellets, grinding the olive kernels into powder, and cutting the alfa stems into short fibers, three mixtures, namely PLA-OK, PLA-α, and PLA-OK-α, were prepared and homogenized in a Turbula® mixer. These mixtures were then compacted at 180°C under 10 MPa for 15 min. Scanning electron microscopy (SEM) examinations show that the PLA matrix adheres to the surface of all reinforcements and that their dispersion in the matrix is good. X-ray diffraction (XRD) analyses reveal an increase in the inter-reticular distances of PLA, especially in the PLA-OK case. These results are explained by the dissociation of some molecules derived from the reinforcements, followed by diffusion of the released atoms into the PLA structure, which is consistent with the Fourier-transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC) results. Keywords: alfa short fibers, biodegradable composite, olive kernels, poly-lactic acid
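The inter-reticular distance reported from XRD follows from Bragg's law: a diffraction peak shifting to a lower angle corresponds to a larger d-spacing. A small sketch, assuming a Cu Kα laboratory source and hypothetical peak positions (the abstract gives neither):

```python
import math

# Bragg's law: n*lambda = 2*d*sin(theta). For first-order reflections (n = 1)
# the inter-reticular (d-)spacing follows from the peak position 2-theta.
CU_K_ALPHA = 1.5406  # angstroms, Cu K-alpha wavelength (assumed source)

def d_spacing(two_theta_deg: float, wavelength: float = CU_K_ALPHA) -> float:
    theta = math.radians(two_theta_deg / 2)
    return wavelength / (2 * math.sin(theta))

# A peak shifting to lower 2-theta means a larger d-spacing, as reported for
# PLA-OK. Both peak positions below are hypothetical.
d_neat = d_spacing(16.8)   # neat-PLA peak
d_comp = d_spacing(16.5)   # shifted PLA-OK peak
print(d_comp > d_neat)     # True: lower angle <-> larger inter-reticular distance
```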
Procedia PDF Downloads 148
2259 A Chinese Nested Named Entity Recognition Model Based on Lexical Features
Abstract:
In the field of named entity recognition, most research has focused on simple entities. Nested named entities, which contain entities within entities, have been difficult to identify accurately because of their boundary ambiguity. In this paper, a hierarchical recognition model is constructed, based on the grammatical structure and semantic features of Chinese text, that computes entity boundaries from lexical features. The analysis is carried out at different levels of granularity, semantics, and lexicality, avoiding repeated work to reduce computational effort and using the semantic features of words to calculate entity boundaries, thereby improving recognition accuracy. Experiments on web-based microblogging data show that the model achieves an accuracy of 86.33% and an F1 value of 89.27% in recognizing nested named entities, making up for the shortcomings of some previous recognition models and improving the efficiency of nested named entity recognition. Keywords: coarse-grained, nested named entity, Chinese natural language processing, word embedding, T-SNE dimensionality reduction algorithm
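To make the nesting problem concrete, the sketch below enumerates all token spans of a segmented Chinese sentence and keeps those matching a lexicon, so overlapping (nested) entities are all recovered. The lexicon and labels are illustrative assumptions: the paper's model uses learned lexical and semantic features, not a fixed dictionary.

```python
# Illustrative lexicon: "Peking University Library" nests "Peking University",
# which in turn nests the location "Beijing".
lexicon = {
    ("北京",): "LOC",
    ("北京", "大学"): "ORG",
    ("北京", "大学", "图书馆"): "ORG",
}

def find_nested_entities(tokens):
    """Enumerate every contiguous span and keep those found in the lexicon."""
    hits = []
    for i in range(len(tokens)):
        for j in range(i + 1, len(tokens) + 1):
            span = tuple(tokens[i:j])
            if span in lexicon:
                hits.append((i, j, lexicon[span]))
    return hits

tokens = ["我", "在", "北京", "大学", "图书馆"]
entities = find_nested_entities(tokens)
print(entities)  # [(2, 3, 'LOC'), (2, 4, 'ORG'), (2, 5, 'ORG')]
```

A flat (non-nested) tagger would output only one of these three spans; the boundary-calculation model in the paper aims to recover all of them.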
Procedia PDF Downloads 131
2258 Speckle-Based Phase Contrast Micro-Computed Tomography with Neural Network Reconstruction
Authors: Y. Zheng, M. Busi, A. F. Pedersen, M. A. Beltran, C. Gundlach
Abstract:
X-ray phase contrast imaging has been shown to yield better contrast than conventional attenuation-based X-ray imaging, especially for soft tissues in the medical imaging energy range, which can potentially lead to better diagnosis for patients. However, phase contrast imaging has mainly been performed with highly brilliant synchrotron radiation, as it requires highly coherent X-rays. Many research teams have demonstrated that it is also feasible with a laboratory source, bringing it one step closer to clinical use; nevertheless, the need for fine gratings and high-precision stepping motors with a laboratory source prevents it from being widely used. Recently, a random phase object has been proposed as an analyzer. This method requires a much less demanding experimental setup, but previous studies used either a particular X-ray source (a liquid-metal-jet micro-focus source) or high-precision motors for stepping. We have been working on a much simpler setup requiring only a small modification of a commercial bench-top micro-CT (computed tomography) scanner: a piece of sandpaper is introduced as the phase analyzer in front of the X-ray source. This setup, however, needs suitable algorithms for speckle tracking and 3D reconstruction. The precision and sensitivity of the speckle-tracking algorithm determine the resolution of the system, while the 3D reconstruction algorithm affects the minimum number of projections required, thus limiting the temporal resolution. As phase contrast imaging methods usually require much longer exposure times than traditional absorption-based X-ray imaging, a dynamic phase contrast micro-CT with high temporal resolution is particularly challenging. Different reconstruction methods, including neural-network-based techniques, will be evaluated in this project to increase the temporal resolution of the phase contrast micro-CT.
A Monte Carlo ray-tracing simulation (McXtrace) was used to generate a large dataset to train the neural network, addressing the fact that neural networks require large amounts of training data to produce high-quality reconstructions. Keywords: micro-CT, neural networks, reconstruction, speckle-based X-ray phase contrast
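The core of speckle tracking is finding, for each image window, the displacement of the speckle pattern between the reference (no sample) and sample images. A minimal sketch using a brute-force integer-pixel search with zero-normalized cross-correlation is shown below; real pipelines add subpixel refinement, and the synthetic pattern here is only an assumption for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def zncc(a, b):
    """Zero-normalised cross-correlation between two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def track_shift(ref, sample, max_shift=3):
    """Brute-force search for the integer (dy, dx) shift that best maps the
    reference speckle window onto the sample window."""
    best, best_shift = -2.0, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(sample, -dy, axis=0), -dx, axis=1)
            score = zncc(ref, shifted)
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# Synthetic speckle pattern and a copy displaced by (2, -1) pixels.
ref = rng.random((32, 32))
sample = np.roll(np.roll(ref, 2, axis=0), -1, axis=1)
print(track_shift(ref, sample))  # (2, -1)
```

The recovered per-window displacement field is what encodes the differential phase, and its precision directly sets the resolution of the system, as noted above.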
Procedia PDF Downloads 260
2257 Physico-Mechanical Behavior of Indian Oil Shales
Authors: K. S. Rao, Ankesh Kumar
Abstract:
The search for alternative energy sources to petroleum has intensified because of growing demand and the depletion of petroleum reserves, so the importance of oil shales as an economically viable substitute has increased many-fold over the last 20 years. Technologies such as hydro-fracturing have opened the field of oil extraction from these unconventional rocks. Oil shale is a compact, laminated rock of sedimentary origin containing organic matter known as kerogen, which yields oil when distilled. Oil shales are formed from the contemporaneous deposition of fine-grained mineral debris and organic degradation products derived from the breakdown of biota; the conditions required for their formation include abundant organic productivity, the early development of anaerobic conditions, and a lack of destructive organisms. These rocks have not gone through high-temperature, high-pressure conditions in nature. The most common approach to oil extraction is to break the bonds of the organics through a retorting process, either by surface retorting or by in-situ processing; in-situ processing is the most environmentally friendly approach. The three steps involved in this process are fracturing, injection to achieve communication, and fluid migration at the underground location. Upon heating (retorting) oil shale at temperatures in the range of 300 to 400°C, the kerogen decomposes into oil, gas, and residual carbon in a process referred to as pyrolysis. It is therefore very important to understand the physico-mechanical behavior of such rocks in order to improve the technology for in-situ extraction. It is clear from past research and from physical observation that these rocks behave anisotropically, so it is essential to understand their mechanical behavior under high pressure at different orientation angles for the economical use of these resources.
Knowing the engineering behavior under the above conditions allows the deep-ground retorting conditions to be simulated numerically and experimentally. Many researchers have investigated the effect of organic content on the engineering behavior of oil shale, but the coupled effect of the organic and inorganic matrix has yet to be analyzed. The favourable characteristics of Assam coal for conversion to liquid fuels have been known for a long time, and studies have indicated that these coals and carbonaceous shales constitute the principal source rocks that have generated the hydrocarbons produced in the region. Rock cores of representative samples were collected by on-site drilling, as coring in the laboratory is very difficult owing to the rock's highly anisotropic nature. Different tests were performed to characterize the petrology of these samples, and chemical analyses were carried out to quantify the organic content exactly. The mechanical properties were investigated at different anisotropy angles, and the results of the petrological and chemical analyses were correlated with the mechanical properties. These properties and correlations will further help in increasing the producibility of these rocks. It is well established that organic content is negatively correlated with tensile strength, compressive strength, and modulus of elasticity. Keywords: oil shale, producibility, hydro-fracturing, kerogen, petrology, mechanical behavior
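The negative correlation stated at the end is typically quantified with a Pearson coefficient over paired measurements. The sketch below uses entirely hypothetical organic-content and tensile-strength values, chosen only to exhibit the reported negative trend, not the study's data.

```python
import numpy as np

# Hypothetical measurements: total organic content (%) vs. tensile strength
# (MPa) for a handful of oil-shale cores. Values are illustrative only.
organic_pct = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
tensile_mpa = np.array([9.8, 8.9, 7.5, 6.4, 5.1, 4.3])

# Pearson correlation coefficient between the two variables.
r = np.corrcoef(organic_pct, tensile_mpa)[0, 1]
print(round(r, 3))  # -0.998: a strongly negative correlation
```

The same computation against compressive strength and elastic modulus would quantify the other two correlations the abstract asserts.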
Procedia PDF Downloads 348
2256 Features Vector Selection for the Recognition of the Fragmented Handwritten Numeric Chains
Authors: Salim Ouchtati, Aissa Belmeguenai, Mouldi Bedda
Abstract:
In this study, we propose an offline system for the recognition of fragmented handwritten numeric chains. First, we built a recognition system for isolated handwritten digits; in this part, the study is based mainly on evaluating the performance of a neural network trained with the gradient backpropagation algorithm. The parameters forming the input vector of the neural network are extracted from the binary images of the isolated handwritten digits by several methods: the distribution sequence, probe (sonde) application, the Barr features, and the centered moments of the different projections and profiles. Second, the study is extended to the reading of fragmented handwritten numeric chains consisting of a variable number of digits. Vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system developed in the first part (the recognition system for isolated handwritten digits). Keywords: features extraction, handwritten numeric chains, image processing, neural networks
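The vertical-projection segmentation step can be sketched in a few lines: column sums of the binary image drop to zero in the gaps between digits, and each maximal run of non-empty columns becomes one segment. The toy image below is an assumption for illustration.

```python
import numpy as np

def segment_digits(binary_img):
    """Split a binary image of a numeric chain into per-digit column ranges
    using the vertical projection (column sums): blank columns separate digits."""
    projection = binary_img.sum(axis=0)
    segments, start = [], None
    for col, count in enumerate(projection):
        if count > 0 and start is None:
            start = col                    # a digit begins
        elif count == 0 and start is not None:
            segments.append((start, col))  # a digit ends at the blank column
            start = None
    if start is not None:                  # digit touching the right edge
        segments.append((start, binary_img.shape[1]))
    return segments

# Toy 4x10 binary image holding two "digits" separated by a blank column.
img = np.array([
    [1, 1, 0, 0, 1, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 1, 0, 1, 0, 0, 0],
    [1, 0, 1, 0, 1, 0, 1, 0, 0, 0],
    [1, 1, 1, 0, 1, 1, 1, 0, 0, 0],
])
print(segment_digits(img))  # [(0, 3), (4, 7)]
```

Each returned column range is then cropped out and fed to the isolated-digit recognizer, as described above.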
Procedia PDF Downloads 267
2255 Bioactivity of Peptides from Two Mushrooms
Authors: Parisa Farzaneh, Azade Harati
Abstract:
Mushrooms, or macro-fungi, are an important superfood containing many bioactive compounds, particularly bio-peptides. In this research, mushroom proteins were extracted with buffer, or buffer plus salt (0.15 M), together with an ultrasound bath to release the intracellular protein. The largest share of the mushroom proteins was categorized as albumins. The proteins were then hydrolyzed into peptides by endogenous and exogenous proteases, including gastrointestinal enzymes. The potency of the endogenous proteases was higher in Agaricus bisporus than in Terfezia claveryi, their activity ceasing after treatment at 75°C for 15 min. The blanching process, the endogenous enzymes, and the mixtures of gastrointestinal enzymes (pepsin-trypsin-α-chymotrypsin or trypsin-α-chymotrypsin) produced hydrolysates with different antioxidant and antibacterial properties. The peptide fractions produced with ultrafilters of different cut-offs likewise showed various levels of radical scavenging, lipid peroxidation inhibition, and antibacterial activity. The bio-peptides with the best bioactivities (the fraction below 3 kDa from T. claveryi) were resistant to various environmental conditions (pH and temperature). They are therefore good candidates for addition to nutraceutical and pharmaceutical preparations or functional foods, even during processing. Keywords: bio-peptide, mushrooms, gastrointestinal enzymes, bioactivity
Procedia PDF Downloads 61
2254 Smooth Second Order Nonsingular Terminal Sliding Mode Control for a 6 DOF Quadrotor UAV
Authors: V. Tabrizi, A. Vali, R. GHasemi, V. Behnamgol
Abstract:
In this article, a nonlinear model of an underactuated six-degree-of-freedom (6 DOF) quadrotor UAV is derived on the basis of the Newton-Euler formulation. The derivation comprises determining the equations of motion of the quadrotor in three dimensions and approximating the actuation forces through modeling of the aerodynamic coefficients and electric motor dynamics. The robust nonlinear control strategy is a smooth second-order nonsingular terminal sliding mode control, which is applied to stabilize this model. The control method is based on the super-twisting algorithm, which removes chattering and produces a smooth control signal, while the nonsingular terminal sliding mode idea is used to introduce a nonlinear sliding variable that guarantees finite-time convergence in the sliding phase. Simulation results show that the proposed algorithm is robust against uncertainty and disturbance and guarantees a fast and precise control signal. Keywords: quadrotor UAV, nonsingular terminal sliding mode, second order sliding mode, electronics, control, signal processing
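The super-twisting algorithm mentioned above can be sketched on a toy first-order sliding dynamics: u = -k1·|s|^(1/2)·sign(s) + v with v̇ = -k2·sign(s), which hides the discontinuity inside an integral and so yields a continuous (chattering-free) control. The gains, disturbance, and simulation settings below are assumptions chosen for the toy problem, not values from the paper's quadrotor model.

```python
import math

# Super-twisting control on the toy sliding dynamics  s_dot = u + d(t),
# with a smooth bounded disturbance d(t). Gains satisfy the usual sufficient
# conditions for this disturbance (k2 exceeds the bound on |d_dot|).
k1, k2 = 1.5, 1.1
dt, T = 1e-3, 10.0

s, v = 1.0, 0.0   # initial sliding variable and integral (twisting) term
t = 0.0
while t < T:
    d = 0.3 * math.sin(t)                    # bounded, smooth disturbance
    u = -k1 * math.sqrt(abs(s)) * math.copysign(1.0, s) + v
    v += -k2 * math.copysign(1.0, s) * dt    # integral term: v_dot = -k2*sign(s)
    s += (u + d) * dt                        # Euler step of s_dot = u + d
    t += dt

print(abs(s) < 1e-2)  # True: s is driven to a small neighbourhood of zero
```

Note that u itself is continuous in time, which is exactly why the super-twisting structure avoids the chattering of a plain sign-based sliding mode controller.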
Procedia PDF Downloads 443
2253 Research on Energy Field Intervening in Lost Space Renewal Strategy
Authors: Tianyue Wan
Abstract:
Lost space, a term proposed by Roger Trancik, refers to space that has long been unused and is in decline. In his book Finding Lost Space: Theories of Urban Design, lost space is defined as those anti-traditional spaces that are unpleasant, need to be redesigned, and bring no benefit to the environment or their users; they have no defined boundaries and do not connect the various landscape elements in a coherent way. With the rapid urbanization of China, the blind spots of urban renewal have become chaotic lost spaces incompatible with that rapid development, so lost space urgently needs to be reconstructed against the background of infill development and reduction planning in China. The formation of lost space is also an invisible division of social hierarchy; this paper seeks to break down that division and the estrangement between people through the regeneration of lost space, and ultimately to enhance vitality, rebuild a sense of belonging, and create a continuous open public space for local people. Based on the concepts of lost space and the energy field, this paper clarifies the significance of the energy field in lost space renovation and then introduces the energy field into lost space, using the magnetic field in physics as a prototype. The construction of the energy field is supported by space theory, spatial morphology analysis, public communication theory, urban diversity theory, and city image theory. Taking Lingjiao Park in Wuhan, China, as an example, this paper takes the lost space on the west side of the park as the research object. Based on the current condition of this site, energy intervention strategies are proposed in four respects: natural ecology, spatial rights, intangible cultural heritage, and infrastructure configuration.
Six specific lost space renewal methods are used in this work: “riveting”, “breakthrough”, “radiation”, “inheritance”, “connection”, and “intersection”. After the renovation, the space will be re-introduced to active crowds. The integration of activities and space creates a sense of place, improves the walking experience, restores the vitality of the space, and provides a reference for the reconstruction of lost space in the city. Keywords: dynamic vitality intervention, lost space, space vitality, sense of place
Procedia PDF Downloads 113