Search results for: fluorescence techniques
4541 Detection of Ice Formation Processes Using Multiple High-Order Ultrasonic Guided Wave Modes
Authors: Regina Rekuviene, Vykintas Samaitis, Liudas Mažeika, Audrius Jankauskas, Virginija Jankauskaitė, Laura Gegeckienė, Abdolali Sadaghiani, Shaghayegh Saeidiharzand
Abstract:
Icing causes significant damage to aviation and renewable energy installations. Air-conditioning and refrigeration systems, wind turbine blades, and airplane and helicopter blades often suffer from icing, which causes severe energy losses and impairs aerodynamic performance. The icing process is a complex phenomenon with many different causes and types. Icing mechanisms, distributions, and patterns remain relevant research topics. The adhesion strength between ice and a surface differs in different icing environments, which makes the task of anti-icing very challenging. Techniques for various icing environments must satisfy different demands and requirements (e.g., efficiency, light weight, low power consumption, low maintenance and manufacturing costs, reliable operation). It is noticeable that most methods are oriented toward a particular sector, and adapting or recommending them for other areas is quite problematic. These methods often use various technologies and have different specifications, sometimes with no clear indication of their efficiency. There are two major groups of anti-icing methods: passive and active. Active techniques have high efficiency but, at the same time, quite high energy consumption, and they require intervention in the structure's design. It is noticeable that the vast majority of these methods require specific knowledge and personnel skills. The main effect of passive methods (ice-phobic, superhydrophobic surfaces) is to delay ice formation and growth or to reduce the adhesion strength between the ice and the surface. These methods are time-consuming and depend on forecasting. They can be applied on small surfaces only for specific targets, and most are non-biodegradable (except for anti-freezing proteins). There is some quite promising information on ultrasonic ice mitigation methods that employ ultrasonic guided waves (UGW).
These methods have the characteristics of low energy consumption, low cost, light weight, and easy replacement and maintenance. However, fundamental knowledge of ultrasonic de-icing methodology is still limited. The objective of this work was to identify ice formation processes and their progress by employing the ultrasonic guided wave technique. Throughout this research, a universal set-up for acoustic measurement of ice formation under realistic conditions (temperature range from +24 °C to -23 °C) was developed. Ultrasonic measurements were performed using high-frequency 5 MHz transducers in a pitch-catch configuration. Wave modes suitable for detecting ice formation on a copper surface were selected, and the interaction between the selected wave modes and the ice formation processes was investigated. It was found that the selected wave modes are sensitive to temperature changes. It was demonstrated that the proposed ultrasonic technique can successfully detect ice layer formation on a metal surface.
Keywords: ice formation processes, ultrasonic GW, detection of ice formation, ultrasonic testing
Procedia PDF Downloads 65
4540 Modeling Intelligent Threats: Case of Continuous Attacks on a Specific Target
Authors: Asma Ben Yaghlane, Mohamed Naceur Azaiez
Abstract:
In this paper, we treat a model that falls in the area of protecting targeted systems from intelligent threats including terrorism. We introduce the concept of system survivability, in the context of continuous attacks, as the probability that a system under attack will continue operation up to some fixed time t. We define a constant attack rate (CAR) process as an attack on a targeted system that follows an exponential distribution. We consider the superposition of several CAR processes. From the attacker side, we determine the optimal attack strategy that minimizes the system survivability. We also determine the optimal strengthening strategy that maximizes the system survivability under limited defensive resources. We use operations research techniques to identify optimal strategies of each antagonist. Our results may be used as interesting starting points to develop realistic protection strategies against intentional attacks.
Keywords: CAR processes, defense/attack strategies, exponential failure, survivability
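The survivability definition above lends itself to a short numerical sketch. The following illustrative Python snippet is our own construction, not the authors' model: the function names and the simple "thinning" defense are assumptions. It computes the survival probability up to time t when several CAR processes with exponential inter-attack times are superposed, using the fact that a superposition of independent exponential-rate processes has the summed rate.

```python
import math

def survivability(rates, t):
    """P(no attack succeeds by time t) when each CAR process i delivers
    system-killing attacks at exponential rate rates[i]. The superposed
    process has rate sum(rates), so S(t) = exp(-sum(rates) * t)."""
    return math.exp(-sum(rates) * t)

def survivability_with_defense(rates, t, p_block):
    """Thinned variant: a defender blocks each incoming attack
    independently with probability p_block, reducing every rate by
    the factor (1 - p_block)."""
    return math.exp(-sum(r * (1.0 - p_block) for r in rates) * t)
```

Under this toy model, strengthening the defense (raising p_block) raises survivability monotonically, matching the abstract's framing of a defender maximizing S(t) under resource limits.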
Procedia PDF Downloads 396
4539 Sustainable Strategies for Post-Disaster Shelters: Case Study-Based Review and Future Prospects
Authors: Fangwen Ni, Hongpeng Xu
Abstract:
When disasters occur, it is important to provide temporary shelters to protect victims from the environment and to comfort them with privacy and dignity. However, commonly used shelters like tents and shanties cannot ensure comfortable conditions. Furthermore, the demand for more energy and less pollution has become a major challenge. Focusing on the sustainability of temporary shelters, this study clarifies the essential role of temporary shelters before reconstruction work is done. The paper also identifies the main problems in three aspects: spatial layout, thermal comfort, and utilization of passive technology. Moreover, it expounds passive strategies of ecological design through case studies and simulation. It is found that living conditions in shelters can be improved from the perspectives of architectural space, ventilation theory, and construction techniques. Regardless of being temporary, these shelters are crucial elements in emergency situations and should be taken more seriously.
Keywords: architectural space, construction technique, sustainable strategy, temporary shelter
Procedia PDF Downloads 273
4538 The Use of Degradation Measures to Design Reliability Test Plans
Authors: Stephen V. Crowder, Jonathan W. Lane
Abstract:
With short product development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases there may be few, if any, observed failures. Thus, it may be difficult to assess reliability using traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of the number of units placed on test and the duration of the test, necessary to demonstrate a reliability goal. In this work we present a case study involving an electronic component subject to degradation. The data, consisting of 42 degradation paths with cycles to failure, are first used to estimate a reliability function. Bootstrapping techniques are then used to perform power studies and develop a minimal reliability test plan for future production of this component.
Keywords: degradation measure, time to failure distribution, bootstrap, computational science
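As a rough illustration of the bootstrap step described above, the sketch below resamples a set of failure times with replacement and reports the spread of the resulting reliability estimates. This is a simplified stand-in for the paper's power studies; the function names and synthetic data are our own assumptions.

```python
import random
import statistics

def empirical_reliability(failure_cycles, t):
    """Fraction of units whose cycles-to-failure exceed t."""
    return sum(c > t for c in failure_cycles) / len(failure_cycles)

def bootstrap_reliability(failure_cycles, t, n_boot=2000, seed=0):
    """Resample the degradation-derived failure times with replacement
    and return the bootstrap mean and standard deviation of the
    reliability estimate at time t."""
    rng = random.Random(seed)
    n = len(failure_cycles)
    estimates = [
        empirical_reliability([rng.choice(failure_cycles) for _ in range(n)], t)
        for _ in range(n_boot)
    ]
    return statistics.mean(estimates), statistics.stdev(estimates)
```

The bootstrap standard deviation gives a direct handle on how many units and how much test time are needed before the reliability estimate is tight enough to demonstrate a goal.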
Procedia PDF Downloads 534
4537 Prediction of Endotracheal Tube Size in Children by Predicting Subglottic Diameter Using Ultrasonographic Measurement versus Traditional Formulas
Authors: Parul Jindal, Shubhi Singh, Priya Ramakrishnan, Shailender Raghuvanshi
Abstract:
Background: Knowledge of the influence of a child's age on laryngeal dimensions is essential for all practitioners dealing with the pediatric airway. Choosing the correct endotracheal tube (ETT) size is a crucial step in pediatric patients because a large-sized tube may cause complications like post-extubation stridor and subglottic stenosis. On the other hand, with a smaller tube there will be increased gas flow resistance, aspiration risk, poor ventilation, and inaccurate monitoring of end-tidal gases, and reintubation with a different size of tracheal tube may be required. Recent advancements in ultrasonography (USG) techniques should now allow accurate and descriptive evaluation of the pediatric airway. Aims and objectives: This study was planned to determine the accuracy of USG in assessing the appropriate ETT size and to compare it with physical-index-based formulae. Methods: After obtaining approval from the Institute's Ethical and Research Committee, and parental written informed consent, the study was conducted on 100 subjects of either sex between 12-60 months of age, undergoing various elective surgeries under general anesthesia requiring endotracheal intubation. The same experienced radiologist performed all ultrasonography. The transverse diameter was measured at the level of the cricoid cartilage by USG. After USG, general anesthesia was administered using the institute's standard techniques. An experienced anesthesiologist, unaware of the ultrasonography findings, performed the endotracheal intubations with an uncuffed endotracheal tube (Portex Tracheal Tube, Smiths Medical India Pvt. Ltd.) with a Murphy's eye. The tracheal tube was considered best fit if the air leak was satisfactory at 15-20 cm H₂O of airway pressure. The obtained values were compared with the endotracheal tube sizes calculated by ultrasonography, by various age-, height-, and weight-based formulas, and by the diameters of the right and left little fingers.
The correlation of endotracheal tube size across the different modalities was assessed, and Pearson's correlation coefficient was obtained. The mean endotracheal tube sizes by ultrasonography and by the traditional formulas were compared using Friedman's test and the Wilcoxon signed-rank test. Results: The predicted tube size was equal to the best fit and best determined by ultrasonography (100%), followed by comparison to the left little finger (98%), the right little finger (97%), and the age-based formula (95%), followed by the multivariate formula (83%) and the body length formula (81%). According to Pearson's correlation, there was a moderate correlation of the best-fit endotracheal tube with endotracheal tube size by the age-based formula (r=0.743), body-length-based formula (r=0.683), right-little-finger-based formula (r=0.587), left-little-finger-based formula (r=0.587), and multivariate formula (r=0.741). There was a strong correlation with ultrasonography (r=0.943). Ultrasonography was the most sensitive (100%) method of prediction, followed by comparison to the left (98%) and right (97%) little fingers and the age-based formula (95%); the multivariate formula had an even lower sensitivity (83%), whereas the body-length-based formula was the least sensitive at 78%. Conclusion: USG is a reliable method for estimating subglottic diameter and predicting ETT size in children.
Keywords: endotracheal intubation, pediatric airway, subglottic diameter, traditional formulas, ultrasonography
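For readers unfamiliar with the quantities being compared, the sketch below shows the widely taught age-based rule for uncuffed tubes (Cole's formula; the abstract does not state which age formula the study actually used, so this is an assumption) alongside a plain Pearson correlation, the statistic reported in the results.

```python
def ett_size_age_based(age_years):
    """Cole's age-based rule for uncuffed tubes, a common textbook
    choice (assumed here, not confirmed by the abstract):
    internal diameter (mm) = age / 4 + 4."""
    return age_years / 4.0 + 4.0

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two
    equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

With best-fit tube sizes in one list and formula predictions in another, `pearson_r` reproduces the kind of r values quoted above (e.g., r=0.943 for ultrasonography).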
Procedia PDF Downloads 241
4536 Multisensory Science, Technology, Engineering and Mathematics Learning: Combined Hands-on and Virtual Science for Distance Learners of Food Chemistry
Authors: Paulomi Polly Burey, Mark Lynch
Abstract:
It has been shown that laboratory activities can help cement understanding of theoretical concepts, but it is difficult to deliver such an activity to an online cohort, and issues such as occupational health and safety in the students' learning environment need to be considered. Chemistry, in particular, is one of the sciences where practical experience is beneficial for learning; however, typical university experiments may not be suitable for the learning environment of a distance learner. Food provides an ideal medium for demonstrating chemical concepts, and along with a few simple physical and virtual tools provided by educators, analytical chemistry can be experienced by distance learners. Food chemistry experiments were designed to be carried out in a home-based environment that 1) had sufficient scientific rigour and skill-building to reinforce theoretical concepts; 2) were safe for use at home by university students; and 3) had the potential to enhance student learning by linking simple hands-on laboratory activities with high-level virtual science. Two main components were developed: a home laboratory experiment component and a virtual laboratory component. For the home laboratory component, students were provided with laboratory kits, as well as a list of inexpensive supplementary chemical items that they could purchase from hardware stores and supermarkets. The experiments used were typical proximate analyses of food, as well as experiments focused on techniques such as spectrophotometry and chromatography. Written instructions for each experiment, coupled with video laboratory demonstrations, were used to train students in appropriate laboratory technique. Data that students collected in their home laboratory environments were collated across the class through shared documents, so that the group could carry out statistical analysis and gain a full laboratory experience from their own homes.
For the virtual laboratory component, students were able to view a laboratory safety induction and were advised on the characteristics of a good home laboratory space prior to carrying out their experiments. Following on from this activity, students observed laboratory demonstrations of the experimental series they would carry out in their learning environment. Finally, students were immersed in a virtual laboratory environment to experience complex chemical analyses with equipment that would be too costly and sensitive to be housed in their learning environment. To investigate the impact of the intervention, students were surveyed before and after the laboratory series to evaluate engagement and satisfaction with the course. Students were also assessed on their understanding of theoretical chemical concepts before and after the laboratory series to determine the impact on their learning. At the end of the intervention, focus groups were run to determine which aspects helped and hindered learning. It was found that the physical experiments helped students to understand laboratory technique, as well as methodology interpretation, particularly if they had not been in such a laboratory environment before. The virtual learning environment aided learning as it could be utilized for longer than a typical physical laboratory class, thus allowing more time for understanding techniques.
Keywords: chemistry, food science, future pedagogy, STEM education
Procedia PDF Downloads 170
4535 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contain enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a road map for future development of FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
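The outlier-removal idea described above, rejecting prediction points that fall outside the parameter space of the dynamic training window, can be sketched in a few lines. This is a simplified per-feature z-score gate of our own devising; FreqAI's actual implementation uses richer techniques, and the function names here are assumptions.

```python
import statistics

def zscore_bounds(train_col, k=3.0):
    """Acceptable range for one feature: mean +- k standard deviations
    of the training window (a crude proxy for the parameter space)."""
    mu = statistics.mean(train_col)
    sd = statistics.pstdev(train_col) or 1e-12  # guard constant columns
    return mu - k * sd, mu + k * sd

def filter_outliers(train_rows, new_rows, k=3.0):
    """Keep only prediction points whose every feature lies inside the
    training-data parameter space defined by zscore_bounds."""
    cols = list(zip(*train_rows))
    bounds = [zscore_bounds(c, k) for c in cols]
    return [r for r in new_rows
            if all(lo <= v <= hi for v, (lo, hi) in zip(r, bounds))]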
Procedia PDF Downloads 91
4534 Green Synthesis of Copper Oxide and Cobalt Oxide Nanoparticles Using Spinacia Oleracea Leaf Extract
Authors: Yameen Ahmed, Jamshid Hussain, Farman Ullah, Sohaib Asif
Abstract:
The investigation aims at the synthesis of copper oxide and cobalt oxide nanoparticles using Spinacia oleracea leaf extract. These nanoparticles have many properties and applications: they possess antimicrobial and catalytic properties, and they can be used in energy storage materials, gas sensors, etc. The Spinacia oleracea leaf extract behaves as a reducing agent in nanoparticle synthesis. The plant extract was first prepared and then treated with copper and cobalt salt solutions to obtain the precipitates. The salt solutions used for this purpose were copper sulfate pentahydrate (CuSO₄.5H₂O) and cobalt chloride hexahydrate (CoCl₂.6H₂O). UV-Vis, XRD, EDX, and SEM techniques were used to determine the optical, structural, and morphological properties of the copper oxide and cobalt oxide nanoparticles. The UV absorption peaks are at 326 nm and 506 nm for the copper oxide and cobalt oxide nanoparticles, respectively.
Keywords: cobalt oxide, copper oxide, green synthesis, nanoparticles
Procedia PDF Downloads 215
4533 The Effect of Technology on Advanced Automotive Electronics
Authors: Abanob Nady Wasef Moawed
Abstract:
In more complicated systems, such as automotive gearboxes, a rigorous treatment of the data is essential because there are several moving elements (gears, bearings, shafts, and so on), and thus there are numerous possible sources of error and noise. The fundamental goal of this work is the detection of damage in automotive gearboxes. The detection strategies used are the wavelet technique, the bispectrum, advanced filtering techniques (selective filtering) of vibration signals, and mathematical morphology. Gearbox vibration tests were performed (on gearboxes in good condition and with defects) on a production line of a large car assembler. The vibration signals were acquired using five accelerometers in different positions on the sample. The results obtained using kurtosis, the bispectrum, wavelets, and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
Keywords: 3D-shaped electronics, electronic components, thermoforming, component positioning, automotive gearbox, mathematical morphology, wavelet, bispectrum
Procedia PDF Downloads 35
4532 Method for Predicting the Deformation of a Swelling Clay of the Region of N'Gaous (Batna, in Algeria)
Authors: Ferrah F., Baheddi M.
Abstract:
This study relates to how water content in some clay soils affects their structure by increasing or decreasing their volume. These cyclic swelling-shrinkage phenomena cause parasitic stresses in structures and foundations. These stresses create damage in buildings, highways, pavements, airports, and lightly loaded structures. This study was conducted on soil from a site near the hospital of N'gaous (Batna), where the soil is the origin of cracks in the infill walls of the hospital. After a few years of service, and according to the findings of experts from the Subdivision of Construction and Urbanism (SUCH), cracks appeared just after the heavy rains that the region experienced in 1987. Our study shows the need to become aware of the importance of damage caused by swelling and to adopt construction techniques that address this problem. The aim of the study is to determine a methodology for taking the effects of swelling into account when designing foundations for the long term.
Keywords: clay, swelling, shrinkage, swelling pressure, compressibility
Procedia PDF Downloads 33
4531 Trace Network: A Probabilistic Relevant Pattern Recognition Approach to Attribution Trace Analysis
Authors: Jian Xu, Xiaochun Yun, Yongzheng Zhang, Yafei Sang, Zhenyu Cheng
Abstract:
Network attack prevention is a critical research area of information security. Network attacks would be deterred if attribution techniques were capable of tracing back to the attackers after a hacking event. Attributing these attacks to a particular identity therefore becomes one of the important tasks when analysts attempt to differentiate and profile the attacker behind a piece of attack trace. To assist analysts in exposing attackers behind the scenes, this paper investigates the connections between attribution traces and proposes probabilistic-relevance-based attribution patterns. This method facilitates the evaluation of the plausible relevance between different traceable identities. Furthermore, by analyzing the connections among traces, it can confirm the existence probability of a certain organization as well as discover its affiliated partners by means of a relevance matrix drawn from attribution traces.
Keywords: attribution trace, probabilistic relevance, network attack, attacker identification
Procedia PDF Downloads 368
4530 Analyzing the Evolution of Polythiophene Nanoparticles Optically, Structurally, and Morphologically as a SERS (Surface-Enhanced Raman Spectroscopy) Sensor for Pb²⁺ Detection in River Water
Authors: Temesgen Geremew
Abstract:
This study investigates the evolution of polythiophene nanoparticles (PThNPs) as surface-enhanced Raman spectroscopy (SERS) sensors for Pb²⁺ detection in river water. We analyze the PThNPs' optical, structural, and morphological properties at different stages of their development to understand their SERS performance. Techniques like UV-Vis spectroscopy, Fourier-transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM) are employed for characterization. The SERS sensitivity towards Pb²⁺ is evaluated by monitoring the peak intensity of a specific Raman band upon increasing metal ion concentration. The study aims to elucidate the relationship between the PThNPs' characteristics and their SERS efficiency for Pb²⁺ detection, paving the way for optimizing their design and fabrication for improved sensing performance in real-world environmental monitoring applications.
Keywords: polythiophene, Pb²⁺, SERS, nanoparticles
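The concentration-response step described above is, at its core, a calibration line. The hedged sketch below fits Raman peak intensity versus Pb²⁺ concentration by ordinary least squares and inverts the line to estimate an unknown level; the numbers and function names are invented for illustration, since the abstract reports no calibration coefficients.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration
    curve (here: SERS peak intensity vs. Pb2+ concentration)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def predict_concentration(intensity, slope, intercept):
    """Invert the calibration line to estimate an unknown concentration
    from a measured peak intensity."""
    return (intensity - intercept) / slope
```

In practice a real SERS calibration would also report linearity (R²) and a detection limit, but the invertible line above is the piece that turns a monitored peak intensity into a Pb²⁺ reading.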
Procedia PDF Downloads 58
4529 Generation of Quasi-Measurement Data for On-Line Process Data Analysis
Authors: Hyun-Woo Cho
Abstract:
To ensure the safety of a manufacturing process, one should quickly identify the assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from a major problem of small sample size, which is mostly attributed to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices: similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method is able to handle the insufficiency problem successfully. In addition, it is shown to be quite efficient in terms of computational speed and memory usage, and thus on-line implementation of the method for monitoring and diagnosis purposes is straightforward.
Keywords: data analysis, diagnosis, monitoring, process data, quality control
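The generation step can be pictured as interpolating between similar existing measurements. The snippet below is a deliberately simplified stand-in: the paper's actual similarity and importance indices are not specified in the abstract, so the nearest-neighbour convex combination and all function names here are our assumptions.

```python
import math
import random

def quasi_samples(data, n_new, alpha=0.5, seed=0):
    """Generate quasi-measurement rows by convexly combining a randomly
    chosen row with its nearest neighbour (a crude similarity proxy).
    Each new row lies between two real measurements, so it stays inside
    the observed operating region."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        base = rng.choice(data)
        nn = min((r for r in data if r is not base),
                 key=lambda r: math.dist(base, r))
        w = rng.uniform(0.0, alpha)
        out.append(tuple(b + w * (c - b) for b, c in zip(base, nn)))
    return out
```

Because every generated row is a convex combination of two real rows, the augmented dataset cannot drift outside the measured process envelope, which is the safety property one would want before feeding it to a monitoring model.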
Procedia PDF Downloads 484
4528 Attenuation Scale Calibration of an Optical Time Domain Reflectometer
Authors: Osama Terra, Hatem Hussein
Abstract:
Calibration of an Optical Time Domain Reflectometer (OTDR) is crucial for the accurate determination of the loss budget of long optical fiber links. In this paper, the calibration of the attenuation scale of an OTDR using two different techniques is discussed and implemented. The first technique is the external modulation (EM) method. A setup is proposed to calibrate an OTDR over a dynamic range of around 15 dB based on the EM method. Afterwards, the OTDR is calibrated using two standard reference fibers (SRF). Both SRFs were calibrated using the cut-back technique; one of them was calibrated at our home institute (the National Institute of Standards - NIS), while the other was calibrated at the National Physical Laboratory (NPL) of the United Kingdom to confirm our results. In addition, the parameters contributing to the calibration uncertainty are thoroughly investigated. Although the EM method has several advantages over the SRF method, the uncertainty of the SRF method is found to surpass that of the EM method.
Keywords: optical time domain reflectometer, fiber attenuation measurement, OTDR calibration, external source method
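The quantity both techniques ultimately calibrate is fiber loss in dB. As a quick reference (these are the elementary textbook definitions, not the paper's uncertainty analysis; the function names are ours), the cut-back computation looks like this:

```python
import math

def attenuation_db(p_in_mw, p_out_mw):
    """Link loss in dB from input/output optical powers, the quantity
    the cut-back technique measures on a standard reference fibre:
    A = 10 * log10(P_in / P_out)."""
    return 10.0 * math.log10(p_in_mw / p_out_mw)

def attenuation_coefficient(p_in_mw, p_out_mw, length_km):
    """Attenuation coefficient alpha = A / L in dB/km."""
    return attenuation_db(p_in_mw, p_out_mw) / length_km
```

An OTDR's attenuation scale is calibrated correctly when the dB differences it reads along a fibre agree, within the stated uncertainty, with values obtained this way from a reference measurement.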
Procedia PDF Downloads 466
4527 Optimization of the Transfer Molding Process by Implementation of Online Monitoring Techniques for Electronic Packages
Authors: Burcu Kaya, Jan-Martin Kaiser, Karl-Friedrich Becker, Tanja Braun, Klaus-Dieter Lang
Abstract:
The quality of molded packages is strongly influenced by the process parameters of transfer molding. To achieve better package quality and a stable transfer molding process, it is necessary to understand the influence of the process parameters on package quality. This work aims to comprehend the relationships between the process parameters and to identify the optimum process parameters for the transfer molding process in order to achieve fewer voids and less wire sweep. To achieve this, a design of experiments (DoE) is executed for process optimization and a regression analysis is carried out. A systematic approach is presented to generate models which enable an estimation of the number of voids and the wire sweep. Validation experiments are conducted to verify the model, and the results are presented.
Keywords: dielectric analysis, electronic packages, epoxy molding compounds, transfer molding process
Procedia PDF Downloads 383
4526 Development of a Biomechanical Method for Ergonomic Evaluation: Comparison with Observational Methods
Authors: M. Zare, S. Biau, M. Corq, Y. Roquelaure
Abstract:
A wide variety of observational methods have been developed to evaluate ergonomic workloads in manufacturing. However, the precision and accuracy of these methods remain a subject of debate. The aims of this study were to develop biomechanical methods to evaluate ergonomic workloads and to compare them with observational methods. Two observational methods, the SCANIA Ergonomic Standard (SES) and Rapid Upper Limb Assessment (RULA), were used to assess ergonomic workloads at two simulated workstations. These included four tasks, such as tightening and loosening, attachment of tubes, and strapping, as well as other actions. Sensors were also used to measure biomechanical data (inclinometers, accelerometers, and goniometers). Our findings showed that in the assessment of some risk factors, both RULA and SES were in agreement with the results of the biomechanical methods. However, there was disagreement on neck and wrist postures. In conclusion, the biomechanical approach was more precise than the observational methods, but some risk factors evaluated with the observational methods were not measurable with the biomechanical techniques developed.
Keywords: ergonomic, observational method, biomechanical methods, workload
Procedia PDF Downloads 389
4525 The Application of Simulation Techniques to Enhance Nitroglycerin Production Efficiency: A Case Study of the Military Explosive Factory in Nakhon Sawan Province
Authors: Jeerasak Wisatphan, Nara Samattapapong
Abstract:
This study's goals were to enhance nitroglycerin manufacturing efficiency through simulation, to recover nitroglycerin from the storage facility, and to improve the nitroglycerine recovery and purge systems. It was found that the problem was nitroglycerin reflux. The researcher therefore created three alternatives to solve the problem. The Nitroglycerine Recovery and Purge system was then simulated using the FlexSim program, and each alternative was tested. The results demonstrate that the alternative in which the Nitroglycerine Recovery and Nitroglycerine Purge systems operate together is more efficient than the other alternatives, can reduce production time, and can also improve the recovery of nitroglycerin. The model also serves as a guideline for developing a real-world system and for training staff without wasting raw chemical materials or fuel energy.
Keywords: efficiency increase, nitroglycerine recovery and purge system, production improvement, simulation
Procedia PDF Downloads 131
4524 Digital Forensics Showdown: Encase and FTK Head-to-Head
Authors: Rida Nasir, Waseem Iqbal
Abstract:
Due to the constant revolution in technology and the increase in anti-forensic techniques used by attackers to remove their traces, professionals often struggle to choose the best tool to use in digital forensic investigations. This paper compares two of the most well-known and widely used licensed commercial tools, EnCase and FTK. The comparison was drawn on various parameters and features to provide an authentic evaluation of the licensed versions of these tools against various real-world scenarios. To discover the popularity of these tools within the digital forensic community, a public survey was conducted to determine the preferred choice. The dataset used is the Computer Forensic Reference Data Sets (CFReDS). A total of 70 features were selected from various categories. Upon comparison, both FTK and EnCase produce remarkable results. However, each tool has some limitations, and neither tool is declared best. The comparison drawn is completely unbiased, based on factual data.
Keywords: digital forensics, commercial tools, investigation, forensic evaluation
Procedia PDF Downloads 23
4523 Hybrid Obfuscation Technique for Reverse Engineering Problem
Authors: Asma’a Mahfoud, Abu Bakar Md. Sultan, Abdul Azim Abd, Norhayati Mohd Ali, Novia Admodisastro
Abstract:
Obfuscation is the practice of making something difficult and complicated. Programming code is ordinarily obfuscated to protect intellectual property (IP) and prevent an attacker from reverse engineering (RE) a copyrighted software program. Obfuscation may involve encrypting some or all of the code, transforming out potentially revealing data, renaming useful class and variable (identifier) names to meaningless labels, or adding unused or meaningless code to an application binary. Recently, obfuscation techniques have not been performing effectively, as reversing tools are able to break the obfuscated code. In this paper we propose a hybrid obfuscation technique that combines three renaming approaches. Experimentation was conducted to test the effectiveness of the proposed technique. The experiments presented promising results: the reversing tools were not able to read the obfuscated code.
Keywords: intellectual property, obfuscation, software security, reverse engineering
Procedia PDF Downloads 149
4522 Automatic Classification for the Degree of Disc Narrowing from X-Ray Images Using CNN
Authors: Kwangmin Joo
Abstract:
An automatic lumbar vertebra detection and classification method is proposed for evaluating the degree of disc narrowing. Prior to classification, deep-learning-based segmentation is applied to detect each individual lumbar vertebra: M-net is applied to segment the five lumbar vertebrae, and fine-tuning of the segmentation is employed to improve its accuracy. Using the features extracted in the previous step, a clustering technique, k-means, is applied to estimate the degree of disc space narrowing under a four-grade scoring system. As a preliminary study, the techniques proposed in this research could help build an automatic scoring system to diagnose the severity of disc narrowing from X-ray images.
Keywords: disc space narrowing, degenerative disc disorders, deep learning based segmentation, clustering technique
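The grading step can be sketched with a plain k-means pass over a scalar disc-space feature. This is a minimal stand-in: the feature values below are invented, whereas the paper derives its features from the segmented vertebrae.

```python
import random

def kmeans_1d(values, k=4, iters=50, seed=0):
    """Plain 1-D k-means: group scalar features into k grade centers."""
    random.seed(seed)
    centers = sorted(random.sample(values, k))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest center
            groups[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        # move each center to the mean of its group (keep it if the group is empty)
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return sorted(centers)

# Hypothetical normalized disc-space heights, roughly four grades
heights = [0.9, 0.95, 1.0, 0.62, 0.6, 0.58, 0.33, 0.3, 0.05, 0.08, 0.92, 0.31]
grades = kmeans_1d(heights)
```

Each new case would then be graded by the index of its nearest center, yielding the four-grade score described above.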
Procedia PDF Downloads 126
4521 Joint Discrete Hartley Transform-Clipping for Peak to Average Power Ratio Reduction in Orthogonal Frequency Division Multiplexing System
Authors: Selcuk Comlekci, Mohammed Aboajmaa
Abstract:
Orthogonal frequency division multiplexing (OFDM) is a promising technique for modern wireless communication systems due to its robustness in multipath environments. The high peak-to-average power ratio (PAPR) of the transmitted signal is one of the major drawbacks of OFDM: a high PAPR degrades bit error rate (BER) performance and affects the linear operating range of the high-power amplifier (HPA). In this paper, we propose a DHT-Clipping reduction technique that lowers the high PAPR by combining the discrete Hartley transform (DHT) with clipping. Simulation results show that DHT-Clipping offers better PAPR reduction than DHT or clipping alone, and also better BER performance than clipping.
Keywords: ISI, cyclic prefix, BER, PAPR, HPA, DHT, subcarrier
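The two quantities involved are easy to state in code. The sketch below uses a toy 64-subcarrier QPSK symbol and an assumed 3 dB clipping ratio (not the paper's simulation setup) to measure PAPR before and after amplitude clipping:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

def clip(x, cr_db=3.0):
    """Amplitude clipping: cap |x| at cr_db above the RMS level, keeping phase."""
    threshold = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (cr_db / 20)
    mag = np.maximum(np.abs(x), 1e-12)          # avoid divide-by-zero
    return x * np.minimum(1.0, threshold / mag)

rng = np.random.default_rng(0)
symbols = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
x = np.fft.ifft(symbols)                        # time-domain OFDM symbol
print(papr_db(x), papr_db(clip(x)))             # clipping lowers the PAPR
```

In the hybrid scheme, the DHT stage spreads the subcarrier energy before the IFFT so that fewer samples exceed the clipping threshold, which is why the combination outperforms either stage alone.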
Procedia PDF Downloads 439
4520 Application of Lean Manufacturing in Brake Shoe Manufacturing Plant: A Case Study
Authors: Anees K. Ahamed, Aakash Kumar R. G., Raj M. Mohan
Abstract:
The main objective is to apply lean tools to identify and eliminate waste in and among the workstations, so as to improve process speed and quality. Of the seven wastes in the lean concept, we consider material movement, defects, and inventory, since these have the greatest impact on the performance measures. The layout was improved to reduce the movement of materials, and the reduction in movement among the workstations was quantified. Value stream mapping was used to identify waste. A cause-and-effect diagram and 5W analysis were used to identify the reasons for defects and to provide countermeasures. Some cycle time reduction techniques are also proposed to improve productivity. A lean audit check sheet was used to identify the current position of the plant and the gap to be closed to make it lean.
Keywords: cause and effect diagram, cycle time reduction, defects, lean, waste reduction
Procedia PDF Downloads 387
4519 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients
Authors: Karina Zaccari, Ernesto Cordeiro Marujo
Abstract:
This paper presents a machine learning (ML) approach to support meningitis diagnosis in patients at a children's hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce, as much as possible, the use of invasive procedures such as cerebrospinal fluid (CSF) collection. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain or other complaints from the patient. We tested a number of different ML algorithms, including Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest, and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy on training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in sparing some children expensive and painful procedures.
Keywords: machine learning, medical diagnosis, meningitis detection, pediatric research
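The decision-tree workflow can be sketched with scikit-learn in a few lines. The feature rows below are invented toy values (blood white-cell count, CRP, and a reported-fever flag), not the study's data, and the column choice is only illustrative:

```python
from sklearn.tree import DecisionTreeClassifier

# Invented toy rows: [blood WBC count, CRP, fever reported (0/1)]
X = [[22000, 110, 1], [21000, 95, 1], [8000, 5, 0],
     [7000, 8, 0], [23000, 120, 1], [6500, 4, 0]]
y = [1, 1, 0, 0, 1, 0]  # 1 = meningitis suspected, 0 = not

# Fit a decision tree and score a new (invented) patient
clf = DecisionTreeClassifier(random_state=0).fit(X, y)
pred = clf.predict([[20500, 100, 1]])[0]
```

In practice the tree's learned thresholds on non-invasive test results are what lets clinicians defer CSF collection to cases the model flags as likely positive.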
Procedia PDF Downloads 152
4518 Mean Monthly Rainfall Prediction at Benina Station Using Artificial Neural Networks
Authors: Hasan G. Elmazoghi, Aisha I. Alzayani, Lubna S. Bentaher
Abstract:
Rainfall is a highly non-linear phenomenon, which requires the application of powerful supervised data mining techniques for its accurate prediction. In this study, the artificial neural network (ANN) technique is used to predict mean monthly rainfall from 31 years of historical data (1977-2006) collected at the Benina station in Benghazi, and the results are compared against the observed values. The specific objective was to determine the best combination of weather variables to use as inputs for the ANN model. Several statistical parameters were calculated, and an uncertainty analysis of the results is also presented. The best ANN model was then applied to one year of data (2007) as a case study in order to evaluate the performance of the model. Simulation results reveal that the ANN technique is promising and can provide reliable estimates of rainfall.
Keywords: neural networks, rainfall, prediction, climatic variables
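The supervised fitting loop behind such a model can be sketched with a tiny one-hidden-layer network in NumPy. Everything below is synthetic: two invented, pre-scaled climatic inputs and a made-up rainfall target. It only illustrates the forward pass, backpropagation, and the falling training error:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented, pre-scaled climatic inputs (e.g. humidity, temperature) and target
X = rng.random((60, 2))
y = (0.7 * X[:, 0] - 0.3 * X[:, 1] + 0.1)[:, None]   # stand-in for scaled rainfall

# One hidden layer of 8 tanh units
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

losses, lr = [], 0.1
for _ in range(500):
    h = np.tanh(X @ W1 + b1)              # forward pass
    err = (h @ W2 + b2) - y
    losses.append(float(np.mean(err ** 2)))
    dh = (err @ W2.T) * (1 - h ** 2)      # backpropagate through tanh
    W2 -= lr * (h.T @ err) / len(X); b2 -= lr * err.mean(0)   # (1/2)-MSE gradients
    W1 -= lr * (X.T @ dh) / len(X); b1 -= lr * dh.mean(0)
```

Trying different input-column combinations in `X` and comparing the held-out error is, in miniature, the input-selection experiment the study performs.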
Procedia PDF Downloads 490
4517 Disability in the Course of a Chronic Disease: The Example of People Living with Multiple Sclerosis in Poland
Authors: Milena Trojanowska
Abstract:
Disability is a phenomenon whose meanings and definitions have evolved over the decades. This became the trigger for a project to answer the question of what disability constitutes in the course of an incurable chronic disease. The chosen research group is people living with multiple sclerosis (MS). The contextual phase of the research was participant observation at the Polish Multiple Sclerosis Society, the largest NGO in Poland supporting people living with MS and their relatives. The research techniques used in the project are (in order of implementation): group interviews with people living with MS and their relatives, narrative interviews, an asynchronous technique, and participant observation during events organised for people living with MS and their relatives. The researcher is currently conducting follow-up interviews, as inaccuracies in the respondents' narratives were identified during data analysis. Interviews and supplementary research techniques were used over the four years of the research, and the researcher also drew on experience gained from 12 years of working with NGOs (diaries, notes). The research was carried out in Poland, with the participation of people living in this country only. The research has been based on grounded theory methodology in the constructivist perspective developed by Kathy Charmaz. The goal was to follow the idea that research must be reliable, original, and useful. The aim was to construct an interpretive theory that assumes the temporality and processuality of social life. The Atlas.ti software was used to collect the research material and analyse it.
It is a program from the CAQDAS (Computer-Assisted Qualitative Data Analysis Software) group. Several key factors influencing the construction of a disability identity by people living with multiple sclerosis were identified:
- the course of interaction with significant relatives,
- the expectation of identification with disability (expressed by close relatives),
- economic profitability (pension, allowances),
- institutional advantages (e.g. a parking card),
- independence and autonomy (equated not with physical condition but with access to adapted infrastructure and resources supporting daily functioning),
- the way a person with MS construes the meaning of disability,
- physical and mental state,
- the medical diagnosis of the illness.
In addition, it has been shown that making an assumption about the experience of disability in the course of MS is a form of cognitive reductionism leading to further phenomena, such as the expectation that the person with MS construct a social identity as a person with a disability (e.g. by giving up work) and the occurrence of institutional inequalities. It can also be a determinant of the choice of a life strategy that limits social and individual functioning, even when this necessity is not dictated by the person's physical or psychological condition. The results of the research are important for the development of knowledge about the phenomenon of disability. They indicate the contextuality and complexity of the disability phenomenon, which in the light of this research is a set of different phenomena of heterogeneous nature and multifaceted causality. This knowledge can also be useful for institutions and organisations in the non-governmental sector supporting people with disabilities and people living with multiple sclerosis.
Keywords: disability, multiple sclerosis, grounded theory, Poland
Procedia PDF Downloads 108
4516 Cognition Technique for Developing a World Music
Authors: Haider Javed Uppal, Javed Yunas Uppal
Abstract:
In today's globalized world, it is necessary to develop a form of music that can evoke equal emotional responses among people from diverse cultural backgrounds. Indigenous cultures throughout history have developed their own music cognition, specifically in terms of the connections between music and mood. With advances in artificial intelligence technologies, it has become possible to analyze and categorize music features such as timbre, harmony, melody, and rhythm, and to relate them to the resulting mood effects experienced by listeners. This paper presents a model that utilizes a screenshot translator to convert music from different origins into waveforms, which are then analyzed using machine learning and information retrieval techniques. By connecting these waveforms with Thayer's matrix of moods, a mood classifier has been developed using fuzzy logic algorithms to determine the emotional impact of different types of music on listeners from various cultures.
Keywords: cognition, world music, artificial intelligence, Thayer's matrix
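A fuzzy quadrant lookup on a Thayer-style valence-arousal plane can be sketched as follows. The mood labels, the ramp membership function, and the [-1, 1] input scale are all assumptions for illustration, not the paper's actual classifier:

```python
def membership_high(x):
    """Ramp membership: degree to which x in [-1, 1] counts as 'high'."""
    return max(0.0, min(1.0, (x + 1) / 2))

def classify_mood(valence, arousal):
    """Fuzzy scores for the four quadrants of a Thayer-style mood plane."""
    v_hi, a_hi = membership_high(valence), membership_high(arousal)
    scores = {
        "exuberant": min(v_hi, a_hi),          # positive valence, high arousal
        "anxious":   min(1 - v_hi, a_hi),      # negative valence, high arousal
        "content":   min(v_hi, 1 - a_hi),      # positive valence, low arousal
        "depressed": min(1 - v_hi, 1 - a_hi),  # negative valence, low arousal
    }
    return max(scores, key=scores.get)

print(classify_mood(0.8, 0.6))  # exuberant
```

In the full model, the valence and arousal inputs would themselves be estimated from the extracted timbre, harmony, melody, and rhythm features rather than supplied directly.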
Procedia PDF Downloads 82
4515 Recombination Center Levels in Gold and Platinum Doped N-type Silicon for High-Speed Thyristor
Authors: Nam Chol Yu, GyongIl Chu, HoJong Ri
Abstract:
Using deep-level transient spectroscopy (DLTS) measurements, we determined the dominant recombination center levels (defects A and B) in gold- and platinum-doped n-type silicon. The injection and temperature dependence of the Shockley-Read-Hall (SRH) carrier lifetime was also studied under low-level and high-level injection. The measurements show that the dominant level under low-level injection, located at EC-0.25 eV (A), is correlated with Pt+G1, and the dominant level under high-level injection, located at EC-0.54 eV (B), is correlated with Au+G4. A and B are thus the dominant levels controlling the lifetime in gold- and platinum-doped n-type silicon.
Keywords: recombination center level, lifetime, carrier lifetime control, gold, platinum, silicon
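The injection dependence referred to here follows the standard single-level SRH expression. A numeric sketch, with invented capture lifetimes and densities rather than the paper's measured values, shows the two limits:

```python
def srh_lifetime(dn, n0, p0, n1, p1, tau_n0, tau_p0):
    """Single-level SRH lifetime vs excess carrier density dn:
    tau = [tau_p0*(n0+n1+dn) + tau_n0*(p0+p1+dn)] / (n0+p0+dn)"""
    return (tau_p0 * (n0 + n1 + dn) + tau_n0 * (p0 + p1 + dn)) / (n0 + p0 + dn)

# Invented n-type example: n0 = 1e16 cm^-3, negligible p0, small n1/p1
low  = srh_lifetime(1e10, 1e16, 0.0, 1e12, 1e12, 2e-6, 1e-6)   # dn << n0
high = srh_lifetime(1e20, 1e16, 0.0, 1e12, 1e12, 2e-6, 1e-6)   # dn >> n0
# low-level limit -> tau_p0; high-level limit -> tau_n0 + tau_p0
```

The two limits explain why different levels can dominate in the two regimes: the measured lifetime is governed by minority-carrier capture at low injection and by the sum of both capture lifetimes at high injection.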
Procedia PDF Downloads 71
4514 Design, Development by Functional Analysis in UML and Static Test of a Multimedia Voice and Video Communication Platform on IP for a Use Adapted to the Context of Local Businesses in Lubumbashi
Authors: Blaise Fyama, Elie Museng, Grace Mukoma
Abstract:
In this article, we present a Java implementation of video telephony using the Session Initiation Protocol (SIP). After a functional analysis of the SIP protocol, we relied on the work of researchers at the University of Parma, Italy, to acquire adequate libraries for the development of our own communication tool. In order to optimize the code and improve the prototype, we used, in an incremental approach, test techniques based on static analysis: evaluating the complexity of the software through metrics, including McCabe's cyclomatic number. The objective is to promote the emergence of local start-ups producing IP video in a well-understood local context. The result is a video telephony tool whose code is optimized.
Keywords: static analysis, McCabe cyclomatic complexity metric, SIP, UML
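McCabe's cyclomatic number, the complexity metric used here, is computed from the control-flow graph as M = E - N + 2P (edges, nodes, connected components), or equivalently as one plus the number of decision points in a single routine. A small sketch, with invented graph counts for illustration:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's metric for a control-flow graph: M = E - N + 2P."""
    return edges - nodes + 2 * components

def complexity_from_decisions(decisions):
    """Equivalent shortcut for one routine: one path plus one per decision point."""
    return decisions + 1

# A routine whose CFG has 9 edges and 8 nodes (one connected component):
# M = 9 - 8 + 2 = 3, i.e. the routine contains 2 decision points
print(cyclomatic_complexity(9, 8))
```

In an incremental test approach, methods whose M exceeds an agreed threshold are the first candidates for refactoring and for additional test paths.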
Procedia PDF Downloads 121
4513 Gamification: A Guideline to Design an Effective E-Learning
Authors: Rattama Rattanawongsa
Abstract:
As technologies continue to develop and evolve, online learning has become one of the most popular ways of gaining access to learning. Worldwide, growing numbers of students are engaging in both online and blended courses through e-learning. However, online learning is a form of teaching that has many benefits for learners but still has some limitations; high student attrition rates tend to be due to a lack of motivation to succeed. Gamification is the use of game design techniques, game thinking, and game mechanics in non-game contexts, such as learning. Gamifying can motivate students to learn with fun and inspire them to continue learning. This paper aims to describe how gamification works in the context of learning. The first part of the paper presents the concept of gamification. The second part describes the psychological perspectives on gamification, especially motivation and flow theory, as they relate to gamified design. The results of this study are distilled into guidelines for effective learning design using gamification concepts.
Keywords: gamification, e-learning, motivation, flow theory
Procedia PDF Downloads 525
4512 Forecasting the Temperature at a Weather Station Using Deep Neural Networks
Authors: Debneil Saha Roy
Abstract:
Weather forecasting is a complex topic and is well suited to analysis by deep learning approaches. With the wide availability of weather observation data nowadays, these approaches can be used to draw immediate comparisons between historical weather forecasts and current observations. This work explores the application of deep learning techniques to weather forecasting in order to accurately predict the weather over a given forecast horizon. Three deep neural networks are used in this study: a Multi-Layer Perceptron (MLP), a Long Short-Term Memory network (LSTM), and a combination of a Convolutional Neural Network (CNN) and an LSTM. The predictive performance of these models is compared using two evaluation metrics. The results show that forecasting accuracy increases with the complexity of the deep neural network.
Keywords: convolutional neural network, deep learning, long short term memory, multi-layer perceptron
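All three architectures consume the same kind of supervised windows built from the temperature series. A minimal sketch of that preprocessing step, where the synthetic diurnal series is invented and the lookback and horizon are assumed values:

```python
import numpy as np

def make_windows(series, lookback=24, horizon=1):
    """Turn a temperature series into (input window, target) pairs for training."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i:i + lookback])                 # last `lookback` readings
        y.append(series[i + lookback + horizon - 1])     # value `horizon` steps ahead
    return np.array(X), np.array(y)

# Synthetic hourly temperatures with a rough diurnal cycle
temps = 15 + 10 * np.sin(np.linspace(0, 8 * np.pi, 200))
X, y = make_windows(temps)   # X feeds the MLP/LSTM/CNN+LSTM; y is the target
```

Increasing `horizon` lengthens the forecast lead time, which is exactly the axis along which the three models' accuracies are compared.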
Procedia PDF Downloads 178