Search results for: user identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5003

1943 Uncovering Hidden Bugs: An Exploratory Approach

Authors: Sagar Jitendra Mahendrakar

Abstract:

Exploratory testing is a dynamic and adaptable method of software quality assurance that is frequently praised for its ability to find hidden flaws and improve overall product quality. In contrast to scripted testing methodologies, which rely on preset test cases, exploratory testing lets testers explore the software application dynamically, drawing primarily on tester intuition, creativity, and adaptability. Several tools and techniques that can aid testers in the exploratory testing process are discussed in this talk. Tests of this kind are able to find bugs that are harder to find during structured testing or that other testing methods may have overlooked. The purpose of this abstract is to examine the nature and importance of exploratory testing in modern software development practice. It explores the fundamental ideas of exploratory testing, highlighting the value of domain knowledge and tester experience in spotting possible problems that may escape the notice of traditional testing methodologies. Throughout the software development lifecycle, exploratory testing promotes quick feedback loops and continuous improvement by giving testers the ability to make decisions in real time based on their observations. This abstract also clarifies the distinctive features of exploratory testing, such as its non-linearity and its capacity to replicate user behavior in real-world settings. Through impromptu exploration, testers can find intricate bugs, usability problems, and edge cases that might otherwise go undetected. Exploratory testing's flexible and iterative structure fits well with agile and DevOps processes, allowing a quicker time to market without sacrificing the quality of the final product.

Keywords: exploratory, testing, automation, quality

Procedia PDF Downloads 41
1942 Degradation of Acetaminophen with Fe3O4 and Fe2+ as Activator of Peroxymonosulfate

Authors: Chaoqun Tan, Naiyun Gao, Xiaoyan Xin

Abstract:

Peroxymonosulfate (PMS)-based oxidation processes are an increasingly popular alternative to hydrogen peroxide-based oxidation processes because of the reactive radical species (SO4-•, OH•) produced in these systems. Magnetic nano-scaled Fe3O4 particles and the ferrous ion (Fe2+) were studied as activators of PMS for the degradation of acetaminophen (APAP) in water. The Fe3O4 MNPs were found to effectively catalyze PMS for APAP degradation, and the reactions followed a pseudo-first-order kinetics pattern well (R2 > 0.95), while the degradation of APAP in the PMS-Fe2+ system proceeds through two stages: a fast stage and a much slower stage. Within 5 min, approximately 7% and 18% degradation of 10 ppm APAP was achieved by 0.2 mM PMS in the Fe3O4 (0.8 g/L) and Fe2+ (0.1 mM) activation processes, respectively. However, as the reaction proceeded to 120 min, approximately 75% and 35% of APAP was removed in the Fe3O4 and Fe2+ activation processes, respectively. Within 120 min, the mineralization of APAP was about 7.5% and 5.0% (initial APAP of 10 ppm and [PMS]0 of 0.2 mM) in the Fe3O4-PMS and Fe2+-PMS systems, while the mineralization could be greatly increased to about 31% and 40% as [PMS]0 increased to 2.0 mM in the Fe3O4-PMS and Fe2+-PMS systems, respectively. Finally, the production of reactive radical species was validated directly by Electron Paramagnetic Resonance (EPR) tests with 0.1 M 5,5-dimethyl-1-pyrroline N-oxide (DMPO). Plausible mechanisms for radical generation from Fe3O4 and Fe2+ activation of PMS are proposed based on the results of these radical identification tests. The results demonstrate that Fe3O4-activated and Fe2+-activated PMS systems are promising technologies for treating water pollution caused by contaminants such as pharmaceuticals; the Fe3O4-PMS system is more suitable for slow remediation, while the Fe2+-PMS system is more suitable for fast remediation.
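The pseudo-first-order fit reported above can be sketched in a few lines: with ln(C0/C) = k·t, the rate constant k falls out of a least-squares line through the origin. The data points below are hypothetical, chosen only to be roughly consistent with the reported ~75% APAP removal at 120 min; they are not the paper's measurements.

```python
import math

def pseudo_first_order_k(times, concentrations, c0):
    """Estimate the pseudo-first-order rate constant k (min^-1) by
    fitting ln(C0/C) = k*t with a least-squares line through the origin."""
    y = [math.log(c0 / c) for c in concentrations]
    return sum(t * v for t, v in zip(times, y)) / sum(t * t for t in times)

# Hypothetical data consistent with ~75% removal of 10 ppm APAP in 120 min
times = [5, 30, 60, 120]          # min
conc = [9.3, 7.0, 5.0, 2.5]       # ppm
k = pseudo_first_order_k(times, conc, 10.0)
```

The fitted k then lets the two activation routes be compared on a common kinetic footing.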

Keywords: acetaminophen, peroxymonosulfate, radicals, Fe3O4

Procedia PDF Downloads 249
1941 TimeTune: Personalized Study Plans Generation with Google Calendar Integration

Authors: Chevon Fernando, Banuka Athuraliya

Abstract:

The purpose of this research is to provide a solution to students' time management, which often becomes an issue because students must balance their studies with personal commitments. "TimeTune," an AI-based study planner that optimizes study timeframes by combining modern machine learning algorithms with calendar applications, is presented as a solution. The research focuses on the development of LSTM models that connect to the Google Calendar API to generate learning schedules fitted to a student's unique daily life and study history. A key finding of this research is the success in building an LSTM model to predict optimal study times which, integrated with real-time Google Calendar data, generates personalized, customized timetables automatically. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the stress associated with poor study habits and time management. In conclusion, "TimeTune" represents an advanced step in personalized education technology: its innovative application of ML algorithms and calendar integration promises students a better-balanced academic and personal life and less stress in managing their studies.
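The abstract describes merging model-predicted study windows with live calendar data. A minimal sketch of just the calendar-merging step is below: subtracting busy events from a candidate study window to obtain free slots into which study blocks can be placed. The function name, the event times, and the hour-as-float representation are illustrative assumptions; the paper's LSTM prediction step is omitted entirely.

```python
def free_slots(window, busy):
    """Subtract busy calendar events from a study window (hours as floats).
    Returns the free intervals in which study blocks can be scheduled."""
    start, end = window
    slots, cursor = [], start
    for b_start, b_end in sorted(busy):
        if b_start > cursor:
            slots.append((cursor, min(b_start, end)))
        cursor = max(cursor, b_end)
        if cursor >= end:
            break
    if cursor < end:
        slots.append((cursor, end))
    return slots

# Evening window 17:00-22:00 with two hypothetical calendar events
slots = free_slots((17.0, 22.0), [(18.0, 19.0), (20.5, 21.0)])
# slots -> [(17.0, 18.0), (19.0, 20.5), (21.0, 22.0)]
```

In a real integration, the busy list would come from the Google Calendar events endpoint rather than literals.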

Keywords: personalized learning, study planner, time management, calendar integration

Procedia PDF Downloads 39
1940 Encoded Fiber Optic Sensors for Simultaneous Multipoint Sensing

Authors: C. Babu Rao, Pandian Chelliah

Abstract:

Owing to their reliability, a number of fluorescence-spectra-based fiber optic sensors have been developed for the detection and identification of hazardous chemicals such as explosives and narcotics. In high-security areas, such as airports, it is important to monitor multiple locations simultaneously. This calls for deployment of a portable sensor at each location. However, the selectivity and sensitivity of these techniques depend on the spectral resolution of the spectral analyzer: the better the resolution, the larger the repertoire of chemicals that can be detected. A portable unit has limitations in meeting these requirements. Optical fibers can be employed to collect and transmit the spectral signal from the portable sensor head to a sensitive central spectral analyzer (CSA). For multipoint sensing, optical multiplexing of multiple sensor heads with the CSA has to be adopted. However, with multiplexing, while one sensor head is connected to the CSA, the rest may remain unconnected for the turn-around period; the larger the number of sensor heads, the larger this turn-around time will be. To circumvent this limitation, we propose in this paper an optical encoding methodology that allows multiple portable sensor heads to be connected to a single CSA. Each portable sensor head is assigned a unique address. The spectra of every chemical detected through a sensor head are encoded by its unique address and can be identified at the CSA end. The proposed methodology is demonstrated through a simulation using MATLAB Simulink.
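The address-encoding idea can be illustrated with a toy software model: each sensor head tags its spectrum with a unique address, and the CSA demultiplexes the combined stream back into per-head spectra. The framing scheme below is purely illustrative (the paper's encoding is optical, demonstrated in MATLAB Simulink); the names and data are assumptions.

```python
def encode(address, spectrum):
    """Tag a spectrum with its sensor head's unique address
    (a hypothetical framing scheme; the paper encodes addresses optically)."""
    return [("ADDR", address)] + [("DATA", v) for v in spectrum]

def decode(stream):
    """Central spectral analyzer: route frames back to per-head spectra."""
    spectra, current = {}, None
    for kind, value in stream:
        if kind == "ADDR":
            current = value
            spectra.setdefault(current, [])
        else:
            spectra[current].append(value)
    return spectra

# Two hypothetical sensor heads sharing one channel to the CSA
stream = encode("head-1", [0.2, 0.9]) + encode("head-2", [0.4, 0.1])
spectra = decode(stream)
# spectra -> {"head-1": [0.2, 0.9], "head-2": [0.4, 0.1]}
```

Because every frame carries its origin, no head has to wait for a dedicated turn-around slot.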

Keywords: optical encoding, fluorescence, multipoint sensing

Procedia PDF Downloads 706
1939 The Role of DNA Evidence in Determining Paternity in India: A Study of Cases from the Legal and Scientific Perspective

Authors: Pratyusha Das

Abstract:

A paradigm shift has been noticed in the interpretation of DNA evidence for determining paternity: such evidence has sometimes been accepted and sometimes rejected by the Indian courts, which have offered various legal and scientific justifications for both outcomes. Laws have also been changed to accommodate the necessities of society. A balance between the legal and scientific approaches is required to make the best possible use of DNA evidence for the well-being of society. Specifications should be framed as to when such evidence can be used in the future, weighing its pros and cons, and the judicial trend must be traced to establish the present situation. This study of cases from the superior courts of India, using an analytical and theoretical approach, pursues questions about the shared identity of the legal and scientific approaches. To assimilate the differences between the two approaches, their basic differences have to be formulated. The decisions favorable to the use of DNA evidence need to be examined, and the reasons for unfavorable decisions, and the approach preferred in such cases, identified. The outcome of the two methods has to be assessed in relation to the parties to the dispute, society at large, the researcher, and the judiciary, and the dependability of the two methods studied in relation to the justice delivery system. A chronological study of cases, together with the changes in the laws and the aid of presumptions, will address when each method is necessary according to the facts and situation, and whether the legal and scientific forces converge, pushing the traditional identification of paternity towards a fundamental change.

Keywords: cases, evidence, legal, scientific

Procedia PDF Downloads 239
1938 Effect of BaO-Bi₂O₃-P₂O₅ Glass Additive on Structural and Dielectric Properties of BaTiO₃ Ceramics

Authors: El Mehdi Haily, Lahcen Bih, Mohammed Azrour, Bouchaib Manoun

Abstract:

The effects of xBi₂O₃-yBaO-zP₂O₅ (BBP) glass addition on the sintering, structural, and dielectric properties of BaTiO₃ (BT) ceramic are studied. The BT ceramic was synthesized by the conventional solid-state reaction method, while the BaO-Bi₂O₃-P₂O₅ (BBP) glasses were elaborated by a melt-quenching process. Different BT-xBBP composites were formed by mixing the BBP glasses with BT ceramic. For each glass composition, where the ratio (x:y:z) is held constant, we developed three composites with different glass weight percentages (x = 2.5, 5, and 7.5 wt%). Addition of the glass enables better sintering at lower temperatures owing to the presence of a liquid phase at the respective sintering temperatures; the results showed that the sintering temperature decreased from more than 1300°C to 900°C. Density measurements of the composites were performed using the standard Archimedes method with water as the immersion liquid. It is found that their density decreases and their molar volume increases with glass content. Raman spectroscopy is used to characterize their structure; this technique has allowed the identification of the different phosphate structural units and the characteristic vibration modes of BT. The electrical properties of the composite samples were measured by impedance spectroscopy in the frequency range of 10 Hz to 1 MHz at temperatures from 300 to 473 K. The results show that the dielectric properties depend both on the glass content in the composite and on the Bi/P ratio in the glasses.
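The Archimedes density measurement used above reduces to one formula: the buoyant mass loss on immersion equals the mass of displaced liquid, so ρ = m_dry·ρ_liquid / (m_dry - m_submerged). The masses below are hypothetical, for illustration only.

```python
def archimedes_density(m_dry, m_submerged, rho_liquid=0.9982):
    """Density (g/cm^3) from the Archimedes method: the buoyant loss
    m_dry - m_submerged equals the mass of displaced liquid.
    Default rho_liquid is water at ~20 C."""
    return m_dry * rho_liquid / (m_dry - m_submerged)

# Hypothetical masses for a BT-BBP composite pellet weighed in water
rho = archimedes_density(m_dry=2.500, m_submerged=2.075)
```

Dividing the molar mass of the composite by this density then gives the molar volume whose trend with glass content is reported.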

Keywords: phosphate, glasses, composite, Raman spectroscopy, dielectric properties

Procedia PDF Downloads 157
1937 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter

Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal

Abstract:

Regulatory bodies have proposed limits on particulate matter (PM) concentration in air; however, these limits do not explicitly incorporate the toxicities of the individual constituents of PM. This study aimed to provide a structured approach to incorporating the toxic effects of components when developing regulatory limits on PM. A four-step human health risk assessment framework consists of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents, information on the size and shape of PM, and the fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at every step were obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment. The identified data gaps were: (1) the concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of PM toxicity with its components.
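Step 4 of the framework can be sketched numerically. The hazard quotient is HQ = exposure dose / reference dose, and summing component-specific HQs into a hazard index assumes additive risk, which is an assumption of this sketch, not a claim from the abstract. The constituents, doses, and reference doses below are hypothetical placeholders.

```python
def hazard_quotient(daily_dose, reference_dose):
    """Step 4 of the framework: HQ = exposure dose / reference dose.
    HQ > 1 flags a potential non-carcinogenic health concern."""
    return daily_dose / reference_dose

def pm_hazard_index(constituent_doses, reference_doses):
    """Sum component-specific HQs into a hazard index for the PM mixture
    (an additive-risk assumption, not stated in the abstract)."""
    return sum(hazard_quotient(constituent_doses[c], reference_doses[c])
               for c in constituent_doses)

# Hypothetical inhalation doses and reference doses (mg/kg-day)
# for two PM2.5 metal constituents
doses = {"Pb": 2e-5, "Ni": 1e-5}
rfds = {"Pb": 4e-4, "Ni": 2e-4}
hi = pm_hazard_index(doses, rfds)
```

Inverting this calculation, i.e. finding the PM concentration at which the hazard index reaches 1, is one way a component-aware limit could be derived.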

Keywords: air, component-specific toxicity, human health risks, particulate matter

Procedia PDF Downloads 304
1936 Separation, Identification, and Measuring Gossypol in the Cottonseed Oil and Investigating the Performance of Drugs Prepared from the Combination of Plant Extract and Oil in the Treatment of Cutaneous Leishmaniasis Resistant to Drugs

Authors: Sara Taghdisi, M. Mirmohammadi, M. Mokhtarian

Abstract:

In 2013, the World Health Organization estimated the number of cutaneous leishmaniasis infections in Iran at between 69,000 and 113,000. The most common chemical drugs for cutaneous leishmaniasis treatment are sodium stibogluconate and meglumine antimoniate, which not only have relatively many side effects but to which some species of the Leishmania genus have also become resistant. The most prominent compound in the various parts of the cotton plant is a yellow polyphenol called gossypol, an extremely valuable compound with anti-cancer properties. In the current project, gossypol was extracted by a liquid-liquid extraction method over 120 minutes in the presence of phosphoric acid from the cottonseed oil of the Golestan beach variety, then crystallized in darkness using acetic acid and isolated as gossypol acetic acid. The yield of the extracted crystal was 1.28 ± 0.12. The cotton plant could thus be useful in the treatment of cutaneous leishmaniasis. The extract of the green-leaf cotton boll of the Jargoyeh variety was tested as an ointment on a target group of patients suffering from drug-resistant cutaneous leishmaniasis by our colleagues in the research team. The results showed a Pearson's correlation coefficient of 0.72 between the two variables of wound diameter and extract use over time, which indicated the positive effect of this extract on the treatment of drug-resistant cutaneous leishmaniasis.
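The reported correlation between wound diameter and extract use is the standard Pearson coefficient, computable directly from the two series. The weekly numbers below are hypothetical, constructed so the wound shrinks as treatment proceeds; with this coding the toy r comes out strongly negative, whereas the sign of the paper's reported 0.72 depends on how "extract use" was coded.

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical weekly wound diameters (mm) over six weeks of ointment use
weeks_of_use = [1, 2, 3, 4, 5, 6]
wound_diameter = [12, 11, 9, 9, 7, 5]
r = pearson_r(weeks_of_use, wound_diameter)
```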

Keywords: cottonseed oil, crystallization, gossypol, green-leaf

Procedia PDF Downloads 101
1935 Development of an Integrated Route Information Management Software

Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel

Abstract:

The need for complete automation of every procedure of surveying, and especially of its engineering applications, cannot be overemphasized, given the many demerits of the conventional manual or analogue approach. This paper presents a summary of the development of a Route Information Management (RIM) software package. The software, codenamed 'AutoROUTE', was written in Visual Basic using Microsoft Visual Studio, and it offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road stretching from Dama to Lunko village in Minna, Niger State, acquired with a Hi-Target DGPS receiver. The developed software is capable of computing the various simple curve parameters, horizontal curves, and vertical curves, and it can also plot the road alignment, longitudinal profile, and cross-sections, with the capability to store these in the SQL database incorporated through Visual Basic. The plans plotted with AutoROUTE were compared with plans produced with the conventional AutoCAD Civil 3D software; AutoROUTE proved more user-friendly and more precise because it plots to three decimal places, whereas AutoCAD plots to two. It was also found that AutoROUTE is faster in plotting, and the stages involved are less cumbersome compared to AutoCAD Civil 3D.
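The "simple curve parameters" such software computes follow from textbook formulas for a circular curve of radius R and deflection angle D: tangent length T = R·tan(D/2), curve length L = R·D (D in radians), long chord C = 2R·sin(D/2), and external distance E = R(sec(D/2) - 1). The sketch below (Python, since AutoROUTE's internals are not published) applies them to hypothetical values.

```python
import math

def simple_curve(radius, deflection_deg):
    """Simple circular curve parameters used in route surveying."""
    d = math.radians(deflection_deg)
    return {
        "tangent": radius * math.tan(d / 2),        # T = R tan(D/2)
        "length": radius * d,                       # L = R D
        "chord": 2 * radius * math.sin(d / 2),      # C = 2R sin(D/2)
        "external": radius * (1 / math.cos(d / 2) - 1),  # E = R(sec(D/2)-1)
    }

# Hypothetical curve: R = 300 m, deflection angle 40 degrees
c = simple_curve(radius=300.0, deflection_deg=40.0)
```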

Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying

Procedia PDF Downloads 139
1934 Korean Smart Cities: Strategic Foci, Characteristics and Effects

Authors: Sang Ho Lee, Yountaik Leem

Abstract:

This paper reviews Korean smart city cases through an analysis framework of strategic foci, characteristics, and effects. Firstly, the national strategies, including the c(cyber)-, e(electronic)-, u(ubiquitous)- and s(smart)-Korea strategies, were considered from a strategic angle. Secondly, the characteristics of smart cities in Korea were examined through examples such as Seoul, Busan, Songdo, and Sejong, using STIM (Service, Technology, Infrastructure, and Management) analysis. Finally, the effects of smart cities on socio-economies were investigated from an industrial perspective using the input-output model and structural path analysis. Korean smart city strategies revealed different strategic foci: the c-Korea strategy focused on building information and communications networks and on user IT literacy; the e-Korea strategy encouraged e-government and e-business by utilizing the high-speed information and communications network; the u-Korea strategy delivered ubiquitous services as well as integrated information and communication operations centers; and the s-Korea strategy is propelling a fourth-industrial-revolution platform. Smart cities in Korea showed their own features and trends, such as eco-intelligence, high-efficiency and low-cost-oriented IoT, the citizen-sensing city, and the big-data city. Smart city progress created new production chains fostering ICTs (Information and Communication Technologies) and knowledge-based intermediate inputs to industries.
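The input-output analysis mentioned above rests on the Leontief model: with technical-coefficient matrix A and final demand d, total output solves x = A·x + d, i.e. x = (I - A)^-1 d. A two-sector sketch with entirely hypothetical coefficients (not the paper's data) is:

```python
def leontief_output(A, demand):
    """Solve x = A x + d, i.e. x = (I - A)^-1 d, for a 2-sector economy
    using the closed-form 2x2 inverse (illustrative numbers only)."""
    (a11, a12), (a21, a22) = A
    # Coefficients of (I - A)
    m11, m12, m21, m22 = 1 - a11, -a12, -a21, 1 - a22
    det = m11 * m22 - m12 * m21
    d1, d2 = demand
    return ((m22 * d1 - m12 * d2) / det, (m11 * d2 - m21 * d1) / det)

# Sector 1 = ICT, sector 2 = other industries (hypothetical coefficients)
x = leontief_output(A=[(0.2, 0.1), (0.3, 0.4)], demand=(100.0, 200.0))
```

Structural path analysis then decomposes the entries of (I - A)^-1 into chains of inter-industry links, which is how new ICT-driven production chains can be traced.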

Keywords: Korean smart cities, Korean smart city strategies, STIM, smart service, infrastructure, technologies, management, effect of smart city

Procedia PDF Downloads 364
1933 Parkinson’s Disease Detection Analysis through Machine Learning Approaches

Authors: Muhtasim Shafi Kader, Fizar Ahmed, Annesha Acharjee

Abstract:

Machine learning and data mining are crucial in health care, including medical information processing and disease detection. Machine learning approaches are now being utilized to improve the detection of a variety of critical health issues, including diabetes, neural cell tumors, COVID-19, and so on. Parkinson's disease primarily affects senior citizens in Bangladesh. Its indications are typically progressive and worsen with time: as the condition advances, affected people have trouble walking and communicating, and patients can also experience psychological and social changes, sleep problems, depression, memory loss, and weariness. Parkinson's disease occurs in both men and women, though men are affected at a higher rate than women. In this research, we aim to identify the most accurate ML algorithm for detecting the disease from a given dataset. Nine ML classifiers are compared in this study: Naive Bayes, Adaptive Boosting, Bagging Classifier, Decision Tree Classifier, Random Forest Classifier, XGB Classifier, K-Nearest Neighbor Classifier, Support Vector Machine Classifier, and Gradient Boosting Classifier.
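To make one of the nine classifiers concrete, here is a from-scratch k-nearest-neighbor classifier on a toy two-feature dataset. The features (jitter, shimmer) and values are hypothetical stand-ins for the voice measurements commonly used in Parkinson's datasets, not the paper's data.

```python
def knn_predict(train, labels, point, k=3):
    """k-nearest-neighbor classifier: vote among the k closest
    training points by squared Euclidean distance."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, point)), lab)
        for row, lab in zip(train, labels)
    )
    votes = [lab for _, lab in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical voice features (jitter, shimmer); 1 = Parkinson's, 0 = healthy
X = [(0.01, 0.04), (0.02, 0.05), (0.09, 0.30), (0.08, 0.25)]
y = [0, 0, 1, 1]
pred = knn_predict(X, y, point=(0.085, 0.28), k=3)
# pred -> 1
```

Each of the other eight classifiers would be evaluated the same way on held-out data, and the accuracies compared.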

Keywords: naive bayes, adaptive boosting, bagging classifier, decision tree classifier, random forest classifier, XGB classifier, k nearest neighbor classifier, support vector classifier, gradient boosting classifier

Procedia PDF Downloads 124
1932 Deep Learning-Based Automated Structure Deterioration Detection for Building Structures: A Technological Advancement for Ensuring Structural Integrity

Authors: Kavita Bodke

Abstract:

Structural health monitoring (SHM) is a growing field, necessitating the development of distinct methodologies to address its expanding scope effectively. In this study, we developed automatic structural damage identification covering three distinct threats to a building's structural integrity: fractures (cracks) within the structure, dampness within the structure, and corrosion inside the structure. The study employs image classification techniques to discern between intact and impaired structures. The aim of this research is automatic damage detection that yields the probability of each damage class being present in an image; based on these probabilities, we know which class is more likely, or which damage type affects the structure more than the others. Photographs captured by a mobile camera serve as the input to the image classification system. Both multi-class and multi-label classification were performed, with the objective of categorizing structural data based on the presence of cracks, moisture, and corrosion. For multi-class image classification, our study employed three distinct methodologies: Random Forest, Multilayer Perceptron, and CNN. For the multi-label image classification task, the models employed were ResNet, Xception, and Inception.
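The difference between the two task formulations comes down to how per-class probabilities are read out: multi-class assumes mutually exclusive classes (argmax), while multi-label gives each damage type its own probability and reports every class over a threshold. The sketch below shows only this readout step, with hypothetical per-image probabilities; the networks themselves are omitted.

```python
def multiclass_pick(probs):
    """Multi-class: classes are mutually exclusive, pick the argmax."""
    return max(probs, key=probs.get)

def multilabel_pick(probs, threshold=0.5):
    """Multi-label: each damage type has its own probability; report every
    class over the threshold, ranked so the most affected comes first."""
    hits = [(p, c) for c, p in probs.items() if p >= threshold]
    return [c for p, c in sorted(hits, reverse=True)]

# Hypothetical per-image sigmoid outputs for the three damage types
probs = {"crack": 0.91, "dampness": 0.62, "corrosion": 0.18}
labels = multilabel_pick(probs)
# labels -> ["crack", "dampness"]
```

The ranking by probability is what lets one say which damage class affects a given image most, as described above.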

Keywords: SHM, CNN, deep learning, multi-class classification, multi-label classification

Procedia PDF Downloads 28
1931 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method

Authors: Angel G. De Leon Hernandez

Abstract:

A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles, and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic, and maintainable system, considering the constraints imposed in each case. With the advances in technology during the last decades, the capability for solving engineering problems has increased enormously, and computers now play a critical role in structural analysis. Unfortunately, for university students the vast majority of this software is inaccessible due to its high complexity and cost, even when manufacturers offer student versions. This is exactly why the idea of developing a more accessible and easy-to-use computing tool arose. This program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument for achieving a better understanding of this area while avoiding tedious calculations. The program can also be useful for graduate engineers in the field of structural design and analysis. A graphical user interface is included to make the program even simpler to operate and to clarify the information requested and the results obtained. The present document includes the theoretical basis on which the program solves the structural analysis, the logical path followed in developing the program, the theoretical results, a discussion of the results, and their validation.
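The core of the stiffness matrix method for 2D trusses is the 4x4 global stiffness matrix of each bar element, k = (EA/L)·[[c², cs, -c², -cs], [cs, s², -cs, -s²], [-c², -cs, c², cs], [-cs, -s², cs, s²]], with c and s the direction cosines; these element matrices are then assembled into the global system. A sketch of the element matrix (in Python rather than the paper's MATLAB, with hypothetical section properties) is:

```python
import math

def bar_stiffness(E, A, x1, y1, x2, y2):
    """Global 4x4 stiffness matrix of a 2D truss (bar) element,
    k = (EA/L) * [[B, -B], [-B, B]] with B = [[c*c, c*s], [c*s, s*s]]."""
    L = math.hypot(x2 - x1, y2 - y1)
    c, s = (x2 - x1) / L, (y2 - y1) / L
    base = [[c * c, c * s], [c * s, s * s]]
    k = E * A / L
    top = [row + [-v for v in row] for row in base]
    bot = [[-v for v in row] + row for row in base]
    return [[k * v for v in row] for row in top + bot]

# Horizontal bar: E = 200 GPa, A = 1e-3 m^2, length 2 m
K = bar_stiffness(E=200e9, A=1e-3, x1=0, y1=0, x2=2, y2=0)
```

Assembling these matrices node by node and solving K·u = F for the free displacements yields member forces, which is the computation such a program automates.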

Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming

Procedia PDF Downloads 115
1930 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise; the main focus of this paper is on security issues, particularly information security. People's data is especially sensitive in the healthcare industry: if important information gets leaked (e.g., IC number, credit card number, address), a person's identity, financial status, and safety might be compromised. The responsible organization then loses a lot of money compensating the affected people, with even more resources expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient's data, while the data itself is encrypted and the resulting ciphertext is stored on a cloud storage platform. Due to the nature of blockchain technology, the key records are tamper-proof, and their read-only contents can be accessed only by authorized users such as doctors and nurses; this guarantees the confidentiality and immutability of the patient's data. Some issues remain to be emphasized and tackled in future work, such as a multi-user scheme, authentication, and migrating the backend processes onto the blockchain network.
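The tamper-evidence the framework relies on comes from hash chaining: each block's hash covers its body, and each body records the previous block's hash, so altering any stored record breaks the chain. The toy sketch below stores only key-reference metadata (the ciphertext itself would live in cloud storage, per the framework); all field names and values are hypothetical, and real blockchains add consensus and signatures omitted here.

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Minimal hash-chained block holding key-reference metadata."""
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"body": body, "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Tampering with any stored body changes its hash and/or breaks
    the prev-hash link to the next block."""
    for prev, block in zip(chain, chain[1:]):
        if json.loads(block["body"])["prev"] != prev["hash"]:
            return False
    return all(hashlib.sha256(b["body"].encode()).hexdigest() == b["hash"]
               for b in chain)

genesis = make_block({"patient": "P001", "key_id": "k-17"}, prev_hash="0")
b1 = make_block({"patient": "P002", "key_id": "k-18"}, genesis["hash"])
chain = [genesis, b1]
```

Editing any record after the fact makes `chain_is_valid` return False, which is the immutability property the abstract appeals to.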

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 175
1929 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms

Authors: Sagri Sharma

Abstract:

Analysis of diseases integrating multiple factors increases the complexity of the problem; the development of frameworks for the analysis of diseases is therefore currently a topic of intense research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective, and newer methodologies are being sought to deal with the problem. Supervised learning algorithms are commonly used for performing prediction on previously unseen data. These algorithms are applied in fields ranging from image analysis to protein structure and function prediction; they are trained on a known dataset to produce a predictor model that generates reasonable predictions in response to new data. Gene expression profiles generated by DNA analysis experiments can be quite complex, since these experiments can involve hypotheses spanning entire genomes. The well-known machine learning algorithm Support Vector Machine (SVM) is thus applied to analyze the expression levels of thousands of genes simultaneously in a timely, automated, and cost-effective way. The objectives of the presented work are the development of a methodology to identify genes relevant to Hepatocellular Carcinoma (HCC) from gene expression datasets utilizing supervised learning algorithms and statistical evaluation, along with the development of a predictive framework that can perform classification tasks on new, unseen data.
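The train-then-classify workflow described above can be sketched with a minimal linear SVM trained by sub-gradient descent on the regularized hinge loss. This is a toy stand-in for the production SVM libraries such work would actually use, on a two-gene hypothetical dataset (real expression profiles have thousands of features); all names and numbers are illustrative.

```python
def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Minimal linear SVM (no bias term) trained by batch sub-gradient
    descent on the regularized hinge loss; labels must be +1 / -1."""
    w = [0.0] * len(X[0])
    n = len(X)
    for _ in range(epochs):
        grad = [lam * wj for wj in w]          # regularization term
        for xi, yi in zip(X, y):
            # Hinge loss sub-gradient for points inside the margin
            if yi * sum(wj * xj for wj, xj in zip(w, xi)) < 1:
                grad = [g - yi * xj / n for g, xj in zip(grad, xi)]
        w = [wj - lr * g for wj, g in zip(w, grad)]
    return w

def predict(w, x):
    """Classify a new, unseen sample by the sign of the decision value."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Toy expression levels of two genes; +1 = HCC, -1 = normal (hypothetical)
X = [(2.0, 1.8), (2.2, 2.1), (-1.0, -0.5), (-1.2, -0.9)]
y = [1, 1, -1, -1]
w = train_linear_svm(X, y)
```

With the trained weight vector, unseen samples are classified by the sign of w·x, which is exactly the "predictive framework" role the abstract describes.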

Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine

Procedia PDF Downloads 423
1928 A Low Order Thermal Envelope Model for Heat Transfer Characteristics of Low-Rise Residential Buildings

Authors: Nadish Anand, Richard D. Gould

Abstract:

A simplified model is introduced for determining the thermal characteristics of a low-rise residential (LRR) building and then predicting the energy usage of its heating, ventilation and air conditioning (HVAC) system as weather conditions, reflected in the ambient (outside air) temperature, change. The LRR building is treated as a simple lump for solving the heat transfer problem, and the model is derived using the lumped capacitance model of transient conduction heat transfer. Most contemporary HVAC systems have a thermostat control with an offset temperature and user-defined set-point temperatures, which determine when the HVAC system switches on and off. The aim is to predict the body temperature (i.e., the inside air temperature) accurately enough to estimate this switching of the HVAC system. To validate the mathematical model derived from lumped capacitance, we used the EnergyPlus simulation engine, which simulates buildings with considerable accuracy. Through the low-order model we predicted the inside air temperature of a single house placed in three different climate zones (Detroit, Raleigh, and Austin) and in different orientations for the summer and winter seasons. The prediction error of the model, for the same day as that of the model parameter calculation, was below 10% in winter for almost all orientations and climate zones, whereas in summer the prediction error was below 10% for all orientations only in the climate zones at higher latitudes (Raleigh and Detroit). Possible factors responsible for the large variations are also noted in the work, paving the way for future research.
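The lumped capacitance dynamic with thermostat switching can be sketched as a single first-order ODE, dT/dt = -(T - T_out)/tau + q, integrated with forward Euler, where q is the heating input while the HVAC is on and the set point plus offset defines the deadband. All numbers below (time constant, heating rate, temperatures) are hypothetical, not fitted parameters from the paper.

```python
def simulate(t_out, t0, tau, set_point=20.0, offset=1.0,
             heat_rate=10.0, dt=0.1, steps=600):
    """Lumped-capacitance indoor temperature with a thermostat deadband:
    heating turns on below set_point - offset and off above
    set_point + offset. Units: hours and degrees C (hypothetical).
    Returns the temperature trace and total HVAC on-time (h)."""
    T, on, on_time = t0, False, 0.0
    trace = [T]
    for _ in range(steps):
        if T < set_point - offset:
            on = True
        elif T > set_point + offset:
            on = False
        q = heat_rate if on else 0.0
        T += dt * (-(T - t_out) / tau + q)   # forward Euler step
        on_time += dt * on
        trace.append(T)
    return trace, on_time

# Hypothetical winter condition: 5 C outside, 2 h building time constant
trace, on_time = simulate(t_out=5.0, t0=18.0, tau=2.0)
```

The trace settles into the on/off cycling around the deadband, and the accumulated on-time is exactly the quantity from which HVAC energy usage would be estimated.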

Keywords: building energy, energy consumption, energy+, HVAC, low order model, lumped capacitance

Procedia PDF Downloads 264
1927 Integrating Road Safety into Mainstreaming Education and Other Initiatives with Holistic Approach in the State: A Case Study of Madhya Pradesh, India

Authors: Yogesh Mahor, Subhash Nigam, Abhai Khare

Abstract:

Road safety education is a composite subject which should be viewed holistically, taking into account behavior change communication, safe road infrastructure, and law enforcement. Specific and customized road safety education is crucial for each type of road user and for learners in formal and informal teaching and in the various kinds of training programs directly sponsored by the state and central governments, as these learners are active contributors to shaping a community of responsible citizens. The aim of this discussion article is to explore a strategy for integrating road safety education into the formal curriculum of schools, higher education institutions, driving schools, skill development centers, and various government-funded urban and rural development training institutions, and into their work plans as a standing agenda. Applying the desktop research method, the article conceptualizes what the focus of road safety education and training should be. The article then explores common international practices in road safety education and training and considers the necessary synergy between education, road engineering, and law enforcement. The article uses secondary data collected from documents, which are then analyzed sector by sector. A well-designed road safety strategy for mainstream education and government-sponsored training is urgently needed, facilitating partnerships across sectors to deliver such education to students and learners in multidisciplinary ways.

Keywords: road safety education, curriculum-based road safety education, behavior change communication, law enforcement, road engineering, safe system approach, infrastructure development consultants

Procedia PDF Downloads 120
1926 Estimating the Receiver Operating Characteristic Curve from Clustered Data and Case-Control Studies

Authors: Yalda Zarnegarnia, Shari Messinger

Abstract:

Receiver operating characteristic (ROC) curves have been widely used in medical research to illustrate the performance of a biomarker in correctly distinguishing diseased and non-diseased groups. Correlated biomarker data arise in study designs that include subjects who share genetic or environmental factors. Information about this correlation might help to identify family members at increased risk of disease development, and may lead to initiating treatment to slow or stop progression to disease. Approaches appropriate to a case-control design matched by family identification must accommodate the correlation inherent in the design when estimating the biomarker’s ability to differentiate between cases and controls, as well as handle estimation from a matched case-control design. This talk will review some developed methods for ROC curve estimation in settings with correlated data from a case-control design and will discuss the limitations of current methods for analyzing correlated familial paired data. An alternative approach using conditional ROC curves will be demonstrated to provide appropriate ROC curves for correlated paired data. The proposed approach uses the information about the correlation among biomarker values, producing conditional ROC curves that evaluate the ability of a biomarker to discriminate between diseased and non-diseased subjects in a familial paired design.
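For reference, the standard pooled empirical ROC curve and its AUC can be sketched as below. This unadjusted version ignores the within-family correlation that the conditional ROC approach is designed to handle, and the function names are illustrative:

```python
def roc_points(scores_diseased, scores_healthy):
    """Empirical ROC curve: sweep a threshold over all observed biomarker
    values and record (false-positive rate, true-positive rate) pairs,
    calling a subject positive when its score meets the threshold."""
    thresholds = sorted(set(scores_diseased) | set(scores_healthy),
                        reverse=True)
    pts = [(0.0, 0.0)]
    for t in thresholds:
        tpr = sum(s >= t for s in scores_diseased) / len(scores_diseased)
        fpr = sum(s >= t for s in scores_healthy) / len(scores_healthy)
        pts.append((fpr, tpr))
    return pts

def auc(points):
    """Area under the empirical ROC curve via the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
```

A perfectly separating biomarker yields AUC = 1.0, while overlapping identical distributions yield 0.5; the conditional approach discussed in the talk refines this pooled estimate for familial paired data.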

Keywords: biomarker, correlation, familial paired design, ROC curve

Procedia PDF Downloads 232
1925 Overcoming Usability Challenges of Educational Math Apps: Designing and Testing a Mobile Graphing Calculator

Authors: M. Tomaschko

Abstract:

The integration of technology in educational settings has gained a lot of interest, and the use of mobile devices and accompanying mobile applications in particular offers great potential to complement traditional education with new technologies and enrich students’ learning in various ways. Nevertheless, the usability of the deployed mathematics application is a decisive factor in exploiting the full potential of technology-enhanced learning, because cognitive load directed toward operating an application will likely inhibit effective learning. For this reason, the purpose of this research study is the identification of possible usability issues of the mobile GeoGebra Graphing Calculator application. To this end, eye tracking in combination with task scenarios, the think-aloud method, and a SUS questionnaire were used. Based on the revealed usability issues, the mobile application was iteratively redesigned and re-assessed in order to verify the success of the usability improvements. In this paper, the identified usability issues are presented, and recommendations on how to overcome these concerns are provided. The main findings relate to the conception of a mathematics keyboard and the interaction design of an equation editor, as well as the representation of geometrical construction tools. In total, 12 recommendations were formulated to improve the usability of a mobile graphing calculator application. The benefit to be gained from this research study is not only the improvement of the usability of the existing GeoGebra Graphing Calculator application but also a set of helpful hints that can be considered by designers and developers of mobile math applications.
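The SUS questionnaire used in the study is scored with a fixed, well-known formula: positively worded odd-numbered items contribute their response minus one, negatively worded even-numbered items contribute five minus their response, and the summed contributions are scaled to 0-100. A minimal sketch:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Items 1, 3, 5, 7, 9 (index 0, 2, ... here) are positively worded and
    contribute (response - 1); items 2, 4, 6, 8, 10 are negatively worded
    and contribute (5 - response). The 0-40 sum is scaled to 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

An all-neutral questionnaire (every answer 3) scores 50, the conventional midpoint against which the redesigned application's scores could be compared.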

Keywords: GeoGebra, graphing calculator, math education, smartphone, usability

Procedia PDF Downloads 131
1924 Bring Your Own Device Security Model in a Financial Institution of South Africa

Authors: Michael Nthabiseng Moeti, Makhulu Relebogile Langa, Joey Jansen van Vuuren

Abstract:

This paper examines the use of personal electronic devices such as laptops, tablets, and smartphones for professional duties within a financial organization, a phenomenon known as bring your own device (BYOD). BYOD gives employees the freedom to use their personal devices to access corporate resources from anywhere in the world with Internet access. However, BYOD arrangements introduce significant security risks for both organizations and users. These setups change the threat landscape for enterprises and demand unique security strategies, as conventional tools tailored to safeguarding managed devices fall short of adequately protecting enterprise assets without active user cooperation. This paper applies protection motivation theory (PMT) to highlight behavioral risks from BYOD users that may impact the security of financial institutions. Thematic analysis was applied to gain a comprehensive understanding of how users perceive this phenomenon. The findings demonstrate that the existence of a security policy does not ensure that all employees will take measures to protect their personal devices; active promotion of BYOD security policies is crucial for financial institution employees and management. The paper develops a BYOD security model that is useful for understanding compliant behaviors, which is important given that BYOD security is becoming a major concern across the financial sector. The paper recommends that future research expand the number of institutions from which data is collected.

Keywords: BYOD, information security, protection motivation theory, security risks, thematic analysis

Procedia PDF Downloads 22
1923 Flame Volume Prediction and Validation for Lean Blowout of Gas Turbine Combustor

Authors: Ejaz Ahmed, Huang Yong

Abstract:

The operation of aero engines is of critical importance in the vicinity of lean blowout (LBO) limits. Lefebvre’s model of LBO, based on empirical correlation, has been extended by the authors to a flame volume (Vf) concept. The flame volume takes into account the effects of geometric configuration and the complex spatial interaction of mixing, turbulence, heat transfer and combustion processes inside the gas turbine combustion chamber. For these reasons, flame volume based LBO predictions are more accurate. Although LBO prediction accuracy has improved, the approach poses the challenge of estimating Vf in real gas turbine combustors. This work extends the flame volume prediction approach, previously based on fuel iterative approximation with cold flow simulations, to reactive flow simulations. Flame volume for 11 combustor configurations has been simulated and validated against experimental data. To make the prediction methodology robust, as required in the preliminary design stage, reactive flow simulations were carried out with the combination of a probability density function (PDF) and discrete phase model (DPM) in FLUENT 15.0. A criterion for flame identification was defined. Two important parameters, the critical injection diameter (Dp,crit) and the critical temperature (Tcrit), were identified, and their influence on the reactive flow simulation was studied for Vf estimation. The obtained results exhibit ±15% error in Vf estimation relative to experimental data.

Keywords: CFD, combustion, gas turbine combustor, lean blowout

Procedia PDF Downloads 261
1922 Exploring the Issue of Occult Hypoperfusion in the Pre-Hospital Setting

Authors: A. Fordham, A. Hudson

Abstract:

Background: Studies have suggested that 16-25% of normotensive trauma patients with no clinical signs of shock have abnormal lactate and base deficit (BD) readings evidencing shock, a phenomenon known as occult hypoperfusion (OH). In light of the scarce evidence currently documenting OH, this study aimed to identify the prevalence of OH in the pre-hospital setting and explore ways to improve its identification and management. Methods: A quantitative retrospective data analysis was carried out on 75 sets of patient records for trauma patients treated by Kent Surrey Sussex Air Ambulance Trust (KSS HEMS) between November 2013 and October 2014. The KSS HEMS notes and subsequent ED notes were collected. Trends between patients’ SBP on scene, whether or not they received PRBCs on scene, and lactate and BD readings in the ED were assessed. The KSS HEMS notes written by the HEMS crew were also reviewed and recorded. Results: Suspected OH was identified in 7% of the patients who did not receive PRBCs in the pre-hospital phase, and SBP heavily influenced the physicians’ decision whether or not to transfuse PRBCs in the pre-hospital phase. Preliminary conclusions: OH is an under-studied and underestimated phenomenon. We suggest a prospective trial be carried out to evaluate whether detecting trauma patients’ tissue perfusion status in the pre-hospital phase, using portable devices capable of measuring serum BD and/or lactate, could aid more accurate detection and management of all haemorrhaging trauma patients, including patients with OH.

Keywords: occult hypoperfusion, PRBC transfusion, point of care testing, pre-hospital emergency medicine, trauma

Procedia PDF Downloads 356
1921 Optimization of a Hand-Fan Shaped Microstrip Patch Antenna by Means of Orthogonal Design Method of Design of Experiments for L-Band and S-Band Applications

Authors: Jaswinder Kaur, Nitika, Navneet Kaur, Rajesh Khanna

Abstract:

A hand-fan shaped microstrip patch antenna (MPA) for L-band and S-band applications is designed, and its characteristics have been investigated. The proposed microstrip patch antenna with a double U-slot defected ground structure (DGS) is fabricated on an FR4 substrate, a readily available and inexpensive material. The suggested antenna is optimized using the Orthogonal Design Method (ODM) of Design of Experiments (DOE) to cover the frequency range of 0.91-2.82 GHz for L-band and S-band applications. The L-band covers the frequency range of 1-2 GHz, which is allocated to telemetry, aeronautical, and military systems for passive satellite sensors, weather radars, radio astronomy, and mobile communication. The S-band covers the frequency range of 2-3 GHz, which is used by weather radars, surface ship radars and communication satellites and is also reserved for various wireless applications such as Worldwide Interoperability for Microwave Access (Wi-MAX), super high frequency radio frequency identification (SHF RFID), the industrial, scientific and medical (ISM) bands, Bluetooth, wireless broadband (Wi-Bro) and wireless local area network (WLAN). The proposed method of optimization is very time efficient and accurate compared to conventional evolutionary algorithms due to its statistical strategy. Moreover, the antenna was tested, followed by a comparison of simulated and measured results.
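The efficiency of the Orthogonal Design Method stems from orthogonal arrays such as Taguchi's standard L9(3^4), which tests four three-level factors in nine runs instead of the 81 of a full factorial, while keeping every pair of factors balanced. The sketch below uses illustrative parameter names; the abstract does not list the paper's actual design factors:

```python
# Standard Taguchi L9(3^4) orthogonal array (levels coded 0-2): nine runs,
# four factors, with every pair of columns containing all nine level
# combinations exactly once.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

def experiment_plan(levels_per_factor):
    """Map the coded L9 array onto physical parameter values.

    levels_per_factor: four 3-item lists of candidate values, e.g. for
    patch length, slot width, feed position and substrate height
    (illustrative names, not the paper's actual factors).
    """
    return [tuple(levels_per_factor[f][lvl] for f, lvl in enumerate(run))
            for run in L9]
```

Each of the nine planned runs would then be simulated (or measured), and factor effects estimated from the balanced results, which is what makes the statistical strategy so much cheaper than an evolutionary search.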

Keywords: design of experiments, hand fan shaped MPA, L-Band, orthogonal design method, S-Band

Procedia PDF Downloads 126
1920 Authenticity of Lipid and Soluble Sugar Profiles of Various Oat Cultivars (Avena sativa)

Authors: Marijana M. Ačanski, Kristian A. Pastor, Djura N. Vujić

Abstract:

The identification of lipid and soluble sugar components was performed in flour samples of different cultivars belonging to the common oat species (Avena sativa L.): spring oat, winter oat and hulless oat. Fatty acids were extracted from the flour samples with n-hexane and derivatized into volatile methyl esters using TMSH (trimethylsulfonium hydroxide in methanol). Soluble sugars were then extracted from the defatted and dried samples of oat flour with 96% ethanol and further derivatized into the corresponding TMS-oximes using hydroxylamine hydrochloride solution and BSTFA (N,O-bis-(trimethylsilyl)-trifluoroacetamide). The hexane and ethanol extracts of each oat cultivar were analyzed using a GC-MS system. Lipid and simple sugar compositions are very similar across all samples of the investigated cultivars. A chemometric tool was applied to the numeric values of the automatically integrated surface areas of the detected lipid and simple sugar components in their corresponding derivatized forms. Hierarchical cluster analysis shows a very high similarity between the investigated flour samples of oat cultivars according to fatty acid content (0.9955), and a moderate similarity according to the content of soluble sugars (0.50). These preliminary results support the idea of establishing methods for oat flour authentication, and provide the means for distinguishing oat flour samples, regardless of variety, from flour samples made of other cereal species by lipid and simple sugar profile analysis alone.
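The reported similarity values (0.9955 for fatty acids, 0.50 for soluble sugars) are correlation-type coefficients between profiles of integrated peak areas. As an illustration only (the abstract does not specify the exact metric used), a Pearson correlation between two peak-area profiles, the kind of pairwise similarity that hierarchical clustering operates on, can be computed as:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length profiles of
    integrated peak areas (one value per detected component)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A matrix of such pairwise coefficients between cultivar profiles is the typical input from which a hierarchical clustering dendrogram is built.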

Keywords: oat cultivars, lipid composition, soluble sugar composition, GC-MS, chemometrics, authentication

Procedia PDF Downloads 291
1919 Enhance Indoor Environment in Buildings and Its Effect on Improving Occupant's Health

Authors: Imad M. Assali

Abstract:

Recently, the world’s main problems have been global warming and climate change, which affect both outdoor and indoor environments, especially air quality (AQ), as a result of the vast migration of people from rural to urban areas. Cities have become more crowded and denser through irregular population increase, and the accompanying urbanization has caused many problems for the environment, such as rising land prices, changes in lifestyle, and new buildings that are not adapted to the climate, producing uncomfortable and unhealthy indoor conditions. Interior environments are the places that create the most intimate relationship with the user; consequently, the indoor environment quality (IEQ) of buildings has become uncomfortable and unhealthy for occupants. The symptoms commonly associated with a poor indoor environment include itchy eyes, headache, fatigue, and respiratory complaints such as cough and congestion; these symptoms tend to improve over time or even disappear when people are away from the building. Therefore, designing a healthy indoor environment that fulfills human needs is a main concern for architects and interior designers. This research explores how occupant expectations and environmental attitudes may influence occupant health and satisfaction within the context of the indoor environment. In doing so, it reviews and contributes to the methods and tools used to evaluate the indoor environment quality (IEQ) components of building performance. Its main aim is to review the literature on indoor human comfort, followed by a review of previously published papers related to human comfort. Finally, this paper provides possible approaches at the design level for healthy buildings.

Keywords: sustainable building, indoor environment quality (IEQ), occupant's health, active system, sick building syndrome (SBS)

Procedia PDF Downloads 347
1918 Dynamic Process Model for Designing Smart Spaces Based on Context-Awareness and Computational Methods Principles

Authors: Heba M. Jahin, Ali F. Bakr, Zeyad T. Elsayad

Abstract:

Smart spaces can be defined as working environments that integrate embedded computers, information appliances and multi-modal sensors to remain focused on the interaction between the users, their activity, and their behavior in the space. A smart space must therefore be aware of its context and automatically adapt to changes in it, by interacting with its physical environment through natural and multimodal interfaces and by serving information proactively. This paper suggests a dynamic framework for the architectural design process of such spaces, based on the principles of computational methods and context-awareness, to help in creating a field of changes and modifications; it generates possibilities and concerns about the physical, structural and user contexts. The framework comprises five main processes: gathering and analyzing data to generate smart design scenarios, parameters, and attributes; transforming these by coding into four types of models; connecting those models together in an interaction model that represents the context-awareness system; transforming that model into a virtual and ambient environment representing the physical and real environments, to act as a linkage phase between the users and the activities taking place in the smart space; and, finally, a feedback phase from the users of that environment to ensure that the design of the smart space fulfills their needs. The generated design process will help in designing smart spaces that can be adapted and controlled to answer the users’ defined goals, needs, and activities.

Keywords: computational methods, context-awareness, design process, smart spaces

Procedia PDF Downloads 316
1917 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, and networks. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey’s measure and the Hurst exponent. After computing these measures, the software plots a graph for each. Beyond computing the three measures, the software can also classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all measures: a sliding window is selected with a length equal to 10% of the total number of data entries, and this window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of them. To test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
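One of the three measures, the Hurst exponent, is commonly estimated by rescaled-range (R/S) analysis: average the rescaled range over chunks of several sizes and fit log(R/S) against log(n). The sketch below is a generic implementation under that assumption, not FRATSAN's actual code; the sliding-window analysis described above would simply call it on successive signal segments:

```python
import math

def _rescaled_range(x):
    """R/S statistic of one chunk: range of the cumulative mean-adjusted
    series divided by its standard deviation."""
    n = len(x)
    mean = sum(x) / n
    devs = [v - mean for v in x]
    z, cum = [], 0.0
    for d in devs:          # cumulative deviation series
        cum += d
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum(d * d for d in devs) / n)
    return r / s if s > 0 else 0.0

def hurst_exponent(series, min_chunk=8):
    """Hurst exponent via R/S analysis: average R/S over chunks of each
    size n (doubling from min_chunk), then fit log(R/S) = H*log(n) + c
    by least squares and return the slope H."""
    n = len(series)
    logs_n, logs_rs = [], []
    size = min_chunk
    while size <= n // 2:
        chunks = [series[i:i + size] for i in range(0, n - size + 1, size)]
        avg = sum(_rescaled_range(c) for c in chunks) / len(chunks)
        logs_n.append(math.log(size))
        logs_rs.append(math.log(avg))
        size *= 2
    k = len(logs_n)
    mx, my = sum(logs_n) / k, sum(logs_rs) / k
    num = sum((a - mx) * (b - my) for a, b in zip(logs_n, logs_rs))
    den = sum((a - mx) ** 2 for a in logs_n)
    return num / den
```

A persistent (trending) signal yields H near 1, an anti-persistent one H near 0, and uncorrelated noise roughly 0.5, which is why sudden shifts in a windowed Hurst plot can flag regime changes such as seizure onset.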

Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 463
1916 Identification of Deep Landslide on Erzurum-Turkey Highway by Geotechnical and Geophysical Methods and its Prevention

Authors: Neşe Işık, Şenol Altıok, Galip Devrim Eryılmaz, Aydın Durukan, Hasan Özgür Daş

Abstract:

In this study, an active landslide zone affecting the road alignment on the Tortum-Uzundere (Erzurum, Turkey) highway was investigated. Due to the landslide movement, problems have occurred in the existing road pavement, causing both safety problems and reduced driving comfort in the operation of the road. In order to model the landslide, drilling, geophysical and inclinometer studies were carried out in the field within the scope of the ground investigation. Laboratory tests were carried out on soil and rock samples obtained from the borings. When the drilling and geophysical studies were evaluated together, it was determined that the study area has a complex geological structure. In addition, the direction and speed of movement of the landslide mass were observed from the inclinometer results. In order to create an idealized geological profile, all field and laboratory studies were evaluated together, and the sliding surface of the landslide was then determined by the back analysis method. According to the findings, the landslide mass is large and the movement occurred along a deep sliding surface. As a result of the numerical analyses, it was concluded that slope angle reduction is the most economical and environmentally friendly method for controlling the landslide mass.

Keywords: landslide, geotechnical methods, geophysics, monitoring, highway

Procedia PDF Downloads 62
1915 Implications of Meteorological Parameters in Decision Making for Public Protective Actions during a Nuclear Emergency

Authors: M. Hussain, K. Mahboob, S. Z. Ilyas, S. Shaheen

Abstract:

Plume dispersion modeling is a computational procedure that establishes a relationship between emissions, meteorology, atmospheric concentrations, deposition and other factors. The emission characteristics (stack height, stack diameter, release velocity, heat content, chemical and physical properties of the gases/particles released, etc.), terrain (surface roughness, local topography, nearby buildings) and meteorology (wind speed, stability, mixing height, etc.) are required for modeling the plume dispersion and estimating ground and air concentrations. During the early phase of the Fukushima accident, plume dispersion modeling was performed and decisions were taken for the implementation of protective measures. Differences in the estimated results and in the decisions made by different countries for taking protective actions created concern in the local and international community regarding the exact identification of the safe zone. The current study focuses on highlighting the importance of accurate and timely weather data availability, a scientific approach to decision making for urgent protective actions, and a compatible and harmonized approach to plume dispersion modeling during a nuclear emergency. As a case study, the influence of meteorological data on plume dispersion modeling and the decision-making process has been examined.
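A standard building block of such dispersion calculations is the Gaussian plume equation with ground reflection, sketched below. The dispersion coefficients sigma_y and sigma_z would in practice come from stability-class correlations (e.g. Pasquill-Gifford curves) at the downwind distance of interest; this sketch takes them as inputs rather than computing them:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Gaussian plume concentration downwind of a continuous point source.

    q: emission rate (g/s), u: mean wind speed (m/s)
    y: crosswind offset (m), z: receptor height (m)
    h: effective release height (m)
    sigma_y, sigma_z: dispersion coefficients (m) at the downwind
    distance of interest, taken from stability-class correlations.
    Returns concentration in g/m^3, including the ground-reflection term.
    """
    coeff = q / (2 * math.pi * u * sigma_y * sigma_z)
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2)) +
                math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))  # reflection
    return coeff * lateral * vertical
```

The sensitivity the abstract highlights enters through u, the stability class behind sigma_y/sigma_z, and the wind direction that orients the coordinate system, which is why small meteorological errors can move the predicted safe-zone boundary substantially.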

Keywords: decision making process, radiation doses, nuclear emergency, meteorological implications

Procedia PDF Downloads 176
1914 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms

Authors: Francisco M. Silva

Abstract:

Technology is evolving, creating an impact on our everyday lives and on the telehealth industry. Telehealth encapsulates the provision of healthcare services and information via a technological approach, and there are several benefits to using web-based methods to provide healthcare help. Nonetheless, few health and psychological help approaches combine this method with wearable sensors. This paper aims to create an online platform through which users receive self-care help and information using wearable sensors; in addition, researchers developing a similar project obtain a solid foundation as a reference. The study provides descriptions and analyses of the software and hardware architecture. It exhibits and explains a dynamic and efficient heart rate algorithm that continuously calculates the desired sensor values, and presents diagrams that illustrate the website deployment process and how the web server handles the sensors’ data. The goal is to create a working project using Arduino-compatible hardware: heart rate sensors send their data values to an online platform, and a microcontroller board uses an algorithm to calculate the heart rate values and outputs them to a web server. The platform visualizes the sensor data, summarizes it in a report, and creates alerts for the user. Results showed a solid project structure and communication between the hardware and software. The web server displays the conveyed heart rate sensor data on the online platform, presenting observations and evaluations.
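The board-to-server flow described above might be sketched as follows; the rolling-average window, alert thresholds and JSON payload fields are illustrative assumptions, not the project's actual protocol:

```python
import json
import statistics
from collections import deque

class HeartRateMonitor:
    """Rolling-average BPM smoothing with simple alert thresholds,
    mirroring the kind of board-side logic the abstract describes.
    Window size and thresholds are illustrative, not from the paper."""

    def __init__(self, window=5, low=50, high=120):
        self.readings = deque(maxlen=window)  # keeps only the last N samples
        self.low, self.high = low, high

    def add_reading(self, bpm):
        """Record one raw BPM sample from the wearable sensor."""
        self.readings.append(bpm)

    def smoothed_bpm(self):
        """Mean of the current window, damping single-sample spikes."""
        return statistics.mean(self.readings)

    def payload(self):
        """JSON body the board would POST to the web server."""
        bpm = self.smoothed_bpm()
        return json.dumps({"bpm": round(bpm, 1),
                           "alert": not (self.low <= bpm <= self.high)})
```

The platform side would parse each payload, store the value for the report view, and surface entries whose alert flag is set.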

Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare

Procedia PDF Downloads 116