Search results for: bivariate statistical techniques
8598 Effects of Group Cognitive Restructuring and Rational Emotive Behavioral Therapy on Psychological Distress of Awaiting-Trial Inmates in Correctional Centers in North-West, Nigeria
Authors: Muhammad Shafi’U Adamu
Abstract:
This study examined the effects of two Cognitive Behavioral Therapy (CBT) techniques, Cognitive Restructuring (CR) and Rational Emotive Behavioral Therapy (REBT), on the psychological distress of awaiting-trial inmates in correctional centers in North-West Nigeria. The study had four specific objectives, four research questions, and four null hypotheses. It used a quasi-experimental design involving a pre-test and post-test. The population comprised all 7,962 awaiting-trial inmates in correctional centers in North-West Nigeria; 131 awaiting-trial inmates from three intact correctional centers were selected using the census technique and randomly assigned to three groups (CR, REBT, and control). The Kessler Psychological Distress Scale (K10) was adapted for data collection. The instrument was validated by experts and subjected to a pilot study, yielding a Cronbach's alpha reliability coefficient of 0.772. Each group received treatment for 8 consecutive weeks (60 minutes/week). Data collected from the field were subjected to descriptive statistics (mean, standard deviation, and mean difference) to answer the research questions. Inferential statistics (ANOVA and the independent-samples t-test) were used to test the null hypotheses at the p ≤ 0.05 level of significance. Results revealed no significant difference among the pre-treatment mean scores of the experimental and control groups. Statistical evidence also showed a significant difference among the post-treatment mean scores of the three groups, and results of the post hoc multiple-comparison test indicated a post-treatment reduction of psychological distress in the awaiting-trial inmates. The results also showed a significant difference between the post-treatment psychological distress mean scores of male and female awaiting-trial inmates, but no such difference in those exposed to REBT.
The research recommends that a standardized, structured CBT counseling treatment be designed for correctional centers across Nigeria, and that CBT counseling techniques be used in the treatment of psychological distress in both correctional and clinical settings.
Keywords: awaiting-trial inmates, cognitive restructuring, correctional centers, rational emotive behavioral therapy
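The analysis pipeline described above, a one-way ANOVA across the CR, REBT, and control groups tested at p ≤ 0.05, can be sketched as follows. The K10 scores below are hypothetical stand-ins, not the study's data:

```python
from scipy import stats

# Hypothetical pre-treatment K10 scores for the three groups
# (CR, REBT, control); illustrative values only.
cr = [32, 35, 31, 34, 33, 36, 30, 34]
rebt = [33, 34, 32, 35, 31, 36, 33, 32]
control = [34, 32, 33, 35, 34, 31, 33, 35]

# One-way ANOVA tests whether the three group means differ
f_stat, p_value = stats.f_oneway(cr, rebt, control)

# At alpha = 0.05, fail to reject H0 (equivalent baselines) when p > 0.05
groups_equivalent_at_baseline = p_value > 0.05
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

A significant post-treatment ANOVA would then be followed by a post hoc multiple-comparison test, as the abstract describes.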
Procedia PDF Downloads 74
8597 Surgical Applied Anatomy: Alive and Kicking
Authors: Jake Hindmarch, Edward Farley, Norman Eizenberg, Mark Midwinter
Abstract:
There is a need to bring the anatomical knowledge of medical students up to the standards required by the surgical specialties. Contention exists among anatomists, clinicians, and surgeons about the standard of anatomical knowledge medical students need. The aim of this study was to explore the standards to which the Royal Australasian College of Surgeons applies knowledge of anatomy, and thereby to align medical school teaching with what the surgical profession requires from graduates. The 2018 volume of the ANZ Journal of Surgery was narrowed down to 254 articles by applying the search term "Anatomy". The main topic was then extracted from each paper, and the content was assessed for 'novel description' or 'application' of anatomical knowledge and classified accordingly. The majority of papers with an anatomical focus were from the general surgery specialty, which focused on surgical techniques, outcomes, and management. Vascular surgery had the highest percentage of papers with a novel description and application of anatomy, while cardiothoracic and paediatric surgery had no papers with a novel description of anatomy. Finally, a novel application of anatomy was the main focus of each specialty. In summary: first, a high proportion of novel applications and descriptions of anatomy appear in general surgery; second, vascular surgery had the largest proportion of novel applications and descriptions of anatomy, largely due to the rise of therapeutic imaging and endovascular techniques; finally, all disciplines showed a trend toward a higher proportion of novel applications of anatomical knowledge.
Keywords: anatomical knowledge, anatomy, surgery, novel anatomy
Procedia PDF Downloads 116
8596 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering, and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote student success. There is, of course, the traditional method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two methods of pedagogy and report the results of a study comparing them. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data; these patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. When we analyze data, we are using math to find the statistical correlation between a cause and its effect. Professionals who use math in this way include data scientists, biologists, and geologists. Without math, most technology would not be possible: math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are used in almost every program written, and mathematical algorithms are inherent in software as well.
Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment; they also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab, where advanced computer software aids their research and production processes by modeling theoretical synthesis techniques and the properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines, and pedagogical research on formative strategies and the necessary topics to be covered is essential.
Keywords: emporium model, mathematics, pedagogy, STEM
Procedia PDF Downloads 75
8595 Investigation of the Effects of Processing Parameters on PLA-Based 3D Printed Tensile Samples
Authors: Saifullah Karimullah
Abstract:
Additive manufacturing techniques are becoming more common with the latest technological advancements. They are poised to bring a revolution in the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM) based 3D printing is one of those promising technologies that have revolutionized prototyping processes. The purpose of this design and study project was to build a customized laboratory-scale FDM-based 3D printer from locally available sources. After fabrication of the printer, a tensile test specimen was designed in SolidWorks or Creo computer-aided design (CAD) software. An .stl file of the tensile test specimen was generated through slicing software, and the G-codes were sent via a computer for the test specimen to be printed. Different parameters were under study, namely printing speed, layer thickness, and infill density of the printed object; other parameters, such as temperature, extrusion rate, and raster orientation, were kept constant. Tensile test specimens were printed for different sets of printer parameters and subjected to tensile tests using a universal testing machine (UTM). Design Expert software was used for the analyses, and different results were obtained from the different specimens. The best, average, and worst specimens were also observed under a compound microscope to investigate the bonding between layers.
Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine
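The basic quantity extracted from each UTM run above is the ultimate tensile strength, the peak load divided by the gauge cross-section. A minimal sketch, with assumed load and specimen dimensions rather than the study's measurements:

```python
# Hypothetical UTM reading for one printed PLA specimen (assumed values):
peak_load_n = 1150.0    # maximum force before fracture, N
width_mm = 10.0         # specimen gauge width, mm
thickness_mm = 4.0      # specimen gauge thickness, mm

# Engineering stress uses the original cross-sectional area
area_mm2 = width_mm * thickness_mm
uts_mpa = peak_load_n / area_mm2   # 1 N/mm^2 equals 1 MPa

print(f"Ultimate tensile strength: {uts_mpa:.2f} MPa")
```

Computing this for each parameter set (speed, layer thickness, infill density) gives the response values that a design-of-experiments tool such as Design Expert then analyzes.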
Procedia PDF Downloads 102
8594 Technique for Online Condition Monitoring of Surge Arresters
Authors: Anil S. Khopkar, Kartik S. Pandya
Abstract:
Overvoltage in power systems is a phenomenon that cannot be avoided; however, it can be controlled to a certain extent. Power system equipment must be protected against overvoltage to avoid system failure. Metal oxide surge arresters (MOSA) are connected to the system to protect it against overvoltages. A MOSA behaves as an insulator under normal working conditions but offers a conductive path under overvoltage conditions. A MOSA consists of zinc oxide elements (ZnO blocks), which have non-linear V-I characteristics. The ZnO blocks are connected in series and fitted in a ceramic or polymer housing. They degrade due to aging under continuous operation, and this degradation increases the leakage current flowing through the surge arrester. The increased leakage current raises the temperature of the surge arrester, which further decreases the resistance of the zinc oxide elements; as a result, the leakage current increases again, further raising the temperature of the MOSA. This creates a thermal runaway condition, from which the MOSA cannot return to normal working conditions. This condition is a primary cause of premature failure of surge arresters. Since the MOSA is a core protective device for electrical power systems against transients and contributes significantly to the reliable operation of the power system network, its condition should be monitored at periodic intervals. Both online and offline condition monitoring techniques are available for surge arresters. Offline techniques are not very popular because they require removing the surge arrester from the system, which requires a system shutdown; hence, online techniques are widely used. This paper presents a technique for evaluating surge arrester condition based on leakage current analysis.
The maximum amplitude of the total leakage current (IT), the maximum amplitude of the fundamental resistive leakage current (IR), and the maximum amplitude of the third harmonic resistive leakage current (I3rd) have been analyzed as indicators for surge arrester condition monitoring.
Keywords: metal oxide surge arrester (MOSA), overvoltage, total leakage current, resistive leakage current
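Extracting the fundamental and third-harmonic amplitudes from a measured leakage-current waveform is typically done with an FFT. A minimal sketch on a synthetic waveform, with all amplitudes assumed rather than taken from field measurements:

```python
import numpy as np

# Synthetic leakage current: a 50 Hz fundamental of 500 uA plus a
# 30 uA third harmonic (assumed values, for illustration only).
fs, f0 = 5000.0, 50.0
N = 100                                  # N / fs = one fundamental period
t = np.arange(N) / fs
i_leak = 500e-6 * np.sin(2 * np.pi * f0 * t) \
       + 30e-6 * np.sin(2 * np.pi * 3 * f0 * t)

# Single-sided FFT amplitude spectrum
amps = 2.0 * np.abs(np.fft.rfft(i_leak)) / N
freqs = np.fft.rfftfreq(N, d=1.0 / fs)

i_fund = amps[np.argmin(np.abs(freqs - f0))]      # fundamental amplitude
i_3rd = amps[np.argmin(np.abs(freqs - 3 * f0))]   # third-harmonic amplitude
print(f"I1 = {i_fund * 1e6:.1f} uA, I3rd = {i_3rd * 1e6:.1f} uA")
```

A rising I3rd relative to the fundamental is the kind of trend the abstract proposes as a degradation indicator.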
Procedia PDF Downloads 64
8593 A Systematic Review: Prevalence and Risk Factors of Low Back Pain among Waste Collection Workers
Authors: Benedicta Asante, Brenna Bath, Olugbenga Adebayo, Catherine Trask
Abstract:
Background: Waste collection workers' (WCWs) activities contribute greatly to the recycling sector and are an important component of the waste management industry. As the recycling sector evolves, reports of injuries and fatal accidents in the industry demand attention, particularly for common and debilitating musculoskeletal disorders such as low back pain (LBP). WCWs are likely exposed to diverse work-related hazards that could contribute to LBP. However, to our knowledge, there has never been a systematic review or other synthesis of LBP findings within this workforce. The aim of this systematic review was to determine the prevalence and risk factors of LBP among WCWs. Method: A comprehensive search was conducted in Ovid Medline, EMBASE, and Global Health e-publications with the search term categories 'low back pain' and 'waste collection workers'. Articles were screened at the title, abstract, and full-text stages by two reviewers. Data were extracted on study design, sampling strategy, socio-demographics, geographical region, exposure definition, definition of LBP, risk factors, response rate, statistical techniques, and LBP prevalence. Risk of bias (ROB) was assessed using the Hoy et al. ROB scale. Results: The search of the three databases yielded 79 studies. Thirty-two studies met the inclusion criteria at the title and abstract stages; thirteen full-text articles met the criteria at the full-text stage. Seven articles (54%) reported a 12-month prevalence of LBP between 42% and 82% among WCWs. The major risk factors for LBP among WCWs included awkward posture, lifting, pulling, pushing, repetitive motions, work duration, and physical loads. Summary data and syntheses of findings were presented in trend lines and tables to establish prevalence periods by age and region. Public health implications: LBP is a major occupational hazard among WCWs.
In light of these risks and future growth in this industry, further research should focus on more detailed ergonomic exposure assessment and LBP prevention efforts.
Keywords: low back pain, scavenger, waste collection workers, waste pickers
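A period prevalence like those extracted above is a simple proportion, usually reported with a confidence interval. A minimal sketch using assumed counts (not figures from any included study) and a 95% Wilson score interval, a common choice for proportions:

```python
import math

# Hypothetical counts: workers reporting LBP in the last 12 months
# out of those surveyed in one study (assumed values).
cases, n = 164, 260
p = cases / n                     # 12-month point prevalence

# 95% Wilson score interval
z = 1.96
denom = 1 + z**2 / n
centre = (p + z**2 / (2 * n)) / denom
half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
lo, hi = centre - half, centre + half
print(f"12-month prevalence: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```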
Procedia PDF Downloads 326
8592 Methods for Restricting Unwanted Access on the Networks Using Firewall
Authors: Bhagwant Singh, Sikander Singh Cheema
Abstract:
This paper examines firewall mechanisms routinely implemented for network security in depth. A firewall cannot protect against all the hazards of unauthorized networks; consequently, many kinds of infrastructure are employed to establish a secure network. Firewall strategies have already been the subject of significant analysis. This study's primary purpose is to prevent unwanted connections by combining the capability of the firewall with additional firewall mechanisms, which include packet filtering, NAT, VPNs, and backdoor solutions. Studies on firewall potential and combined approaches remain insufficient. The research team's goal was to build a safe network by integrating firewall strength and firewall methods, and the study's findings indicate that the recommended concept can form a reliable network. This study examines the characteristics of network security and its primary dangers, synthesizes existing domestic and foreign firewall technologies, and discusses the theories, benefits, and disadvantages of different firewalls. Through synthesis and comparison of various techniques, as well as an in-depth examination of the primary factors that affect firewall effectiveness, this study investigated the current application of firewall technology in computer network security and then introduced a new technique named 'tight coupling firewall'. Finally, the article discusses the current state of firewall technology as well as the direction in which it is developing.
Keywords: firewall strategies, firewall potential, packet filtering, NAT, VPN, proxy services, firewall techniques
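The packet-filtering mechanism mentioned above boils down to matching each packet against an ordered rule list with a default policy. A toy, purely illustrative sketch (real firewalls such as iptables/nftables do this in the kernel; the rules below are invented):

```python
# First-match packet filter with a default-deny policy (illustrative only).
RULES = [
    # (action, protocol, destination port)
    ("allow", "tcp", 443),   # HTTPS
    ("allow", "tcp", 22),    # SSH
    ("deny",  "tcp", 23),    # Telnet explicitly blocked
]
DEFAULT_ACTION = "deny"

def filter_packet(protocol: str, dst_port: int) -> str:
    """Return the action of the first matching rule, else the default."""
    for action, proto, port in RULES:
        if proto == protocol and port == dst_port:
            return action
    return DEFAULT_ACTION

print(filter_packet("tcp", 443))   # allow
print(filter_packet("udp", 53))    # deny (no matching rule, default applies)
```

Stateful filtering, NAT, and VPN termination layer additional logic on top of this basic match-and-act loop.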
Procedia PDF Downloads 99
8591 Validation of Escherichia coli O157:H7 Inactivation on Apple-Carrot Juice Treated with Manothermosonication by Kinetic Models
Authors: Ozan Kahraman, Hao Feng
Abstract:
Several models, such as the Weibull, modified Gompertz, biphasic linear, and log-logistic models, have been proposed to describe non-linear inactivation kinetics and have been used to fit non-linear inactivation data of several microorganisms inactivated by heat, high-pressure processing, or pulsed electric fields. In contrast, most ultrasonic inactivation studies have employed first-order kinetic parameters (D-values and z-values) to describe the reduction in microbial survival counts. This study analyzed E. coli O157:H7 inactivation data using five microbial survival models: first-order, Weibull, modified Gompertz, biphasic linear, and log-logistic. These models were fitted to the inactivation curves of Escherichia coli O157:H7, and the residual sum of squares and the total sum of squares were used as evaluation criteria. The statistical indices of the kinetic models were used to fit inactivation data for E. coli O157:H7 treated by MTS at three temperatures (40, 50, and 60 °C) and three pressures (100, 200, and 300 kPa). Based on the statistical indices and visual observations, the Weibull and biphasic models best fitted the data for MTS treatment, as shown by their high R² values. The remaining models (modified Gompertz, first-order, and log-logistic) did not provide a better fit to the MTS data than the Weibull and biphasic models. The data in this study did not follow first-order kinetics, possibly because cells sensitive to ultrasound treatment were inactivated first, resulting in a fast initial inactivation period, while those resistant to ultrasound were killed more slowly.
The Weibull and biphasic models were thus found to be more flexible for describing the survival curves of E. coli O157:H7 treated by MTS in apple-carrot juice.
Keywords: Weibull, biphasic, MTS, kinetic models, E. coli O157:H7
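Fitting the Weibull model named above is a small non-linear least-squares problem. A sketch using the Mafart parameterization, log10(N/N0) = -(t/δ)^p, on hypothetical survival data (assumed for illustration, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: treatment time (min) vs. log10 reduction.
t = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
log_survival = np.array([0.0, -0.9, -1.5, -2.4, -3.0, -3.5, -3.9])

def weibull(t, delta, p):
    """Mafart form of the Weibull survival model: log10(N/N0) = -(t/delta)^p."""
    return -(t / delta) ** p

# Bounded fit keeps delta and p positive during optimization
(delta, p), _ = curve_fit(weibull, t, log_survival, p0=(1.0, 1.0),
                          bounds=([1e-6, 1e-6], [100.0, 10.0]))
print(f"delta = {delta:.2f} min, shape p = {p:.2f}")
```

A shape parameter p < 1 corresponds to the tailing behavior the abstract describes: a sensitive subpopulation dies quickly and a resistant one lingers.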
Procedia PDF Downloads 361
8590 Evaluating the Implementation of Machine Learning Techniques in the South African Built Environment
Authors: Peter Adekunle, Clinton Aigbavboa, Matthew Ikuabe, Opeoluwa Akinradewo
Abstract:
The future of machine learning (ML) in building may seem like a distant idea that will take decades to materialize, but it is actually far closer than previously believed. In reality, interest in machine learning has been progressively increasing in the built environment. Although it may appear to be a very technical, impersonal approach, it can actually make things more personable: instead of eliminating humans from the equation, machine learning allows people to do their real work more efficiently. It is therefore vital to evaluate the factors influencing the implementation of machine learning techniques in the South African built environment and the challenges involved. The study employed a survey design. In South Africa, construction workers and professionals were given a total of one hundred and fifty (150) questionnaires, of which one hundred and twenty-four (124) were returned and deemed eligible for the study. The collected data were analyzed using percentages, mean item scores, standard deviation, and the Kruskal-Wallis test. The results demonstrate that the top factors influencing the adoption of machine learning are knowledge level and a lack of understanding of its potential benefits, while lack of collaboration among stakeholders and lack of tools and services are the key hurdles to its deployment within the South African built environment. The study concluded that ML adoption should be promoted in order to increase safety, productivity, and service quality within the built environment.
Keywords: machine learning, implementation, built environment, construction stakeholders
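The Kruskal-Wallis test used above compares the rankings of two or more independent groups without assuming normality, which suits Likert-style survey responses. A sketch on hypothetical ratings (the respondent groups and scores below are assumed, not the study's data):

```python
from scipy import stats

# Hypothetical 5-point Likert ratings of one adoption factor by three
# professional groups (assumed data for illustration).
architects = [4, 5, 4, 3, 5, 4, 4]
engineers = [3, 4, 3, 4, 3, 4, 5]
surveyors = [4, 4, 5, 4, 3, 5, 4]

# Kruskal-Wallis H-test: do the groups differ in how they rank the factor?
h_stat, p_value = stats.kruskal(architects, engineers, surveyors)
print(f"H = {h_stat:.3f}, p = {p_value:.3f}")
```

A p-value above 0.05 would suggest the professional groups agree on that factor's importance.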
Procedia PDF Downloads 130
8589 Biosensor Technologies in Neurotransmitters Detection
Authors: Joanna Cabaj, Sylwia Baluta, Karol Malecha
Abstract:
Catecholamines are vital neurotransmitters that mediate a variety of central nervous system functions, such as motor control, cognition, emotion, memory processing, and endocrine modulation. Dysfunctions in catecholamine neurotransmission occur in some neurologic and neuropsychiatric diseases, and changing neurotransmitter levels in biological fluids can be a marker of several neurological disorders. Because of its significance in analytical techniques and diagnostics, sensitive and selective detection of neurotransmitters is attracting increasing attention in different areas of bio-analysis and biomedical research. Recently, optical techniques for the detection of catecholamines have attracted interest due to their reasonable cost, convenient control, and maneuverability in biological environments. Nevertheless, despite the clear need for a sensitive and selective catecholamine sensor, the development of a convenient method for this class of neurotransmitters is still at a basic level. The manipulation of nanostructured materials in conjunction with biological molecules has led to the development of a new class of hybrid-modified enzymatic sensors in which both enhancement of charge transport and preservation of biological activity may be obtained. Immobilization of biomaterials on electrode surfaces is the crucial step in fabricating electrochemical and optical biosensors and bioelectronic devices. Continuing systematic investigation into the manufacture of enzyme-based conducting sensing systems, we present here a convenient fluorescence and electrochemical sensing strategy for catecholamine detection.
Keywords: biosensors, catecholamines, fluorescence, enzymes
Procedia PDF Downloads 109
8588 Statistical Approach to Identify Stress and Biases Impairing Decision-Making in High-Risk Industry
Authors: Ph. Fauquet-Alekhine
Abstract:
Decision-making occurs several times an hour when working in high-risk industry, and an erroneous choice might have undesirable outcomes for people and the environment surrounding the industrial plant. Industrial decisions are very often made in a context of acute stress. Time pressure is a crucial stressor that sometimes leads decision makers to rush the decision-making process or, when that is not possible, to shift to the simplest strategy. We therefore found it worthwhile to update the characterization of the stress factors impairing decision-making at Chinon Nuclear Power Plant (France) in order to optimize decision-making contexts and/or the associated processes. The investigation was based on the analysis of reports addressing safety events over the last three years. Among 93 reports, those explicitly addressing decision-making issues were identified. Each event was characterized in terms of three criteria: stressors, biases impairing decision-making, and weaknesses of the decision-making process. The statistical analysis showed that the biases were distributed over ten categories, among which the hypothesis confirmation bias was clearly salient. No significant correlation was found between the criteria. The analysis indicated that the main stressor was time pressure and highlighted an unexpected form of stressor: the trust asymmetry principle of the expert. The analysis led to the conclusion that this stressor impaired decision-making from a psychological angle rather than a physiological one: it induces a defensive bias of self-esteem and self-protection associated with a confirmation bias. This leads to the hypothesis that this stressor can intervene in some cases without being detected, and that other stressors of the same kind might also occur undetected. Further investigations addressing these hypotheses are being considered.
The analysis also led to the conclusion that dealing with these issues requires (i) decision-making methods that are well known to the workers and automated, and (ii) decision-making tools that are well known and strictly applied. Training was adjusted accordingly.
Keywords: bias, expert, high risk industry, stress
Procedia PDF Downloads 111
8587 The Accuracy of an 8-Minute Running Field Test to Estimate Lactate Threshold
Authors: Timothy Quinn, Ronald Croce, Aliaksandr Leuchanka, Justin Walker
Abstract:
Many endurance athletes train at or just below an intensity associated with their lactate threshold (LT), and often the heart rate (HR) that these athletes use for their LT is above their true LT-HR as measured in a laboratory. Training above the true LT-HR may lead to overtraining and injury. Few athletes have the capability of measuring their LT in a laboratory and instead rely on perception to guide them, as accurate field tests to determine LT are limited. Therefore, the purpose of this study was to determine whether an 8-minute field test could accurately identify the HR associated with LT as measured in the laboratory. On day 1, fifteen male runners (mean ± SD; age, 27.8 ± 4.1 years; height, 177.9 ± 7.1 cm; body mass, 72.3 ± 6.2 kg; body fat, 8.3 ± 3.1%) performed a discontinuous treadmill LT/maximal oxygen consumption (LT/VO2max) test using a portable metabolic gas analyzer (Cosmed K4b2) and a lactate analyzer (Analox GL5). The LT (and associated HR) was determined using the 1/+1 method, where blood lactate increased by 1 mmol·L-1 over baseline followed by an additional 1 mmol·L-1 increase. Days 2 and 3 were randomized, and the athletes performed an 8-minute run either on the treadmill (TM) or on a 160-m indoor track (TR), attempting to cover as much distance as possible while maintaining a high intensity throughout the entire 8 minutes. VO2, HR, ventilation (VE), and respiratory exchange ratio (RER) were measured using the Cosmed system, and rating of perceived exertion (RPE; 6-20 scale) was recorded every minute. All variables were averaged over the 8 minutes, and the total distance covered was measured in both conditions. At the completion of the 8-minute runs, blood lactate was measured. Paired-samples t-tests and pairwise Pearson correlations were computed to determine the relationships between variables measured in the field tests and those obtained in the laboratory at LT. An alpha level of p < 0.05 was required for statistical significance.
The HR (mean ± SD) during the TM (167 ± 9 bpm) and TR (172 ± 9 bpm) tests was strongly correlated with the HR measured during the laboratory LT test (169 ± 11 bpm) (r = 0.68, p < 0.03 and r = 0.88, p < 0.001, respectively). Blood lactate values during the TM and TR tests were not different from each other but were strongly correlated with the laboratory LT values (r = 0.73, p < 0.04 and r = 0.66, p < 0.05, respectively). VE was significantly greater during the TR (134.8 ± 11.4 L·min-1) than during the TM (123.3 ± 16.2 L·min-1), with moderately strong correlations to the laboratory threshold values (r = 0.38, p = 0.27 and r = 0.58, p = 0.06, respectively). VO2 was higher during the TR (51.4 ml·kg-1·min-1) than the TM (47.4 ml·kg-1·min-1), with correlations of 0.33 (p = 0.35) and 0.48 (p = 0.13), respectively, to the threshold values. The total distance run was significantly greater during the TR (2331.6 ± 180.9 m) than the TM (2177.0 ± 232.6 m), but the two were strongly correlated with each other (r = 0.82, p < 0.002). These results suggest that an 8-minute running field test can accurately predict the HR associated with the LT and may be a simple test that athletes and coaches could implement to aid their training.
Keywords: blood lactate, heart rate, running, training
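The two statistics reported above, a pairwise Pearson correlation and a paired-samples t-test, can be sketched as follows. The heart rates are hypothetical paired values for illustration, not the study's measurements:

```python
from scipy import stats

# Hypothetical paired HRs (bpm): laboratory LT-HR vs. the average HR
# from the 8-minute track run for eight runners (assumed values).
lab_lt_hr = [160, 172, 165, 181, 158, 175, 169, 163]
track_hr = [164, 175, 166, 184, 163, 177, 173, 165]

r, p = stats.pearsonr(lab_lt_hr, track_hr)                 # association
t_stat, p_paired = stats.ttest_rel(lab_lt_hr, track_hr)    # mean difference

print(f"r = {r:.2f} (p = {p:.3f}); paired t = {t_stat:.2f} (p = {p_paired:.3f})")
```

Note that a field test can correlate strongly with the laboratory value and still be systematically offset, which is why both statistics are reported.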
Procedia PDF Downloads 251
8586 Ultrasonic Evaluation of Periodic Rough Inaccessible Surfaces from Back Side
Authors: Chanh Nghia Nguyen, Yu Kurokawa, Hirotsugu Inoue
Abstract:
Surface roughness is an important parameter for evaluating the quality of material surfaces, since it affects the function and performance of industrial components. Although stylus and optical techniques are commonly used for measuring surface roughness, they are applicable only to accessible surfaces. In practice, surface roughness measurement from the back side is sometimes demanded, for example, in the inspection of safety-critical parts such as the inner surfaces of pipes. However, little attention has been paid to the measurement of back-surface roughness so far. Since a back surface is usually inaccessible to stylus or optical techniques, the ultrasonic technique is among the most effective alternatives. In this research, an ultrasonic pulse-echo technique is considered for evaluating the pitch and the height of a back surface having a periodic triangular profile, as a very first step. The pitch of the surface profile is measured by applying the diffraction grating theory for oblique incidence; the height is then evaluated by numerical analysis based on the Kirchhoff theory for normal incidence. The validity of the proposed method was verified by both numerical simulation and experiment. It was confirmed that the pitch is accurately measured in most cases. The height was also evaluated with good accuracy when it was smaller than half the pitch, because of the approximation in the Kirchhoff theory.
Keywords: back side, inaccessible surface, periodic roughness, pulse-echo technique, ultrasonic NDE
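The pitch estimation step rests on the grating equation for oblique incidence, d(sin θm − sin θi) = mλ. A minimal sketch with assumed example values (frequency, angles, and medium are illustrative, not the paper's settings):

```python
import math

# Grating equation for oblique incidence: d (sin(theta_m) - sin(theta_i)) = m * lambda
c_water = 1480.0             # speed of sound in water, m/s (nominal)
f = 5.0e6                    # transducer frequency, Hz (assumed)
wavelength = c_water / f     # ~0.296 mm

theta_i = math.radians(20.0)  # incident angle (assumed)
theta_1 = math.radians(45.0)  # measured first-order diffraction angle (assumed)
m = 1                         # diffraction order

# Solve the grating equation for the surface pitch d
pitch = m * wavelength / (math.sin(theta_1) - math.sin(theta_i))
print(f"Estimated pitch: {pitch * 1e3:.3f} mm")
```

Measuring the angle of a known diffraction order thus yields the pitch directly; the height then requires the separate Kirchhoff-theory analysis described above.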
Procedia PDF Downloads 274
8585 Signal Processing Techniques for Adaptive Beamforming with Robustness
Authors: Ju-Hong Lee, Ching-Wei Liao
Abstract:
Adaptive beamforming using an antenna array of sensors is useful for adaptively detecting and preserving the presence of a desired signal while suppressing interference and background noise. Conventional adaptive array beamforming requires prior information about either the impinging direction or the waveform of the desired signal to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to make a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. It is well known in the literature that the performance of an adaptive beamformer is degraded by any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Developing effective signal processing techniques to deal with steering angle error in array beamforming systems has therefore become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vectors of the signal sources, and two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information and the received array data are utilized to iteratively estimate the actual direction vector of the desired signal.
The estimated direction vector of the desired signal is then used to find an appropriate quiescent weight vector. The other projection matrix is set to be the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided for evaluating and comparing the proposed technique with existing robust techniques.
Keywords: adaptive beamforming, robustness, signal blocking, steering angle error
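The steered-beam criterion described above (minimize output power subject to unit response in the look direction) is the classic MVDR formulation. A minimal numpy sketch for a uniform linear array, with noise-only snapshots and diagonal loading; this illustrates the constraint only, not the paper's projection-based direction estimation:

```python
import numpy as np

# 8-element uniform linear array, half-wavelength spacing (assumed setup).
N, d = 8, 0.5
theta_s = np.radians(10.0)                 # presumed steering direction

n = np.arange(N)
a = np.exp(1j * 2 * np.pi * d * n * np.sin(theta_s))   # steering vector

# Sample covariance from white-noise snapshots, with diagonal loading
rng = np.random.default_rng(0)
X = (rng.standard_normal((N, 200)) + 1j * rng.standard_normal((N, 200))) / np.sqrt(2)
R = X @ X.conj().T / 200 + 0.01 * np.eye(N)

# MVDR weights: w = R^-1 a / (a^H R^-1 a), enforcing w^H a = 1
w = np.linalg.solve(R, a)
w /= a.conj() @ w

response = np.abs(w.conj() @ a)            # unit gain in the look direction
print(f"Look-direction response: {response:.3f}")
```

When the presumed direction is wrong, this constraint is enforced at the wrong angle, which is exactly the steering-error sensitivity the abstract sets out to mitigate.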
Procedia PDF Downloads 122
8584 Energy Efficient Lighting in Educational Buildings through the Example of a High School in Istanbul
Authors: Nihan Gurel Ulusan
Abstract:
It is obvious that electrical energy, an inseparable part of modern life and the most important power source of our age, should be generated at a level sufficient for the nation's requirements. The electrical energy used in a sustainable architectural design should nonetheless be reduced as much as possible. Designing buildings as energy-efficient systems that reduce artificial illumination loads has become a current concern as concepts such as conscious consumption of energy sources, environment-friendly design and sustainability gain importance. Reducing the electrical energy consumed by artificial lighting carries great significance, especially in spaces used all day long, such as educational buildings. With this aim, this paper explores educational buildings in terms of energy-efficient lighting. Firstly, illumination techniques, illumination systems, light sources, luminaires, illumination controls and 'efficient energy' usage in lighting are discussed. In addition, the natural and artificial lighting systems used in educational buildings, and the spaces making up such buildings, are examined in terms of energy-efficient lighting. Lastly, the illumination properties of the school chosen for this study, Kağıthane Anadolu Lisesi, a typical high school in Istanbul, are observed. Suggestions are made for improving the system by evaluating the illumination properties of the classes through a survey carried out with the users.
Keywords: educational buildings, energy efficient, illumination techniques, lighting
Procedia PDF Downloads 279
8583 Resilient Machine Learning in the Nuclear Industry: Crack Detection as a Case Study
Authors: Anita Khadka, Gregory Epiphaniou, Carsten Maple
Abstract:
There has been a dramatic surge in the adoption of machine learning (ML) techniques in many areas, including the nuclear industry (fault diagnosis and fuel management in nuclear power plants), autonomous systems (self-driving vehicles), space systems (space debris recovery, for example), medical surgery, network intrusion detection and malware detection, to name a few. With the application of learning methods in such diverse domains, artificial intelligence (AI) has become a part of everyday modern life. To date, the predominant focus has been on developing underpinning ML algorithms that improve accuracy, while factors such as the resiliency and robustness of the algorithms have been largely overlooked. If an adversarial attack is able to compromise the learning method or the data, the consequences can be fatal, especially, but not exclusively, in safety-critical applications. In this paper, we present an in-depth analysis of five adversarial attacks and three defence methods applied to a crack detection ML model. Our analysis shows that it can be dangerous to adopt machine learning techniques in security-critical areas such as the nuclear industry without rigorous testing, since they may be vulnerable to adversarial attacks. While common defence methods can effectively defend against particular attacks, none of the three considered provides protection against all five adversarial attacks analysed.
Keywords: adversarial machine learning, attacks, defences, nuclear industry, crack detection
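The abstract does not name the five attacks or the crack-detection model, but the flavour of a gradient-based evasion attack can be sketched with a fast-gradient-sign-style perturbation against a hypothetical stand-in linear classifier. Everything below (the dimensionality, the classifier, the epsilon) is an illustrative assumption, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(5)
# hypothetical stand-in for a trained linear crack detector: predict "crack" when w.x + b > 0
d = 64                              # e.g. a flattened 8x8 image patch
w = rng.standard_normal(d)
b = 0.0
x = rng.standard_normal(d)
x *= 1.0 / (w @ x)                  # rescale so the clean example sits at margin +1 ("crack")

def fgsm(x, eps):
    # FGSM-style step: perturb every input dimension by eps against the class score;
    # for a linear score w.x + b, the gradient with respect to the input is simply w
    return x - eps * np.sign(w)

x_adv = fgsm(x, eps=0.1)
print((w @ x + b) > 0, (w @ x_adv + b) > 0)   # clean prediction kept, adversarial flipped
```

A tiny, bounded perturbation (here at most 0.1 per dimension) flips the prediction, which is the core risk the paper evaluates for safety-critical deployments.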
Procedia PDF Downloads 157
8582 Teaching and Learning Jazz Improvisation Using Bloom's Taxonomy of Learning Domains
Authors: Graham Wood
Abstract:
The 20th century saw the introduction of many new approaches to music making, including the structured, academic study of jazz improvisation. The rise of school and tertiary jazz programs was rapid, spreading around the globe in a matter of decades. The curriculum taught in these new programs was often developed in an ad hoc manner, owing to the lack of written literature in this new and rapidly expanding area and to pedagogical principles vastly different from those of the classical music education then prevalent in school and tertiary programs. Information about the theory and techniques used by jazz improvisers is widespread, but methods for practicing these concepts so as to achieve the best outcomes for students and teachers are much harder to find. This research project draws on the author's fifteen years of experience as a studio jazz piano teacher, ensemble teacher and classroom improvisation lecturer, and suggests an alignment with Bloom's taxonomy of learning domains. This alignment categorizes the different tasks that need to be taught and practiced so that the teacher and the student can devise a well-balanced, effective practice routine and the teacher can develop an effective teaching program. These techniques have proved very useful in ensuring that a good balance of cognitive, psychomotor and affective skills is taught to students in a range of learning contexts.
Keywords: bloom, education, jazz, learning, music, teaching
Procedia PDF Downloads 255
8581 Earthquake Identification to Predict Tsunami in Andalas Island, Indonesia Using Back Propagation Method and Fuzzy TOPSIS Decision Seconder
Authors: Muhamad Aris Burhanudin, Angga Firmansyas, Bagus Jaya Santosa
Abstract:
Earthquakes are a natural hazard that can trigger the most dangerous hazard of all: the tsunami. On 26 December 2004, a giant earthquake occurred north-west of Andalas Island. It generated a giant tsunami that struck Sumatra, Bangladesh, India, Sri Lanka, Malaysia and Singapore; more than twenty thousand people died. The occurrence of earthquakes and tsunamis cannot be avoided, but the hazard can be mitigated by earthquake forecasting, and early preparation is the key to reducing damage and consequences. We aim to investigate the pattern of earthquakes quantitatively so that the trend can be identified, studying earthquakes that have occurred around Andalas Island, Indonesia, over the last decade. Andalas is an island of high seismicity, with more than a thousand events occurring each year, because it lies in the tectonic subduction zone between the Indian Ocean plate and the Eurasian plate. Tsunami forecasting is therefore needed for mitigation action, and a tsunami forecasting method is presented in this work. Neural networks have been used widely in research to estimate earthquakes, and it is well established that earthquakes can be predicted using the backpropagation method. First, an ANN is trained to predict the tsunami of 26 December 2004 using earthquake data recorded before it; the trained ANN is then applied to predict subsequent earthquakes. Not every earthquake triggers a tsunami; only earthquakes with certain characteristics do, and a wrong decision can cause further problems in society, so a method is needed to reduce the possibility of wrong decisions. Fuzzy TOPSIS is a statistical method widely used as a decision seconder with respect to given parameters, and it can make the best decision as to whether an earthquake will cause a tsunami or not. This work combines earthquake prediction using the neural network method with Fuzzy TOPSIS to decide whether a predicted earthquake will trigger a tsunami wave.
The neural network model is capable of capturing non-linear relationships, and Fuzzy TOPSIS can determine the best decision better than other statistical methods in tsunami prediction.
Keywords: earthquake, fuzzy TOPSIS, neural network, tsunami
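The crisp core of the TOPSIS ranking step can be sketched as follows. The abstract does not specify the criteria, weights or fuzzy membership functions used, so the three hypothetical earthquake events, the criteria (magnitude, focal depth, distance) and the weights below are illustrative assumptions only:

```python
import numpy as np

# rows = candidate earthquake events, columns = criteria (all values hypothetical)
#            magnitude  depth_km  distance_km
X = np.array([[9.1,      30.0,     50.0],     # great, shallow, nearby
              [6.2,      80.0,    400.0],     # moderate, deep, distant
              [7.5,      15.0,    120.0]])
w = np.array([0.5, 0.3, 0.2])                 # assumed criteria weights
benefit = np.array([True, False, False])      # higher magnitude = more tsunami-prone;
                                              # shallower / closer = more tsunami-prone

V = w * X / np.linalg.norm(X, axis=0)         # weighted, vector-normalized matrix
ideal = np.where(benefit, V.max(0), V.min(0)) # positive ideal solution
anti  = np.where(benefit, V.min(0), V.max(0)) # negative ideal solution
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
cc = d_neg / (d_pos + d_neg)                  # closeness coefficient in [0, 1]

print(cc.argmax())  # -> 0: the great, shallow, nearby event ranks most tsunami-prone
```

In the fuzzy variant described in the paper, the crisp entries of `X` and `w` would be replaced by fuzzy numbers before the distance computations.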
Procedia PDF Downloads 489
8580 Histochemical Localization of Hepatitis B Surface Antigen in Hepatocellular Carcinoma: An Evaluation of Two Staining Techniques in a Tertiary Hospital in Calabar, Nigeria
Authors: Imeobong Joseph Inyang, Aniekan-Augusta Okon Eyo, Abel William Essien
Abstract:
Hepatitis B virus (HBV) is one of the known human carcinogens, and the presence of HBsAg in liver tissue indicates active viral replication. More than 85% of hepatocellular carcinoma (HCC) cases occur in countries with high rates of chronic HBV infection. An evaluation study was carried out to determine the relationship between HBsAg positivity and the development of HCC, and its distribution by age and gender of the subjects. The Shikata orcein and haematoxylin and eosin (H&E) staining techniques were performed on liver sections. A total of 50 liver tissue specimens, comprising 38 biopsy and 12 post-mortem specimens, were processed. Thirty-five of the 50 specimens were positive for HBsAg with the orcein stain, whereas only 16 were positive with the H&E stain (all of which were also positive with orcein), giving an HBsAg prevalence of 70.0% (35/50). The prevalence of HCC in the study was 56.0% (28/50), of which 21 (75.0%) cases were positive for HBsAg; 18 (64.3%) were males and 10 (35.7%) were females, distributed within the age range 20-70 years. The highest number of HBsAg-positive HCC cases, 7/21 (33.3%), occurred in the age group 40-49 years. There was no relationship in the pattern of distribution of HCC between age and gender using the Pearson correlation coefficient (r = 0.0474; P < 0.05). HBV infection predisposed to HCC. The orcein technique was more specific and is therefore recommended for screening liver tissues where facilities for immunohistochemistry are inaccessible.
Keywords: hepatitis B surface antigen, hepatocellular carcinoma, orcein, pathology
Procedia PDF Downloads 312
8579 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region
Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R.M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari
Abstract:
Geohazard-prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR-based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, which hampers their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use and interpretation of InSAR-based results. It provides a semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will provide consistent, regular and reliable information regarding natural and anthropogenic ground motion phenomena all over Europe.
Keywords: ground displacements, InSAR, natural hazards, satellite imagery
Procedia PDF Downloads 217
8578 The Effect of Core Training on Physical Fitness Characteristics in Male Volleyball Players
Authors: Sibel Karacaoglu, Fatma Ç. Kayapinar
Abstract:
The aim of the study is to investigate the effect of a core training program on physical fitness characteristics and body composition in male volleyball players. Twenty-six male university volleyball team players aged 19 to 24 years, with no health problems or injuries, participated in the study. Subjects were randomly divided into a training group (TG) and a control group (CG). Data from the twenty-one players who completed all training sessions were used for statistical analysis (TG, n=11; CG, n=10). The core training program was applied to the training group three days a week for 10 weeks, while the control group did not receive any training. Before and after the 10-week program, pre- and post-testing comprised body composition measurements (weight, BMI, bioelectrical impedance analysis) and physical fitness measurements, including flexibility (sit and reach test), muscle strength (back, leg and grip strength by dynamometer), muscle endurance (sit-up and push-up tests), power (one-legged jump and vertical jump tests), speed (20 m and 30 m sprints) and balance (one-legged standing test). Changes between the pre- and post-test values of the groups were assessed using the dependent t test. According to the statistical analysis, no significant difference was found in body composition in either group between the pre- and post-test values. In the training group, all physical fitness measurements improved significantly after the core training program (p<0.05) except the 30 m sprint and handgrip strength (p>0.05). In the control group, on the other hand, only the 20 m sprint test values differed between the pre- and post-test measurements (p<0.05), while the other physical fitness test values did not (p>0.05). The results suggest that the core training program has a positive effect on physical fitness characteristics in male volleyball players.
Keywords: body composition, core training, physical fitness, volleyball
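The dependent t test used to compare pre- and post-test values within each group amounts to a one-sample t test on the paired differences. A minimal stdlib sketch with hypothetical vertical-jump scores for the 11 training-group players (the study's actual data are not given in the abstract):

```python
import math
import statistics

# hypothetical pre/post vertical jump heights (cm) for the 11 training-group players
pre  = [42.1, 38.5, 45.0, 40.2, 37.8, 41.3, 48.0, 39.9, 43.4, 40.7, 46.1]
post = [46.4, 43.0, 48.1, 45.0, 41.2, 45.9, 52.5, 43.1, 47.8, 44.6, 49.0]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)        # sample SD of the paired differences
t = mean_d / (sd_d / math.sqrt(n))    # dependent t statistic, df = n - 1

print(f"t({n - 1}) = {t:.2f}")        # compare against the two-tailed critical value at p = 0.05
```

With df = 10 the two-tailed critical value at p = 0.05 is about 2.228, so a |t| above that threshold corresponds to the significant pre/post improvements reported.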
Procedia PDF Downloads 345
8577 Cognitive Approach at the Epicenter of Creative Accounting in Cameroonian Companies: The Relevance of the Psycho-Sociological Approach and the Theory of Cognitive Dissonance
Authors: Romuald Temomo Wamba, Robert Wanda
Abstract:
The issue of creative accounting in its psychological and sociological framework has been a debated subject for over 60 years. The objective of this article is, on the one hand, to establish the existence of creative accounting in Cameroonian entities and, on the other, to understand the strategies used by audit agents to detect errors, omissions, irregularities or inadequacies in the financial statements, as well as the optimization techniques used by account preparers to strategically bypass the rules. To achieve this, we conducted an exploratory study using a cognitive approach, with data analysis performed by the software 'Decision Explorer'. The results obtained challenge the authors' cognition (manifest, latent and deceptive behavior). The tax inspectors stress that entities in Cameroon do not derogate from the rules governing the financial statements. Likewise, they report manipulation of current income and net income through depreciation, provisions, inventories and the spreading of charges over long periods, which suggests the suspicion or intention of manipulating the financial statements. As for techniques, account preparers manage accruals at the end of the year as the basis of the practice of creative accounting. Likewise, management accounts are more favorable to earnings management.
Keywords: creative accounting, sociocognitive approach, psychological and sociological approach, cognitive dissonance theory, cognitive mapping
Procedia PDF Downloads 191
8576 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish
Abstract:
Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official programming language documentation. While tools have aimed to aid developers by presenting interfaces for exploring Stack Overflow, developers often face the challenge of searching through many possible answers to their questions, which extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of the typically used modeling methods, especially in relation to the complexity of the models and features. Such insights could be of practical significance to the many practitioners who use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawing on data from 2014, 2015 and 2016. Our findings reveal significant differences in the models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifier performance and quality and feature complexity may leverage these findings when selecting suitable techniques for developing prediction models.
Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
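As one concrete instance of a low-complexity prediction model of the kind such studies compare against neural networks and random forests, a logistic-regression baseline can be sketched on invented answer features. The feature names and the synthetic labelling rule below are assumptions for illustration, not the paper's actual feature set:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# hypothetical per-answer features: [answer_length_norm, code_snippet_count_norm, answerer_rep_norm]
X = rng.random((n, 3))
# synthetic ground truth: longer answers with code from reputable users tend to be accepted
logits_true = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 2.5 * X[:, 2] - 3.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits_true))).astype(float)

# logistic regression trained by batch gradient descent (a deliberately simple model)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y) / n)    # gradient of the mean log-loss w.r.t. w
    b -= 0.5 * (p - y).mean()         # gradient w.r.t. the intercept

acc = ((1 / (1 + np.exp(-(X @ w + b))) > 0.5) == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

Comparing such a baseline against heavier models on the same features is the kind of complexity-versus-performance trade-off the study quantifies.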
Procedia PDF Downloads 131
8575 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms
Authors: Hai L. Tran
Abstract:
When covering events and issues, the news media often employ personal accounts as well as facts and figures. However, the use of numbers and narratives in the newsroom is mostly a matter of trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner; existing research tends to treat the pertinent effects as though the use of one form precludes the other and as if a tradeoff is required. Meanwhile, narratives and statistical facts are combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and the collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system processes logical evidence through effortful, analytical, affect-free cognition. The experiential system, by contrast, is intuitive, rapid, automatic and holistic, thereby demanding minimal cognitive resources and relating to the experience of affect. In certain situations one system may dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts affect audience response differently, and a combination of data and narratives is more effective than either form of evidence alone.
In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) impact selective exposure, which in turn activates the pertinent modes of processing and thereby induces the corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand the specific effects, and the conditions under which the use of anecdotal and/or statistical evidence enhances or undermines information processing. In addition to its theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers.
Keywords: data, narrative, number, anecdote, storytelling, news
Procedia PDF Downloads 78
8574 Comparative Assessment on the Impact of Sedatives on the Stress and Anxiety of Patients with a Heart Disease before and during Surgery in Iran
Authors: Farhad Fakoursevom
Abstract:
Heart disease is one of the most prevalent diseases today. Various types of surgery, such as bypass, angiography and angioplasty, some invasive and some non-invasive, are used to treat patients, who may undergo them at any point in their lives. Patients may experience pre-surgery anxiety and stress, which can disrupt their normal life and even reduce the effects of the surgery, so that the desired surgical result is not achieved. Considering this issue, the present study aimed to make a comparative assessment of people who received sedatives before surgery and people who did not. This is applied research in terms of purpose and a descriptive survey in terms of method. The statistical population included patients who underwent surgery in the specialist heart hospitals of Mashhad, Iran; 60 people were considered as the statistical population, 30 of whom received sedatives before surgery and 30 of whom did not. Valid, up-to-date articles were systematically used to assemble the theoretical basis, and a researcher-made questionnaire was used to examine the participants' levels of stress and anxiety. The questionnaire's content validity was assessed by a panel of experts in psychology and medicine, and its construct validity was tested using software. Cronbach's alpha and composite reliability were used for reliability, showing that the questionnaire's reliability was adequate. SPSS software was used to compare the results between the two groups, and the findings showed no significant association between those who received sedatives and those who did not in terms of the amount of stress and anxiety; however, the longer the drugs were taken before surgery, the greater the patients' mental calm.
According to the results, it can be said that when an emergency operation is not needed and more time is available, sedative drugs should be used at doses matched to the severity of the surgery; in a medical emergency, such as heart surgery after a stroke, psychological services before and during the operation should be combined with sedative drugs so that patients can control their stress and anxiety and achieve better outcomes.
Keywords: sedative drugs, stress, anxiety, surgery
Procedia PDF Downloads 97
8573 Sustainability and Clustering: A Bibliometric Assessment
Authors: Fernanda M. Assef, Maria Teresinha A. Steiner, David Gabriel F. Barros
Abstract:
Review studies are useful for analyzing research problems, and among the types of review documents we commonly find bibliometric studies. This type of study often helps give a global view of a research problem and helps academics worldwide better understand the context of a research area. This document presents a bibliometric view of clustering techniques applied to sustainability problems. The authors examined which sustainability issues most often use clustering techniques, and which sustainability issue is currently the most impactful in research. During the bibliometric analysis, we found ten different groups of research in clustering applications for sustainability issues: energy; environmental; non-urban planning; sustainable development; sustainable supply chain; transport; urban planning; water; waste disposal; and others. By analyzing the citations of each group, we discovered that the environmental group can be classified as the most impactful research cluster in the area. Content analysis of each paper classified in the environmental group then showed that the k-means technique is preferred for solving sustainability problems with clustering methods, since it appeared most often among the documents. The authors conclude that a bibliometric assessment can help indicate a research gap on waste disposal, the group with the fewest publications, and the most impactful research on environmental problems.
Keywords: bibliometric assessment, clustering, sustainability, territorial partitioning
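Since k-means emerged as the preferred technique in the papers reviewed, a minimal Lloyd's-algorithm sketch may be useful. The two-dimensional "sustainability indicators" below (e.g. per-capita emissions vs. waste generation for two groups of regions) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# two synthetic groups of regions with low / high indicator values
data = np.vstack([rng.normal([1.0, 1.0], 0.1, (20, 2)),
                  rng.normal([5.0, 5.0], 0.1, (20, 2))])

def kmeans(X, k, iters=50):
    # Lloyd's algorithm; deterministic, well-spread initial centers for k = 2
    centers = X[[0, len(X) - 1]] if k == 2 else X[:k].copy()
    for _ in range(iters):
        # assign each point to its nearest center, then recompute the centers
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(data, k=2)
print(labels[:5], labels[-5:])   # the two groups fall into different clusters
```

In practice, k-means++ initialization and a choice of k via the elbow or silhouette criterion would replace the deterministic setup used here for reproducibility.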
Procedia PDF Downloads 107
8572 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry
Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood
Abstract:
The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment for a simple geometry. This type of fluid behaviour occurs in many practical engineering applications, hence the reason for investigating it. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques, such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry and hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to such flows is investigated at various Reynolds numbers corresponding to different flow regimes. The use of this measuring technique in separated flows is hard to find in the literature; moreover, most evaluations of the Reynolds number effect in separated flows come from numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important for resolving turbulence. An ADV Nortek Vectrino+ was used to characterize the flow in a recirculating laboratory flume at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared to those obtained using other measuring techniques. To allow comparison with other researchers, the step height, expansion ratio and the measurement positions upstream and downstream of the step were reproduced. Post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density of the stream-wise horizontal velocity component.
The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness, and the errors obtained in the uncertainty analysis were, in general, relatively low. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and good agreement was found. The ADV technique proved able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements gave reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flow. Despite being relatively inexpensive, acoustic Doppler velocimetry can therefore be used with confidence in separated flows and is very useful for numerical model validation. It is very important, however, to perform adequate post-processing of the acquired data in order to obtain low noise levels and thus decrease the uncertainty.
Keywords: ADV, experimental data, multiple Reynolds number, post-processing
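The moving block bootstrap used for the uncertainty analysis resamples fixed-length blocks of consecutive samples so that the serial correlation within each block is preserved. A minimal sketch on a synthetic velocity record (the record, block length and replicate count are assumptions, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic stream-wise velocity record (m/s); the real ADV series is not reproduced here
u = 0.30 + 0.05 * rng.standard_normal(5000)

def mbb_ci_mean(x, block=50, reps=500, alpha=0.05):
    # moving block bootstrap: draw overlapping blocks of consecutive samples,
    # concatenate to the original length, and collect the resampled means
    n = len(x)
    n_blocks = -(-n // block)                      # ceiling division
    means = np.empty(reps)
    for r in range(reps):
        starts = rng.integers(0, n - block + 1, n_blocks)
        resampled = np.concatenate([x[s:s + block] for s in starts])[:n]
        means[r] = resampled.mean()
    return np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)])

lo, hi = mbb_ci_mean(u)
print(f"95% CI for the mean velocity: [{lo:.4f}, {hi:.4f}] m/s")
```

For a turbulent record, the block length would be chosen longer than the integral time scale of the velocity fluctuations so that the dependence structure is retained within blocks.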
Procedia PDF Downloads 146
8571 Design of Cylindrical Crawler Robot Inspired by Amoeba Locomotion
Authors: Jun-ya Nagase
Abstract:
Recently, the need for colonoscopy has been increasing because of the rise of colonic disorders, including colon cancer. However, current colonoscopy depends strongly on the doctor's skill. Therefore, a large-intestine endoscope that offers high safety without depending on the doctor's technique is required, and in this research we aim to develop a novel large-intestine endoscope that can be inserted safely without specialized technique. Wheeled robots, snake-like robots and earthworm-like robots are all described in the relevant literature as endoscope robots currently under study. Among them, the tracked crawler robot can traverse uneven ground flexibly, with a crawler belt attached firmly to the ground surface. Although conventional crawler robots have high efficiency and/or high ground-covering ability, they require a comparatively large space to move. In this study, a small cylindrical crawler robot inspired by amoeba locomotion is proposed, which does not need a large space to move and has high ground-covering ability. In addition, we developed a prototype of the large-intestine endoscope using the proposed crawler mechanism. Experiments have demonstrated smooth operation and forward movement of the robot when voltage is applied to the motor. This paper reports the structure, drive mechanism, prototype and experimental evaluation.
Keywords: tracked crawler, endoscopic robot, narrow path, amoeba locomotion
Procedia PDF Downloads 382
8570 Incorporation of Copper for Performance Enhancement in Metal-Oxides Resistive Switching Device and Its Potential Electronic Application
Authors: B. Pavan Kumar Reddy, P. Michael Preetam Raj, Souri Banerjee, Souvik Kundu
Abstract:
In this work, the fabrication and characterization of copper-doped zinc oxide (Cu:ZnO) based memristor devices with aluminum (Al) and indium tin oxide (ITO) electrodes are reported. Thin films of Cu:ZnO were synthesized using a low-cost, low-temperature chemical process. The Cu:ZnO was deposited onto the ITO bottom electrodes by spin coating, whereas the Al top electrode was deposited using physical vapor evaporation. An ellipsometer was employed to measure the Cu:ZnO thickness, which was found to be 50 nm. Several surface and materials characterization techniques were used to study the thin-film properties of Cu:ZnO. To ascertain the efficacy of Cu:ZnO for memristor applications, electrical characterizations such as current-voltage (I-V), data retention and endurance were obtained, all critical parameters for next-generation memory. The I-V characteristic exhibits switching behavior with asymmetrical hysteresis loops. This work attributes the resistive switching to the positional drift of oxygen vacancies with respect to the Al/Cu:ZnO junction. Further, non-linear curve-fitting regression techniques were utilized to determine the equivalent circuit for the fabricated Cu:ZnO memristors. Efforts were also devoted to establishing its potential for different electronic applications.
Keywords: copper doped, metal-oxides, oxygen vacancies, resistive switching
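For the non-linear curve-fitting step, one conduction model often used for memristive devices is I = a·sinh(b·V). Since the paper's fitted model and coefficients are not given in the abstract, the sketch below fits that assumed form to synthetic I-V data by a least-squares grid search:

```python
import numpy as np

rng = np.random.default_rng(4)
# synthetic I-V sweep following I = a*sinh(b*V), a form often assumed for memristive conduction
a_true, b_true = 1e-4, 2.0
V = np.linspace(-1.0, 1.0, 41)
I = a_true * np.sinh(b_true * V) + 1e-6 * rng.standard_normal(V.size)

# least-squares fit by coarse grid search (a stand-in for a full non-linear regression routine)
a_grid = np.linspace(5e-5, 2e-4, 61)
b_grid = np.linspace(1.0, 3.0, 81)
a_fit, b_fit = min(((a, b) for a in a_grid for b in b_grid),
                   key=lambda p: ((p[0] * np.sinh(p[1] * V) - I) ** 2).sum())

print(f"fitted a = {a_fit:.2e}, b = {b_fit:.3f}")
```

In practice a Levenberg-Marquardt solver would replace the grid search, and the fitted parameters would map onto the equivalent-circuit elements the authors extract.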
Procedia PDF Downloads 161
8569 Evaluation and Assessment of Bioinformatics Methods and Their Applications
Authors: Fatemeh Nokhodchi Bonab
Abstract:
Bioinformatics, in its broad sense, involves the application of computational methods to solve biological problems. A wide range of computational tools is needed to process effectively and efficiently the large amounts of data being generated by recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with the experimental riches of complex, multivariate data and the transition from data collection to information and knowledge. These bioinformatics tools are being evaluated and applied in various medical areas, including early detection, risk assessment, classification and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity and predictive capability. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the sense of physical chemistry) and then applies 'informatics' techniques (derived from disciplines such as applied mathematics, computer science and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this new field and review some of the research being pursued, particularly in relation to transcriptional regulatory systems.
Keywords: methods, applications, transcriptional regulatory systems, techniques
Procedia PDF Downloads 125