Search results for: dispensing errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 981

321 Comparative Evaluation of Pharmacologically Guided Approaches (PGA) to Determine Maximum Recommended Starting Dose (MRSD) of Monoclonal Antibodies for First Clinical Trial

Authors: Ibraheem Husain, Abul Kalam Najmi, Karishma Chester

Abstract:

First-in-human (FIH) studies are a critical step in the clinical development of any molecule that has shown therapeutic promise in preclinical evaluations; the translation from preclinical research and safety studies into clinical development is crucial for the successful development of monoclonal antibodies for the treatment of human diseases. Therefore, the USFDA approach was compared with nine pharmacologically guided approaches (PGA) (simple allometry, maximum life span potential, brain weight, rule of exponent (ROE), two-species methods and one-species methods) to determine the maximum recommended starting dose (MRSD) for first-in-human clinical trials, using four drugs, namely denosumab, bevacizumab, anakinra and omalizumab. In our study, the predicted pharmacokinetic (PK) parameters and the estimated first-in-human doses of the antibodies were compared with the observed human values. The study indicated that the clearance and volume of distribution of antibodies can be predicted with reasonable accuracy in humans, and a good estimate of the first human dose can be obtained from the predicted human clearance and volume of distribution. A pictorial method-evaluation chart, based on fold errors, was also developed for simultaneous evaluation of the various methods.
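As an illustration of the simple allometry approach named above, human clearance can be scaled from a single animal species using body weight and a fixed exponent, and a fold error can then quantify prediction accuracy. This is a hedged, minimal sketch: the 0.75 exponent is the conventional allometric assumption, and the clearance and weight values are invented for illustration, not data from the study.

```python
def allometric_clearance(cl_animal_ml_min, weight_animal_kg,
                         weight_human_kg=70.0, exponent=0.75):
    """Scale an animal clearance to a predicted human clearance (simple allometry)."""
    return cl_animal_ml_min * (weight_human_kg / weight_animal_kg) ** exponent

def fold_error(predicted, observed):
    """Fold error used to compare prediction methods (always >= 1)."""
    ratio = predicted / observed
    return ratio if ratio >= 1 else 1 / ratio

# Illustrative numbers only: monkey clearance of 2.0 mL/min at 5 kg body weight.
cl_human = allometric_clearance(2.0, 5.0)
print(round(cl_human, 2))                         # predicted human clearance, mL/min
print(round(fold_error(cl_human, 10.0), 2))       # vs. a hypothetical observed value
```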

Keywords: clinical pharmacology (CPH), clinical research (CRE), clinical trials (CTR), maximum recommended starting dose (MRSD), clearance and volume of distribution

Procedia PDF Downloads 374
320 A Gradient Orientation Based Efficient Linear Interpolation Method

Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar

Abstract:

This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-dimension video/image to a high-dimension video/image. The objective of a good interpolation method is to upscale an image in such a way that it provides better edge preservation at the cost of very low complexity, so that real-time processing of video frames can be made possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence they are far from achieving real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions and edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculation of slopes using the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges.
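A rough sketch of the classification step described above, assuming a toy grayscale image and an arbitrary gradient-magnitude threshold (the paper's actual parameters are not reproduced): Prewitt gradients separate uniform regions, which receive line averaging, from edge pixels, whose interpolation direction is taken perpendicular to the gradient.

```python
import math

PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
PREWITT_Y = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]

def prewitt(img, r, c):
    """Prewitt gradient (gx, gy) at an interior pixel (r, c)."""
    gx = sum(PREWITT_X[i][j] * img[r - 1 + i][c - 1 + j]
             for i in range(3) for j in range(3))
    gy = sum(PREWITT_Y[i][j] * img[r - 1 + i][c - 1 + j]
             for i in range(3) for j in range(3))
    return gx, gy

def edge_orientation(img, r, c, threshold=50):
    """None for uniform regions (simple line averaging applies there);
    otherwise the edge direction in radians, perpendicular to the gradient."""
    gx, gy = prewitt(img, r, c)
    if math.hypot(gx, gy) < threshold:
        return None
    return math.atan2(gy, gx) + math.pi / 2

# Toy image: flat region on the left, a vertical edge between columns 2 and 3.
img = [[10, 10, 10, 200, 200] for _ in range(4)]
print(edge_orientation(img, 1, 1))   # uniform region
print(edge_orientation(img, 1, 3))   # pixel on the vertical edge
```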

Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing

Procedia PDF Downloads 260
319 Automated User Story Driven Approach for Web-Based Functional Testing

Authors: Mahawish Masud, Muhammad Iqbal, M. U. Khan, Farooque Azam

Abstract:

Manual writing of test cases from functional requirements is a time-consuming task. Such test cases are not only difficult to write but are also challenging to maintain. Test cases can be drawn from functional requirements that are expressed in natural language. However, manual test case generation is inefficient and subject to errors. In this paper, we present a systematic procedure that can automatically derive test cases from user stories. The user stories are specified in a restricted natural language using a well-defined template. We also present a detailed methodology for writing our test-ready user stories. Our tool “Test-o-Matic” automatically generates the test cases by processing the restricted user stories. The generated test cases are executed using the open-source Selenium IDE. We evaluate our approach on a case study, which is an open-source web-based application. The effectiveness of our approach is evaluated by seeding faults in the open-source case study using known mutation operators. Results show that test case generation from restricted user stories is a viable approach for automated testing of web applications.
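A minimal sketch of deriving a test case from a restricted user story template. The template, field names, and generated steps below are assumptions for illustration; the paper's actual grammar and the Test-o-Matic transformations are not reproduced here.

```python
import re

# Hypothetical restricted template: "As a <role>, I want to <action> so that <benefit>"
TEMPLATE = re.compile(
    r"As a (?P<role>.+?), I want to (?P<action>.+?) so that (?P<benefit>.+)", re.I
)

def story_to_test_case(story):
    """Parse a restricted user story and emit a simple test-case dictionary."""
    m = TEMPLATE.match(story)
    if not m:
        raise ValueError("story does not follow the restricted template")
    action = m.group("action")
    return {
        "name": "test_" + action.replace(" ", "_"),
        "precondition": f"login as {m.group('role')}",
        "steps": [f"perform: {action}", f"verify: {m.group('benefit')}"],
    }

tc = story_to_test_case(
    "As a registered user, I want to reset my password so that I can regain access"
)
print(tc["name"])
for step in tc["steps"]:
    print(step)
```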

Keywords: automated testing, natural language, restricted user story modeling, software engineering, software testing, test case specification, transformation and automation, user story, web application testing

Procedia PDF Downloads 387
318 Automatic Reporting System for Transcriptome Indel Identification and Annotation Based on Snapshot of Next-Generation Sequencing Reads Alignment

Authors: Shuo Mu, Guangzhi Jiang, Jinsa Chen

Abstract:

The analysis of indels in RNA sequencing of clinical samples is easily affected by sequencing errors and software selection. In order to improve the efficiency and accuracy of the analysis, we developed an automatic reporting system for indel recognition and annotation based on image snapshots of transcriptome read alignments. This system includes sequence local-assembly and realignment, target-point snapshot, and image-based recognition processes. We integrated high-confidence indel datasets from several known databases as a training set to improve the accuracy of the image processing and added a bioinformatics processing module to annotate and filter indel artifacts. Subsequently, the system automatically generates a report, including data quality levels and image results. Sanger sequencing verification of the reference indel mutations of cell line NA12878 showed that the process can achieve 83% sensitivity and 96% specificity. Analysis of the collected clinical samples showed that the interpretation accuracy of the process was equivalent to that of manual inspection, while the processing efficiency showed a significant improvement. This work shows the feasibility of accurate indel analysis of clinical next-generation sequencing (NGS) transcriptomes. This result may be useful for RNA studies of clinical samples with microsatellite instability in immunotherapy in the future.
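The validation arithmetic reported above can be sketched as follows; the confusion-matrix counts are invented to illustrate how figures like 83% sensitivity and 96% specificity are computed, and are not the study's data.

```python
def sensitivity(tp, fn):
    """True-positive rate: verified indels the pipeline recovered."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: non-indel sites the pipeline correctly rejected."""
    return tn / (tn + fp)

# Illustrative confusion-matrix counts only (not the study's data).
tp, fn, tn, fp = 83, 17, 96, 4
print(f"sensitivity = {sensitivity(tp, fn):.0%}")
print(f"specificity = {specificity(tn, fp):.0%}")
```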

Keywords: automatic reporting, indel, next-generation sequencing, NGS, transcriptome

Procedia PDF Downloads 191
317 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller

Authors: Alireza Dantism

Abstract:

Presently, describing the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be facilitated by studying the three-dimensional structure of proteins. In the absence of a protein structure, comparative modeling often provides a useful three-dimensional model of the protein that is dependent on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (target) mainly based on its alignment with one or more proteins of known structure (templates). Comparative modeling consists of five main steps: 1. identification of similarity between the target sequence and at least one known template structure; 2. alignment of the target sequence and the template(s); 3. building a model based on the alignment with the selected template(s); 4. prediction of model errors; 5. optimization of the built model. There are many computer programs and web servers that automate the comparative modeling process. One of the most important advantages of these servers is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without the need for programming knowledge; some experts, however, prefer to do their modeling manually through code, because in this way they can maximize the accuracy of the modeling. In this study, a web-based tool, called EasyModel, has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. According to the user's inputs, EasyModel receives the unknown sequence of interest (the target) and a protein sequence file (the template) that shares a percentage of similarity with the target sequence, predicts the tertiary structure of the unknown sequence, and presents the results in the form of graphs and constructed protein files.

Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller

Procedia PDF Downloads 97
316 Machine Learning Models for the Prediction of Heating and Cooling Loads of a Residential Building

Authors: Aaditya U. Jhamb

Abstract:

Due to the current energy crisis that many countries are battling, energy-efficient buildings are the subject of extensive research in the modern technological era because of growing worries about energy consumption and its effects on the environment. The paper explores 8 factors that help determine the energy efficiency of a building: relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area, and glazing area distribution, using a dataset provided by Tsanas and Xifara. The dataset employed 768 different residential building models to anticipate heating and cooling loads with a low mean squared error. By optimizing these characteristics, machine learning algorithms may assess and properly forecast a building's heating and cooling loads, lowering energy usage while increasing the quality of people's lives. The paper studied the magnitude of the correlation between these input factors and the two output variables using various statistical methods of analysis, after determining which input variable was most closely associated with the output loads. The most conclusive model was the Decision Tree Regressor, which had a mean squared error of 0.258, whilst the least definitive model was the Isotonic Regressor, which had a mean squared error of 21.68. This paper also investigated the KNN Regressor and Linear Regression, which had mean squared errors of 3.349 and 18.141, respectively. In conclusion, the model, given the 8 input variables, was able to predict the heating and cooling loads of a residential building accurately and precisely.
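A small sketch of the metric used to rank the models: the mean squared error values are those reported in the abstract, while the ranking code and the toy check are illustrative.

```python
def mean_squared_error(y_true, y_pred):
    """Average of squared differences between targets and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# MSE values as reported in the abstract.
reported_mse = {
    "Decision Tree Regressor": 0.258,
    "KNN Regressor": 3.349,
    "Linear Regression": 18.141,
    "Isotonic Regressor": 21.68,
}
best = min(reported_mse, key=reported_mse.get)
print(best, reported_mse[best])

# Toy check that the metric behaves as expected: a perfect fit gives 0.
print(mean_squared_error([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
```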

Keywords: energy efficient buildings, heating load, cooling load, machine learning models

Procedia PDF Downloads 95
315 Challenges in Translating Malay Idiomatic Expressions: A Study

Authors: Nor Ruba’Yah Binti Abd Rahim, Norsyahidah Binti Jaafar

Abstract:

Translating Malay idiomatic expressions into other languages presents unique challenges due to the deep cultural nuances and linguistic intricacies embedded within these expressions. This study examined these challenges through a two-pronged methodology: a comparative analysis using survey questionnaires and a quiz administered to 50 semester-6 students taking the Translation 1 course, and in-depth interviews with their lecturers. The survey aimed to capture students’ experiences and difficulties in translating selected Malay idioms into English, highlighting common errors and misunderstandings. Complementing this, interviews with lecturers provided expert insights into the nuances of these expressions and effective translation strategies. The findings revealed that literal translations often fail to convey the intended meanings, underscoring the importance of cultural competence and contextual awareness. The study also identified key factors that contribute to successful translations, such as the translator’s familiarity with both source and target cultures and their ability to adapt expressions creatively. This research contributes to the field of translation studies by offering practical recommendations for improving the translation of idiomatic expressions, thereby enhancing cross-cultural communication. The insights gained from this study are valuable for translators, educators, and students, emphasizing the need for a nuanced approach that respects the cultural richness of the source language while ensuring clarity in the target language.

Keywords: idiomatic expressions, cultural competence, translation strategies, cross-cultural communication, students’ difficulties

Procedia PDF Downloads 12
314 Artificial Intelligence in the Design of a Retaining Structure

Authors: Kelvin Lo

Abstract:

Nowadays, numerical modelling in geotechnical engineering is very common but sophisticated. Many advanced input settings and considerable computational effort are required to optimize a design to reduce the construction cost, and optimization usually requires huge numerical models. If the optimization is conducted manually, there is a potentially dangerous consequence from human errors, and the time spent on input and on data extraction from output is significant. This paper presents an automation process introduced to the numerical modelling (Plaxis 2D) of a trench excavation supported by a secant-pile retaining structure for a top-down tunnel project. Python code is adopted to control the process, and numerical modelling is conducted automatically at every 20 m chainage along the 200 m tunnel, with the maximum retained height occurring at the middle chainage. The Python code continuously changes the geological stratum and excavation depth under groundwater flow conditions in each 20 m section. It automatically conducts trial and error to determine the required pile length and the use of props to achieve the required factor of safety and target displacement. Once the bending moment of the pile exceeds its capacity, the pile size is increased. When the pile embedment reaches the default maximum length, the prop system is turned on. Results showed that the approach saves time, increases efficiency, lowers design costs, and reduces manual labor, minimizing errors.
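The trial-and-error loop described above can be sketched as follows. The Plaxis 2D scripting API is replaced by a stub run_model function so the logic is runnable; the factor-of-safety target, pile-length increments, maximum embedment, and the stub's numbers are all illustrative assumptions, not the project's values.

```python
TARGET_FOS = 1.4          # illustrative required factor of safety
MAX_PILE_LENGTH = 30.0    # illustrative default maximum embedment before props

def run_model(pile_length, props_on):
    """Stub standing in for a Plaxis 2D run: FoS grows with pile length,
    and switching on the prop system adds a fixed boost (made-up numbers)."""
    return 0.9 + 0.02 * pile_length + (0.4 if props_on else 0.0)

def design_section(start_length=10.0, step=2.0):
    """Lengthen the pile until the FoS target is met; at maximum embedment,
    switch on the prop system instead."""
    length, props = start_length, False
    while run_model(length, props) < TARGET_FOS:
        if length + step <= MAX_PILE_LENGTH:
            length += step      # lengthen the pile first
        elif not props:
            props = True        # at max embedment, turn on the prop system
        else:
            raise RuntimeError("no admissible design found")
    return length, props

length, props = design_section()
print(length, props)
```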

Keywords: automation, numerical modelling, Python, retaining structures

Procedia PDF Downloads 51
313 Analytical Development of a Failure Limit and Iso-Uplift Curves for Eccentrically Loaded Shallow Foundations

Authors: N. Abbas, S. Lagomarsino, S. Cattari

Abstract:

Examining existing experimental results for shallow rigid foundations subjected to a vertical centric load (N), accompanied or not by a bending moment (M), two main non-linear mechanisms governing the cyclic response of the soil-foundation system can be distinguished: foundation uplift and soil yielding. A soil-foundation failure limit is defined as a domain of resistance in the two-dimensional (2D) load space (N, M) inside of which lie all the admissible combinations of loads; the latter correspond to a purely elastic, non-linear elastic or plastic behavior of the soil-foundation system, while the points lying on the failure limit correspond to combinations of loads leading to a failure of the soil-foundation system. In this study, the proposed resistance domain is constructed analytically based on mechanics. Original elastic limit, uplift initiation limit and iso-uplift limits are constructed inside this domain. These limits give a prediction of the mechanisms activated for each combination of loads applied to the foundation. A comparison of the proposed failure limit with experimental tests from the literature shows interesting results. Also, the developed uplift initiation limit and iso-uplift curves are compared with others already proposed in the literature and widely used due to the absence of other alternatives; remarkable differences are noted, showing evident errors in the past proposals and relevant accuracy for those given in the present work.

Keywords: foundation uplift, iso-uplift curves, resistance domain, soil yield

Procedia PDF Downloads 383
312 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using SARA Algorithm

Authors: Muhammad Bilal, Zhongfeng Qiu

Abstract:

Aerosols, suspended particles in the atmosphere, play an important role in the Earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties such as aerosol optical depth (AOD) at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the geostationary ocean color imager (GOCI) using the simplified aerosol retrieval algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-the-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the aerosol robotic network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15AOD)). Results showed that the high spatiotemporal GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, RMSE of 0.07, and RPME of 5%, and 90% of the observations were within the EE. The results suggest that the SARA is robust and has the ability to retrieve high-resolution spatiotemporal AOD observations over urban areas using geostationary satellites.
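The expected-error check used in the validation can be sketched as below; the envelope is the EE = ±(0.05 + 0.15·AOD) formula from the abstract, evaluated against AERONET as ground truth, while the satellite/AERONET AOD pairs are invented for illustration.

```python
def within_expected_error(aod_satellite, aod_aeronet):
    """True if the satellite retrieval falls inside EE = ±(0.05 + 0.15·AOD)."""
    envelope = 0.05 + 0.15 * aod_aeronet
    return abs(aod_satellite - aod_aeronet) <= envelope

# Invented (satellite, AERONET) AOD pairs for illustration only.
pairs = [(0.30, 0.28), (0.55, 0.50), (0.90, 0.60), (0.12, 0.10)]
inside = sum(within_expected_error(s, a) for s, a in pairs)
print(f"{inside}/{len(pairs)} within EE")
```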

Keywords: AERONET, AOD, SARA, GOCI, Beijing

Procedia PDF Downloads 171
311 Ambivalence as Ethical Practice: Methodologies to Address Noise, Bias in Care, and Contact Evaluations

Authors: Anthony Townsend, Robyn Fasser

Abstract:

While complete objectivity is a desirable scientific position from which to conduct a care and contact evaluation (CCE), it is precisely the recognition that we are inherently incapable of operating objectively that is the foundation of ethical practice and skilled assessment. Drawing upon recent research from Daniel Kahneman (2021) on the differences between noise and bias, as well as the inherent biases collectively termed “The Elephant in the Brain” by Kevin Simler and Robin Hanson (2019) from Oxford University, this presentation addresses the various ways in which our judgments, perceptions and even procedures can be distorted and contaminated while conducting a CCE, and also considers the value of second-order cybernetics and the psychodynamic concept of ‘ambivalence’ as a conceptual basis for assessment methodologies that limit such errors, or at least better identify them. Both a conceptual framework for ambivalence, our higher-order capacity to allow for the convergence and consideration of multiple emotional experiences and cognitive perceptions to inform our reasoning, and a practical methodology for assessment relying on data triangulation, Bayesian inference and hypothesis testing are presented as means of promoting ethical practice for health care professionals conducting CCEs. An emphasis on widening awareness and perspective, limiting ‘splitting’, is demonstrated both in how this form of emotional processing plays out in alienating dynamics in families and in the assessment thereof. In addressing this concept, this presentation aims to illuminate the value of ambivalence as foundational to ethical practice for assessors.

Keywords: ambivalence, forensic, psychology, noise, bias, ethics

Procedia PDF Downloads 86
310 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer is currently a pressing issue in medical science, and its rising incidence is drastically affecting health and well-being worldwide. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, as the stored image contains disturbances around the lesion center. This approach locates the foreground of the extracted appearance of the skin, and image partitioning (segmentation) models are applied to sort out the disturbance in the picture. Results: After partitioning, feature extraction is performed using a genetic algorithm (GA), and finally, classification is performed between the training and test data to evaluate a large set of images, which helps doctors make the right prediction. To improve on the existing system, we set our objectives accordingly; the efficiency of the natural selection process and histogram enrichment are essential in that respect, and GA is applied to reduce the false-positive rate while maintaining accuracy. Conclusions: The objective of this work is to improve effectiveness, and GA accomplishes this task, bringing down the false-positive rate. The approach combines deep learning and medical image processing, which provides superior accuracy, and this form of processing supports reusability without errors.

Keywords: computer-aided system, detection, image segmentation, morphology

Procedia PDF Downloads 150
309 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter

Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri

Abstract:

Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in the estimation of heart rate (HR) and to false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation possible. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on detection of the R-peaks of ECG artifacts in EEG, EMG, and EOG signals, using an energy-based function and a novel signal quality index (SQI) assessment technique. SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or the ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals by separate Kalman filters based upon the individual SQIs. Data fusion of the HR estimates was then performed by weighing each estimate by the corresponding Kalman filter's SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual estimates. The method was evaluated on the MIMIC II database of PhysioNet, from bedside monitors of ICU patients, and provides an accurate HR estimate even in the presence of noise and artifacts.
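The fusion step can be illustrated with a simplified sketch in which the Kalman filters' SQI-modified innovations are replaced by a plain SQI-weighted average; the HR estimates and SQI values below are invented for illustration.

```python
def fuse_heart_rate(estimates):
    """Fuse per-signal HR estimates; estimates is a list of (hr_bpm, sqi),
    with sqi in [0, 1]. A corrupted channel gets a low SQI and little weight."""
    total = sum(sqi for _, sqi in estimates)
    if total == 0:
        raise ValueError("no usable signal")
    return sum(hr * sqi for hr, sqi in estimates) / total

# Invented example: the ECG channel is corrupted (low SQI), so the ABP- and
# EEG-derived estimates dominate the fused value.
estimates = [(140.0, 0.1), (72.0, 0.9), (74.0, 0.8)]
print(round(fuse_heart_rate(estimates), 1))
```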

Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion

Procedia PDF Downloads 696
308 Internal Audit and the Effectiveness and Efficiency of Operations in Hospitals

Authors: Naziru Suleiman

Abstract:

The ever-increasing cases of financial fraud and corporate accounting scandals in recent years have raised concern about the operation of internal control mechanisms and the performance of internal audit departments in organizations. In most cases, the seeming presence of both an internal control system and internal audit in organizations does not prove useful, as frauds, errors, and irregularities are still perpetrated. The aim of this study, therefore, is to assess the role of internal audit in achieving the objectives of the internal control system of federal hospitals in Kano State from the perception of the respondents. The study used a survey research design and generated data from a primary source by means of a questionnaire. A total of 100 copies of the questionnaire were administered, out of which 68 were duly completed and returned. Cronbach's alpha was used to test the internal validity of the various items in the constructs. Descriptive statistics, the chi-square test, the Mann-Whitney U test, and the Kruskal-Wallis ANOVA were employed for the analysis of the data. The study finds that, from the perception of the respondents, internal audit departments in federal hospitals in Kano State are effective and contribute positively to the overall attainment of the objectives of the internal control system of these hospitals. No significant difference is found in the views of the respondents from the three hospitals. Hence, the study concludes that a strong and functional internal audit department is a basic requirement for the effectiveness of the internal control system. In light of the findings, it is recommended that internal audit should continue to ensure that the objectives of the internal control system of these hospitals are achieved through proper and adequate evaluation and review of the system.

Keywords: internal audit, internal control, federal hospitals, financial frauds

Procedia PDF Downloads 353
307 Design of an Improved Distributed Framework for Intrusion Detection System Based on Artificial Immune System and Neural Network

Authors: Yulin Rao, Zhixuan Li, Burra Venkata Durga Kumar

Abstract:

Intrusion detection refers to monitoring the actions of internal and external intruders on the system and detecting the behaviours that violate security policies in real-time. In intrusion detection, there has been much discussion about the application of neural network technology and artificial immune system (AIS). However, many solutions use static methods (signature-based and stateful protocol analysis) or centralized intrusion detection systems (CIDS), which are unsuitable for real-time intrusion detection systems that need to process large amounts of data and detect unknown intrusions. This article proposes a framework for a distributed intrusion detection system (DIDS) with multi-agents based on the concept of AIS and neural network technology to detect anomalies and intrusions. In this framework, multiple agents are assigned to each host and work together, improving the system's detection efficiency and robustness. The trainer agent in the central server of the framework uses the artificial neural network (ANN) rather than the negative selection algorithm of AIS to generate mature detectors. Mature detectors can distinguish between self-files and non-self-files after learning. Our analyzer agents use genetic algorithms to generate memory cell detectors. This kind of detector will effectively reduce false positive and false negative errors and act quickly on known intrusions.
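The detector idea can be sketched in a simplified form: a mature detector flags activity as non-self when a monitored feature string falls within a small distance of a learned pattern. Hamming distance over fixed-length bit strings stands in here for the ANN-trained detectors of the framework; the patterns and matching radius are illustrative assumptions.

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

class Detector:
    """A mature detector: matches samples within `radius` of its pattern."""
    def __init__(self, pattern, radius=1):
        self.pattern, self.radius = pattern, radius

    def matches(self, sample):
        return hamming(self.pattern, sample) <= self.radius

# Illustrative learned patterns over toy 6-bit feature strings.
detectors = [Detector("110011"), Detector("000111")]

def is_intrusion(sample):
    """Flag non-self activity if any detector matches the sample."""
    return any(d.matches(sample) for d in detectors)

print(is_intrusion("110001"))   # one bit away from a detector pattern
print(is_intrusion("101010"))   # far from all detector patterns
```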

Keywords: artificial immune system, distributed artificial intelligence, multi-agent, intrusion detection system, neural network

Procedia PDF Downloads 109
306 Corrective Feedback and Uptake Patterns in English Speaking Lessons at Hanoi Law University

Authors: Nhac Thanh Huong

Abstract:

New teaching methods have led to changes in teachers' roles in the English class, in which teachers' error correction is an integral part. Language errors and corrective feedback have been of interest to many researchers in foreign language teaching. However, the techniques and effectiveness of teachers' feedback have been a matter of much controversy. This case study has been carried out with a view to finding out the patterns of teachers' corrective feedback and their impact on students' uptake in English speaking lessons of legal English major students at Hanoi Law University. In order to achieve those aims, the study makes use of classroom observations as the main method of data collection to seek answers to the two following questions: 1. What patterns of corrective feedback occur in English speaking lessons for second-year legal English major students at Hanoi Law University? 2. To what extent does that corrective feedback lead to students' uptake? The study provided some important findings, among which was a close relationship between corrective feedback and uptake. In particular, recast was the most commonly used feedback type, yet it was the least effective in terms of students' uptake and repair, while the most successful feedback types, namely meta-linguistic feedback, clarification requests and elicitation, which led to student-generated repair, were used at a much lower rate by teachers. Furthermore, it was revealed that different types of errors needed different types of feedback. Also, the use of feedback depended on the students' English proficiency level. In light of the findings, a number of pedagogical implications have been drawn in the hope of enhancing the effectiveness of teachers' corrective feedback on students' uptake in the foreign language acquisition process.

Keywords: corrective feedback, error, uptake, speaking English lesson

Procedia PDF Downloads 262
305 Approaching In vivo Dosimetry for Kilovoltage X-Ray Radiotherapy

Authors: Rodolfo Alfonso, David Alonso, Albin Garcia, Jose Luis Alonso

Abstract:

Recently, a new kilovoltage radiotherapy unit, model Xstrahl 200, donated to INOR's Department of Radiotherapy (DR-INOR) in the framework of an IAEA technical cooperation project, has been commissioned. This unit is able to treat shallow and low-deep-lying lesions, as it provides 8 discrete beam qualities, from 40 to 200 kV. As part of the patient-specific quality assurance program established at DR-INOR for external beam radiotherapy, it has been recommended to implement in vivo dose measurements (IVD), as they allow effectively discovering eventual errors or failures in the radiotherapy process. For that purpose, a radio-photoluminescence (RPL) dosimetry system, model XXX, also donated to DR-INOR by the same IAEA project, has been studied and commissioned. The main dosimetric parameters of the RPL system, such as reproducibility, linearity, and field size influence, were assessed. In a similar way, the response of EBT3-type radiochromic film was investigated for purposes of IVD. Both systems were calibrated in terms of entrance surface dose. Results of the dosimetric commissioning of RPL and EBT3 for IVD, and their pre-clinical implementation through end-to-end test cases, are presented. RPL dosimetry seems more advisable for hyper-fractionated schemes with larger fields and curved patient contours, such as those in chest wall irradiations, where the use of more than one dosimeter could be required. The radiochromic system involves smaller corrections with field size, but its sensitivity is lower; hence it is more adequate for hypo-fractionated treatments with smaller fields.

Keywords: glass dosimetry, in vivo dosimetry, kilovotage radiotherapy, radiochromic dosimetry

Procedia PDF Downloads 398
304 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment. Therefore, digital forensics, dealing with various crimes committed in the digital environment, has become an important research topic. It is in the research scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. There are many software and hardware tools developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data in the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Then, digital forensics experts carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. Moreover, because the result depends on the examiner's experience, the overall outcome may vary between cases, and evidence may be overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed, with the aim of automatically classifying evidence containing criminal elements, thereby shortening the time of the digital evidence examination process and preventing human errors.
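The proposed hash-based block matching can be sketched as follows: the evidence is split into fixed-size blocks, each block is hashed, and the hashes are looked up in a list built from known crime-related material. The block size, the sample bytes, and the assumption that the bad content is block-aligned are all illustrative simplifications.

```python
import hashlib

BLOCK_SIZE = 16  # real tools typically use larger blocks, e.g. 4096 bytes

def block_hashes(data, block_size=BLOCK_SIZE):
    """SHA-256 hash of each fixed-size block of the data."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

# Hash list built from known crime-related material (illustrative bytes).
known_bad = set(block_hashes(b"incriminating content goes here!"))

# Evidence image; the bad content happens to be block-aligned here. Real
# tools also handle misalignment, e.g. with rolling or piecewise hashes.
evidence = b"harmless-padding" + b"incriminating content goes here!" + b"...more data"
matches = [i for i, h in enumerate(block_hashes(evidence)) if h in known_bad]
print(matches)  # indices of evidence blocks found in the hash list
```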

Keywords: block matching, digital evidence, hash list, evaluation of digital evidence

Procedia PDF Downloads 255
303 Mobile Augmented Reality for Collaboration in Operation

Authors: Chong-Yang Qiao

Abstract:

Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators by visualizing data and procedures interactively, making equipment and systems easier to understand. Operators remotely communicate and coordinate with each other during continuous tasks, exchanging information and data between the control room and the work site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience in cooperative work by applying augmented reality in the traditional industrial field. The purpose of this exploratory study is to find a cognitive model for multiple-task performance with MAR. In particular, the focus is on the comparison between different tasks and environmental factors that influence information processing. Three experiments used interface and interaction design, with start-up, maintenance, and shutdown content embedded in the mobile application. With time demands and human errors as evaluation criteria, and through analysis of the mental processes and behavioral actions during the multiple tasks, heuristic evaluation was used to assess operator performance under different situational factors and to record information processing in recognition, interpretation, judgment, and reasoning. The research will identify the functional properties of MAR and constrain the development of the cognitive model. Conclusions suggest that MAR is easy to use and useful for operators in remote collaborative work.

Keywords: mobile augmented reality, remote collaboration, user experience, cognition model

Procedia PDF Downloads 197
302 The Mediating Role of Masculine Gender Role Stress on the Relationship between EFL Learners' Self-Disclosure and English Class Anxiety

Authors: Muhammed Kök, Adem Kantar

Abstract:

Learning a foreign language can be affected by various factors such as age, aptitude, motivation, L2 disposition, etc. Among these factors, the masculine gender role stress (MGRS) that male learners experience is the least explored area so far. MGRS can be defined as the traditional male role stress that male learners feel when they perceive a threat to their traditionally adopted masculinity norms. Traditional masculine norms include toughness, accuracy, completeness, and faultlessness. From this perspective, these norms are diametrically opposed to the language learning process, since learning a language, by its nature, involves making mistakes and errors, not recalling words, pronouncing sounds incorrectly, forming incorrect sentences, etc. Considering the potential impact of MGRS on the language learning process, the main purpose of this study is to investigate the mediating role of MGRS in the relationship between EFL learners' self-disclosure and English class anxiety. Data were collected from Turkish EFL learners (N=282) studying different majors at various state universities across Turkey. Data were analyzed by means of the bootstrapping method using the SPSS PROCESS macro. The findings show that the indirect effect of self-disclosure on English class anxiety via MGRS was significant. We conclude that one of the reasons why Turkish EFL learners experience English class anxiety might be the pressure they feel because of their traditional gender role stress.
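The bootstrapped indirect-effect test reported above can be illustrated with a minimal sketch on simulated data. The effect sizes, variable construction, and resample count below are hypothetical; the study itself used the SPSS PROCESS macro.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 282                                      # matches the study's sample size; data are simulated
x = rng.normal(size=n)                       # self-disclosure (hypothetical scores)
m = 0.5 * x + rng.normal(size=n)             # MGRS, the mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # English class anxiety


def indirect_effect(x, m, y):
    """a*b mediation estimate: a from X -> M, b from M -> Y controlling for X."""
    a = np.polyfit(x, m, 1)[0]
    coefs, *_ = np.linalg.lstsq(np.column_stack([m, x, np.ones(len(x))]), y, rcond=None)
    return a * coefs[0]


# Percentile bootstrap of the indirect effect; PROCESS-style significance test:
# the effect is deemed significant when the confidence interval excludes zero.
boot = [indirect_effect(x[s], m[s], y[s])
        for s in (rng.choice(n, size=n) for _ in range(2000))]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
significant = ci_low > 0 or ci_high < 0
```

The percentile CI is read off the resampled a*b estimates directly, which is why no normality assumption on the indirect effect is needed.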

Keywords: masculine gender role stress, English class anxiety, self-disclosure, masculinity norms

Procedia PDF Downloads 98
301 Development and Optimization of German Diagnostic Tests in Mathematics for Vocational Training

Authors: J. Thiele

Abstract:

Teachers working at vocational colleges are often confronted with the problem that many students graduated from different schools and therefore each had a different education. Especially in mathematics, many students lack fundamentals or had different priorities at their previous schools. Furthermore, these vocational colleges have to provide qualifications for many different fields of work, each with different core themes. The colleges are interested in measuring the different education levels of their students and providing assistance for those who need to catch up. The project mathe-meistern was initiated to remedy this problem at vocational colleges. For this purpose, online tests were developed. The aim of these tests is to evaluate the basic mathematical abilities of the students. The tests are online multiple-choice tests with a total of 65 items. They are accessed online with a unique transaction number (TAN) for each participant. The content is divided into several categories (arithmetic, algebra, fractions, geometry, etc.). After each test, the student receives a personalized summary depicting their strengths and weaknesses in mathematical basics. Teachers can visit a special website to examine the results of their classes or of single students. In total, 5830 students have participated so far. For standardization and optimization purposes, the tests have been evaluated annually since 2015, using classical and probabilistic test theory with regard to objectivity, reliability, and validity. This paper is about the optimization process, considering the Rasch scaling and standardization of the tests. Additionally, current results using the standardized tests will be discussed. To achieve this, competence levels and types of errors of students attending vocational colleges in North Rhine-Westphalia, Germany, were determined using descriptive data and distractor evaluations.
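The Rasch scaling mentioned above models the probability of a correct item response from a single ability parameter per student and a single difficulty parameter per item. A minimal illustration of that response function (the parameter values are arbitrary, chosen only for the example):

```python
import math


def rasch_probability(theta: float, b: float) -> float:
    """Rasch model: probability that a student with ability theta answers
    an item of difficulty b correctly, P = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))


# A student whose ability equals the item difficulty has a 50% success chance
p_equal = rasch_probability(0.0, 0.0)
# Ability well above difficulty: success becomes very likely
p_easy = rasch_probability(1.0, -1.0)
```

Fitting theta and b to the observed 65-item response matrices is what places all students and items on one common scale, which is the basis for the standardization described in the abstract.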

Keywords: diagnostic tests in mathematics, distractor evaluation, test optimization, test theory

Procedia PDF Downloads 125
300 Oneness of Scriptures and Oneness of God

Authors: Shyam Sunder Gupta

Abstract:

GOD is an infinite source of knowledge. From time to time, as per the need of mankind, GOD reveals some small, selected part of HIS knowledge as WORDS to a chosen entity, whose responsibility is to function as Messenger and share the WORDS, in the form of verses, with the common masses. GOD has confirmed that the Messenger may not understand every WORD revealed to him, and HE directs the Messenger to learn from persons who have knowledge of WORDS revealed in earlier times, as some revealed content is identical and some different by design. In due course of time, the verses communicated orally are collected and edited, either in a planned manner by an individual or unintentionally by a group of individuals, and converted into the form of Scripture. Depending on the knowledge of the editor(s), some errors, scientific and of other forms, get into the Scripture. In the present world, there are three major religions, Christianity, Islam and Hinduism, accounting for more than two-thirds of the world's population. Each of these religions has its own Scripture, namely the Bible, the Quran, and the Veda. Since the source of WORDS for each of these Scriptures is the same, there is ONENESS of all Scriptures. There are amazing similarities between the events described, like the flood during the time of Noah and King Satyavara. The description of the creation of man and woman is identical. Descriptions of the Last Day, the categorization of human beings, identical names, etc., show remarkable similarities. Ram, the hero of the Ramayana, is a common name in Hinduism; two of Jesus' ancestors were named Ram, and many names in the Bible are derived from Ram. Attributes of GOD are common in all Scriptures, namely: GOD is Eternal, Unborn, Immortal, Creator of the Universe(s) and everything that exists within the Universe, Omnipotent, Omnipresent, Omniscient, Subtlest of all, Unchangeable, Unique, Always at Work, Source of Eternal Bliss, etc. There is Oneness of GOD.

Keywords: GOD, scriptures, oneness, WORDS, Jesus, Ram

Procedia PDF Downloads 62
299 Effect of Infill Density and Pattern on the Compressive Strength of Parts Produced by Polylactic Acid Filament Using Fused Deposition Modelling

Authors: G. K. Awari, Vishwajeet V. Ambade, S. W. Rajurkar

Abstract:

The field of additive manufacturing is growing, and new discoveries are being made. 3D printing machines are being developed to accommodate a wider range of 3D printing materials, including plastics, metals (metal AM powders), composites, filaments, and other materials. There are numerous printing materials available for industrial additive manufacturing, each with its own unique characteristics, advantages, and disadvantages. In order to avoid errors in additive manufacturing, key elements such as the 3D printing material type, texture, cost, and printing technique and procedure must be examined. It can be complex to select the best material for a particular job. Polylactic acid (PLA) is made from sugar cane or cornstarch, both of which are renewable resources. "Black plastic" is another name for it. Because it is safe to use and print, it is frequently used in primary and secondary schools, where it is typically printed on FDM machines. PLA is simple to print because of its low warping effect, and it can even be printed on a cold surface. Compared to ABS, it allows sharper edges and finer features to be printed, and it comes in a wide range of colours. PLA is the most common material used in fused deposition modelling (FDM) and can be used to print a wide range of components, including medical implants, household items, and mechanical parts. The mechanical behaviour of a printed item is affected by variations in infill density and pattern; in the current investigation, printed specimens are subjected to compressive tests to examine their behaviour under compressive stresses.

Keywords: fused deposition modelling, polylactic acid, infill density, infill pattern, compressive strength

Procedia PDF Downloads 73
298 Management of Fitness-For-Duty for Human Error Prevention in Nuclear Power Plants

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For the past several decades, many researchers have warned that even a trivial human error may result in unexpected accidents, especially in nuclear power plants. To prevent accidents in nuclear power plants, it is indispensable to bring under effective control any factors that may raise the possibility of human error. This study aimed to develop a risk management program for guaranteeing the Fitness-for-Duty (FFD) of people working in nuclear power plants. Through a literature survey, it was found that work stress and fatigue are the major psychophysical factors requiring sophisticated management. A set of major management factors related to work stress and fatigue was derived through repeated literature surveys and classified into several categories. To maintain the fitness of human workers, a four-level approach (individual worker, team, in-plant staff, and external professionals) was adopted for the FFD management program. Moreover, the program was arranged to cover the whole employment cycle, from the selection and screening of workers to job allocation and job rotation. Also, a managerial care program was introduced for employee assistance, based on the concept of the Employee Assistance Program (EAP). The developed program was repeatedly reviewed by former nuclear power plant operators and assessed positively. As a whole, responses implied additional treatment to guarantee high performance of human workers, not only in normal operations but also in emergency situations. Consequently, the program is under administrative modification for practical application.

Keywords: fitness-for-duty (FFD), human error, work stress, fatigue, Employee-Assistance-Program (EAP)

Procedia PDF Downloads 302
297 Developing Oral Communication Competence in a Second Language: The Communicative Approach

Authors: Ikechi Gilbert

Abstract:

Oral communication is the transmission of ideas or messages through the speech process. Acquiring competence in this area, which by its volatile nature is prone to errors and inaccuracies, requires the adoption of a well-suited teaching methodology. Efficient oral communication facilitates the exchange of ideas and the easy accomplishment of day-to-day tasks through a demonstrated mastery of oral expression and the making of fine presentations to audiences or individuals, while recognizing the verbal signals and body language of others and interpreting them correctly. In Anglophone states such as Nigeria, Ghana, etc., the French language, for instance, is studied as a foreign language, taught mainly to learners whose mother tongue is different from French. The same applies to Francophone states, where English is studied as a foreign language by people whose official language or mother tongue is different from English. The ideal approach is to teach these languages in these environments through a pedagogical approach that properly takes care of the oral perspective, for effective understanding and application by the learners. In this article, we examine the communicative approach as a methodology for teaching oral communication in a foreign language. This method is a direct response to the communicative needs of the learner, involving the use of appropriate materials and teaching techniques that meet those needs. It is also a marked improvement over the traditional grammatical and audio-visual approaches. Our contribution focuses on the pedagogical component of oral communication improvement, highlighting its merits and proposing diverse techniques, including aspects of information and communication technology, that can assist the second language learner in communicating better orally.

Keywords: communication, competence, methodology, pedagogical component

Procedia PDF Downloads 266
296 Building Collapse: Factors and Resisting Mechanisms: A Review of Case Studies

Authors: Genevieve D. Fernandes, Nisha P. Naik

Abstract:

All through the ages, in all human civilizations, people have been engaged in construction activity, not only to build dwellings and house their activities, but also roads and bridges to facilitate transport, communication, etc. The main concern in this activity has been to ensure safety and prevent the collapse of buildings and other structures. But even after taking all precautions, it is impossible to guarantee safety and prevent collapse, because of several unforeseen causes: faulty construction, design errors, overloading, soil liquefaction, gas explosions, material degradation, and terrorist attacks, with economic factors also contributing to collapse. It is also uneconomical to design a structure for unforeseen events unless they have a reasonable chance of occurrence. In order to ensure safety and prevent collapse, many guidelines have been framed by local bodies and government authorities in many countries, such as the United States Department of Defense (DOD), the United States General Services Administration (GSA), and the Eurocodes in European nations. Other practices are followed to incorporate redundancy in the structure, such as detailing, ductile design, the tying of elements at particular locations, and the provision of hinges and interconnections. It must also be admitted that a foolproof, safe structural design for accidental events cannot be prepared and implemented, as it is uneconomical and the chances of such occurrences are low. This paper reviews past case studies of the collapse of structures with the aim of developing an understanding of collapse mechanisms. This study will help bring about detailed improvements in design that maximise the quality of construction at minimal cost.

Keywords: unforeseen factors, progressive collapse, collapse resisting mechanisms, column removal scenario

Procedia PDF Downloads 137
295 The Influence of Air Temperature Controls in Estimation of Air Temperature over Homogeneous Terrain

Authors: Fariza Yunus, Jasmee Jaafar, Zamalia Mahmud, Nurul Nisa’ Khairul Azmi, Nursalleh K. Chang

Abstract:

Variation of air temperature from one place to another is caused by air temperature controls. In general, the most important control of air temperature is elevation. Another significant independent variable in estimating air temperature is the location of meteorological stations. Distance to the coastline and land use type also contribute to significant variations in air temperature. On the other hand, over homogeneous terrain, direct interpolation of discrete points of air temperature works well to estimate air temperature values in unsampled areas. In this process, the estimation is solely based on discrete points of air temperature. However, this study shows that air temperature controls also play significant roles in estimating air temperature over the homogeneous terrain of Peninsular Malaysia. An Inverse Distance Weighting (IDW) interpolation technique was adopted to generate continuous air temperature data. This study compared two different datasets: observed mean monthly data T, and the estimation error T–T', where T' is the value estimated from a multiple regression model. The multiple regression model considered eight independent variables (elevation, latitude, longitude, distance to coastline, and four land use types: water bodies, forest, agriculture, and built-up areas) to represent the role of air temperature controls. A cross-validation analysis was conducted to review the accuracy of the estimated values. Final results show that the estimated values of T–T' produced lower errors for mean monthly air temperature over homogeneous terrain in Peninsular Malaysia.
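The IDW technique used in the study estimates the value at an unsampled location as a distance-weighted average of the station values. A minimal sketch with hypothetical stations follows; the coordinates, temperatures, and power parameter are illustrative, not the study's data:

```python
import numpy as np


def idw_interpolate(xy_known, values, xy_query, power=2.0):
    """Inverse Distance Weighting: estimate values at query points as a
    distance-weighted average of discrete station values."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = []
    for q in np.atleast_2d(np.asarray(xy_query, float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d == 0):                    # query coincides with a station
            out.append(values[d == 0][0])
            continue
        w = 1.0 / d ** power                  # closer stations weigh more
        out.append(np.sum(w * values) / np.sum(w))
    return np.array(out)


# Hypothetical stations with mean monthly air temperature (degrees C)
stations = [(0, 0), (10, 0), (0, 10)]
temps = [27.0, 29.0, 25.0]
est = idw_interpolate(stations, temps, [(5, 5)])
```

Because the weights depend only on distance, the estimate always stays within the range of the surrounding station values, which is why IDW behaves well over homogeneous terrain but cannot by itself capture the controls (elevation, coastline, land use) the study adds through regression.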

Keywords: air temperature control, interpolation analysis, Peninsular Malaysia, regression model, air temperature

Procedia PDF Downloads 374
294 Development of an Interactive Display-Control Layout Design System for Trains Based on Train Drivers’ Mental Models

Authors: Hyeonkyeong Yang, Minseok Son, Taekbeom Yoo, Woojin Park

Abstract:

Human error is the most salient contributing factor to railway accidents. To reduce the frequency of human errors, many researchers and train designers have adopted ergonomic design principles for designing the display-control layout in the rail cab. A number of approaches exist for designing the display-control layout based on optimization methods. However, an ergonomically optimized layout may not be the best design for train drivers, since drivers have their own mental models based on their experience. Consequently, drivers may prefer the existing display-control layout over the optimal design, and may even show better driving performance using the existing design than using the optimal one. Thus, in addition to ergonomic design principles, train drivers' mental models also need to be considered when designing the display-control layout in the rail cab. This paper developed an ergonomic assessment system for display-control layout design, and an interactive layout design system that can generate design alternatives and calculate ergonomic assessment scores in real time. The design alternatives generated by the interactive layout design system may not include the optimal design from the ergonomics point of view. However, the system's strength is that it considers train drivers' mental models, which helps generate alternatives that are more familiar and easier to use for train drivers. Also, with the developed system, non-experts in ergonomics, such as train drivers, can refine the design alternatives and improve the ergonomic assessment score in real time.
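A real-time ergonomic assessment of a layout could, for instance, weight each control's distance from a reference position by its frequency of use. The scoring rule, control names, and coordinates below are purely hypothetical and are not the paper's actual assessment model; they only illustrate how a layout score can be recomputed instantly as controls are moved:

```python
import math


def layout_score(layout, usage_freq, ref=(0.0, 0.0)):
    """Hypothetical ergonomic score: frequently used controls should sit
    close to a reference point (e.g., the driver's neutral hand position).
    Lower is better."""
    return sum(freq * math.dist(layout[name], ref)
               for name, freq in usage_freq.items())


# Two candidate layouts: layout_a places the high-frequency control nearer
# the reference point than layout_b does.
layout_a = {"throttle": (0.2, 0.1), "horn": (0.8, 0.5)}
layout_b = {"throttle": (0.8, 0.5), "horn": (0.2, 0.1)}
freq = {"throttle": 10.0, "horn": 1.0}

score_a = layout_score(layout_a, freq)
score_b = layout_score(layout_b, freq)
```

Recomputing such a score on every drag-and-drop is cheap, which is what makes real-time feedback to non-experts feasible.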

Keywords: display-control layout design, interactive layout design system, mental model, train drivers

Procedia PDF Downloads 306
293 Calculation of the Normalized Difference Vegetation Index and the Spectral Signature of Coffee Crops: Benefits of Image Filtering on Mixed Crops

Authors: Catalina Albornoz, Giacomo Barbieri

Abstract:

Crop monitoring has been shown to reduce vulnerability to spreading plagues and pathologies in crops. Remote sensing with Unmanned Aerial Vehicles (UAVs) has made crop monitoring more precise, cost-efficient, and accessible. Nowadays, remote monitoring involves calculating maps of vegetation indices using software that takes either true-color (RGB) or multispectral images as input. These maps are then used to segment the crop into management zones. Finally, the spectral signature of a crop (the reflected radiation as a function of wavelength) can be used as an input for decision-making and crop characterization. The calculation of vegetation indices using software such as Pix4D is highly precise for monoculture plantations. However, this paper shows that using such software on mixed crops may lead to errors resulting in an incorrect segmentation of the field. Within this work, the authors propose filtering out all elements other than the main crop before calculating vegetation indices and the spectral signature. A filter based on the Sobel method for edge detection is used to filter a coffee crop. Results show that the segmentation into management zones changes with respect to the traditional situation in which no filter is applied. In particular, it is shown that the values of the spectral signature change by up to 17% per spectral band. Future work will quantify the benefits of filtering through a comparison between in situ measurements and the vegetation indices obtained through remote sensing.
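The Sobel edge response on which such a filter can be built is sketched below on a tiny synthetic image; the image and the threshold are illustrative, whereas the paper applies the method to UAV imagery of the coffee field:

```python
import numpy as np

# Standard Sobel kernels for horizontal and vertical intensity gradients
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T


def sobel_magnitude(img):
    """Gradient magnitude via Sobel kernels (valid region only, no padding)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx = np.sum(patch * SOBEL_X)
            gy = np.sum(patch * SOBEL_Y)
            out[i, j] = np.hypot(gx, gy)
    return out


# A vertical step edge produces strong responses along the boundary
img = np.zeros((5, 6))
img[:, 3:] = 1.0
mag = sobel_magnitude(img)
mask = mag > mag.max() * 0.5   # keep only pixels near the detected border
```

Thresholding the magnitude yields the borders between the main crop and the surrounding elements, so pixels outside those borders can be excluded before the vegetation indices are computed.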

Keywords: coffee, filtering, mixed crop, precision agriculture, remote sensing, spectral signature

Procedia PDF Downloads 388
292 Basic Study on a Thermal Model for Evaluating the Environment of Infant Facilities

Authors: Xin Yuan, Yuji Ryu

Abstract:

The indoor environment has a significant impact on occupants, and a suitable indoor thermal environment can improve children's physical health and study efficiency during school hours. In this study, we explored the thermal environment in the classrooms of an infant facility for infants and children aged 1-5 and evaluated their thermal comfort. An infant facility in Fukuoka, Japan was selected as a case study to capture the thermal comfort characteristics of infants and children in summer and winter, from August 2019 to February 2020. Previous studies have pointed out that using PMV indices to evaluate the thermal comfort of children could introduce errors that may lead to misleading results. Thus, to grasp the actual thermal environment and thermal comfort characteristics of infants and children, we retrieved the operative temperature of each child through a thermal model, based on the sensible heat transfer from the skin to the environment, together with the measured classroom indoor temperature, relative humidity, and pocket temperature of the children's shorts. The statistical and comparative analysis of the results shows that (1) the operative temperature showed large individual differences among children, with a maximum difference of 6.25 °C; (2) the children might feel slightly cold in the classrooms in summer, as the frequencies of operative temperature within the 26-28 ºC interval were only 5.33% and 16.6% for the children, respectively; (3) the thermal environment around children is more complicated in winter, as the operative temperature could exceed or fail to reach the thermal comfort temperature zone (the 20-23 ºC interval); and (4) the environmental conditions surrounding the children may account for the reduction in their thermal comfort.
The findings contribute to improving the understanding of infants' and children's thermal comfort and provide valuable information for designers and governments to develop effective strategies for the indoor thermal environment from the perspective of children.
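As background to the operative temperature used above: in its standard definition it is the average of air temperature and mean radiant temperature, weighted by the convective and radiative heat transfer coefficients. The sketch below uses that standard definition with typical indoor coefficient values; these defaults are assumptions for illustration and are not necessarily the coefficients or the full skin heat-transfer model used in the study.

```python
def operative_temperature(t_air, t_mrt, h_c=3.1, h_r=4.7):
    """Operative temperature (degrees C): heat-transfer-weighted average of
    air temperature and mean radiant temperature. h_c and h_r, in W/(m^2 K),
    are typical indoor defaults assumed here for illustration."""
    return (h_r * t_mrt + h_c * t_air) / (h_r + h_c)


# Equal air and radiant temperatures give the same operative temperature
t_op_equal = operative_temperature(26.0, 26.0)
# Warm surrounding surfaces pull the operative temperature above air temperature
t_op_mixed = operative_temperature(24.0, 30.0)
```

This weighting is why two children in the same classroom air can experience different operative temperatures: the radiant environment around each child (windows, walls, floor) differs, consistent with the individual differences reported above.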

Keywords: infant and children, thermal environment, thermal model, operative temperature

Procedia PDF Downloads 119