Search results for: tool validation

2795 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement for maintaining data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc Sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. Unfortunately, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M Sodium Edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, different formulae were input into two spreadsheets to automate the calculations. Further checks were built into the automated system to ensure the validity of replicate analyses in titrimetric procedures. Validations were conducted using five data sets of manually computed assay results, and the acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at a 95% confidence interval) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human calculation errors were minimized when procedures were automated in quality control laboratories. The assay procedure for the formulation was completed in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
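
As a rough illustration of the statistical comparison described above, the snippet below runs a paired Student's t-test between hypothetical manually computed and spreadsheet-computed assay values (the numbers are invented; the study used five real data sets):

```python
# Hedged sketch of the validation comparison; assay values are invented.
from scipy import stats

manual = [99.2, 100.1, 98.7, 99.8, 100.4]        # % label claim, manual calculation
spreadsheet = [99.3, 100.1, 98.8, 99.8, 100.5]   # same samples, spreadsheet calculator

t_stat, p_value = stats.ttest_rel(manual, spreadsheet)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < 0.05:   # alpha = 0.05, 95% confidence level
    print("means differ significantly")
else:
    print("no significant difference between manual and spreadsheet results")
```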

Keywords: data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets

Procedia PDF Downloads 169
2794 Optimal Design of Propellant Grain Shape Based on Structural Strength Analysis

Authors: Chen Xiong, Tong Xin, Li Hao, Xu Jin-Sheng

Abstract:

Experimental and simulation research on the structural integrity of propellant grain in a solid rocket motor (SRM) with high volumetric fraction was conducted. First, using the SRM parametric modeling functions of Python, the secondary development tool of ABAQUS, three-dimensional parameterized modeling programs for star-shaped, wheel-shaped and wing cylindrical grains were developed. Then, the mechanical properties of the star-shaped grain under different loads were obtained by applying the automatically established finite element model in ABAQUS. Next, several optimization algorithms were introduced to optimize the star-shaped, wheel-shaped and wing cylindrical grains. After meeting the demands of burning surface changes and volumetric fraction, the optimum three-dimensional grain shapes were obtained. Finally, by means of the parametric modeling functions, pressure data from the SRM's cold pressurization test was directly applied to the simulation of the grain's mechanical performance. The results verify the reliability and practicality of the parameterized modeling program for SRMs.
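
For illustration, a minimal sketch of how a star-grain cross-section might be parameterized through the ABAQUS Python scripting interface is shown below. It runs only inside the ABAQUS kernel, and the part name, radii, and point count are illustrative assumptions, not the authors' program:

```python
# Hypothetical star-grain parameterization via the ABAQUS Python API
# (illustrative names and dimensions; not the paper's actual script).
import math
from abaqus import mdb
from abaqusConstants import THREE_D, DEFORMABLE_BODY

def build_star_grain(n_points=6, r_outer=0.5, r_inner=0.2, length=2.0):
    model = mdb.models['Model-1']
    sketch = model.ConstrainedSketch(name='starSection', sheetSize=2.0)
    # Trace the star-shaped bore as a closed polyline of alternating radii.
    verts = []
    for i in range(2 * n_points):
        r = r_outer if i % 2 == 0 else r_inner
        a = math.pi * i / n_points
        verts.append((r * math.cos(a), r * math.sin(a)))
    for p, q in zip(verts, verts[1:] + verts[:1]):
        sketch.Line(point1=p, point2=q)
    part = model.Part(name='starGrain', dimensionality=THREE_D,
                      type=DEFORMABLE_BODY)
    part.BaseSolidExtrude(sketch=sketch, depth=length)
    return part
```

Because every dimension is a function argument, a batch driver can regenerate the geometry and mesh for each candidate shape an optimizer proposes.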

Keywords: cold pressurization test, parametric modeling, structural integrity, propellant grain, SRM

Procedia PDF Downloads 362
2793 A Robust Optimization of Chassis Durability/Comfort Compromise Using Chebyshev Polynomial Chaos Expansion Method

Authors: Hanwei Gao, Louis Jezequel, Eric Cabrol, Bernard Vitry

Abstract:

The chassis system is composed of complex elements that take up all the loads from the tire-ground contact area, and it therefore plays an important role in numerous specifications such as durability, comfort and crash. During the development of new vehicle projects at Renault, durability validation is always the main focus, while the deployment of comfort comes later in the project. Therefore, design choices sometimes have to be reconsidered because of the natural incompatibility between these two specifications. Robustness is also an important concern, as it is related to manufacturing costs as well as to performance after the ageing of components such as shock absorbers. In this paper an approach is proposed that realizes a multi-objective optimization between chassis endurance and comfort while taking random factors into consideration. The adaptive-sparse polynomial chaos expansion (PCE) method with Chebyshev polynomial series is applied to predict the uncertainty intervals of a system's responses according to its uncertain-but-bounded parameters. The approach can be divided into three steps. First, an initial design of experiments is carried out to build the response surfaces that statistically represent a black-box system. Second, over several iterations, an optimum set is proposed and validated, forming a Pareto front; at the same time the robustness of each response, serving as an additional objective, is calculated from the pre-defined parameter intervals and the response surfaces obtained in the first step. Finally, an inverse strategy is carried out to determine the parameter tolerance combination with a maximally acceptable degradation of the responses in terms of manufacturing costs. A quarter-car model has been tested as an example by applying road excitations from actual road measurements for both the endurance and comfort calculations. One indicator based on Basquin's law is defined to compare the global chassis durability of different parameter settings; another indicator, related to comfort, is obtained from the vertical acceleration of the sprung mass. An optimum set with the best robustness has finally been obtained, and reference tests prove the good robustness prediction of the Chebyshev PCE method. This example demonstrates the effectiveness and reliability of the approach, in particular its ability to save computational costs for a complex system.
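
To make the surrogate idea concrete, the toy snippet below fits a one-dimensional Chebyshev expansion to samples of an invented black-box response and bounds that response over an uncertain-but-bounded parameter. The real study applies adaptive-sparse PCE to multi-parameter chassis responses:

```python
# Minimal 1-D illustration of a Chebyshev surrogate for an
# uncertain-but-bounded parameter (toy response, not Renault's model).
import numpy as np
from numpy.polynomial import chebyshev as C

def response(x):                 # invented black-box "comfort" response
    return np.exp(-x) * np.sin(3 * x)

a, b = 0.2, 1.8                  # parameter bounded on [a, b]
nodes = np.cos(np.pi * (np.arange(9) + 0.5) / 9)   # Chebyshev nodes in [-1, 1]
x = 0.5 * (b - a) * nodes + 0.5 * (b + a)          # map nodes to [a, b]
coeffs = C.chebfit(nodes, response(x), deg=8)      # surrogate coefficients

# Cheap surrogate evaluation to bound the response over the interval.
grid = np.linspace(-1, 1, 1001)
surrogate = C.chebval(grid, coeffs)
print("response interval = [%.4f, %.4f]" % (surrogate.min(), surrogate.max()))
```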

Keywords: chassis durability, Chebyshev polynomials, multi-objective optimization, polynomial chaos expansion, ride comfort, robust design

Procedia PDF Downloads 152
2792 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutron Sources

Authors: Mustafa Alhamdi

Abstract:

The industrial application of deep machine learning to classify gamma-ray and neutron events is investigated in this study. Identification using convolutional and recurrent neural networks has shown a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on the feature extraction method, followed by classification. The features extracted from the spectrum profiles seek patterns and relationships that represent the actual spectrum energy in a low-dimensional space; increasing the level of separation between classes in feature space improves the achievable classification accuracy. Feature extraction by neural networks is nonlinear, involving a variety of transformations and mathematical optimization, whereas principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components relative to time and using them as the training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4. The readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, improved the classification accuracy of the neural networks, and a single-prediction approach to discriminating gamma and neutron events achieved high accuracy. The findings show that classification accuracy improves when the spectrogram preprocessing stage is applied to the gamma and neutron spectra of different isotopes. Tuning the deep learning models by hyperparameter optimization enhanced the separation in the latent space and made it possible to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to the final prediction.
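
A hedged sketch of the spectrogram pre-processing step is shown below: a simulated noisy detector pulse is converted into windowed time-frequency features with SciPy. The sampling rate, pulse shape, and window settings are illustrative, not the study's Geant4 pipeline:

```python
# Illustrative spectrogram pre-processing: pulse -> time-frequency features.
import numpy as np
from scipy.signal import spectrogram

fs = 1e6                                      # assumed sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)
pulse = np.random.normal(0.0, 0.05, t.size)   # readout noise (tuned mean/variance)
pulse += np.exp(-((t - 0.005) * 2e3) ** 2)    # toy detector-like pulse

# The Hann window trades frequency leakage against time resolution.
f, seg_t, Sxx = spectrogram(pulse, fs=fs, window='hann',
                            nperseg=256, noverlap=128)
features = np.log1p(Sxx)                      # compressed spectrogram as training input
print(features.shape)                         # (frequency bins, time frames)
```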

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 150
2791 The Twain Shall Meet: First Year Writing Skills in Senior Year Project Design

Authors: Sana Sayed

Abstract:

The words objectives, outcomes, and assessment are commonplace in academia. Educators, especially those who use their emotional intelligence as a teaching tool, strive to find creative and innovative ways to connect with their students while meeting the objectives, outcomes, and assessment measures for their respective courses. However, what happens to these outcomes once the objectives have been met, students have completed a specific course, and generic letter grades have been generated? How can their knowledge and acquired skills be assessed over the course of semesters, throughout their years of study, and into their final year right before they graduate? Considering the courses students complete for different departments in various disciplines, how can these outcomes be measured, or at least maintained, across the curriculum? This research-driven paper takes the key course outcomes of first-year required writing courses and traces them in two senior-level required civil engineering design courses at the American University of Sharjah, located in the United Arab Emirates. The purpose of this research is two-fold: (1) to assess specific learning outcomes using a case study that focuses on courses from two different disciplines during two very distinctive years of study, and (2) to demonstrate how learning across the curriculum fosters life-long proficiencies among graduating students that are aligned with a university's mission statement.

Keywords: assessment, learning across the curriculum, objectives, outcomes

Procedia PDF Downloads 303
2790 A Study of the Use of English by Thais: A Case Study of English in Thai Songs

Authors: Jutharat Nawarungreung

Abstract:

As an international language, English is used as a medium in formal and informal settings, including all kinds of entertainment. The use of English in such an arena is of no less importance and interest, and it becomes a valuable tool for EFL learners to learn and improve their language. There is also a social perspective in the way English is incorporated into other nationalities' music, as well as in listeners' attitudes toward it. This research principally aimed to find out the level of comprehensibility of English inserted in Thai pop music. There were three groups of participants, namely Thais, non-native speakers who are not Thai, and native speakers, with 35 in each group. The research tools comprised song lyrics, interviews, questionnaires, and a video recorder. The participants listened to Thai songs and wrote down the English words they heard and their meanings. They were video-recorded while listening to the songs and then asked about particular actions and facial expressions. Afterwards, they were interviewed to account for their attitudes toward the incorporation of English into Thai songs. Finally, the participants completed a questionnaire. Data were analysed by comparing all the participants' responses, revealing the number of correct and incorrect answers. The study has shown that those who attained the highest level of understanding of the English words in Thai music were Thais, followed by native speakers and then non-native speakers who are not Thai.

Keywords: English throughout the world, varieties of English, English in Thai songs, intelligibility, attitudes

Procedia PDF Downloads 354
2789 Organizational Innovations of the 20th Century as High Tech of the 21st: Evidence from Patent Data

Authors: Valery Yakubovich, Shuping Wu

Abstract:

Organization theorists have long claimed that organizational innovations are nontechnological, in part because they are unpatentable. The claim rests on the assumption that organizational innovations are abstract ideas embodied in persons and contexts rather than in context-free practical tools. However, over the last three decades, organizational knowledge has been increasingly embodied in digital tools which, in principle, can be patented. To provide the first empirical evidence regarding the patentability of organizational innovations, we trained two machine learning algorithms to identify a population of 205,434 patent applications for organizational technologies (OrgTech) and, among them, 141,285 applications that use organizational innovations accumulated over the 20th century. Our event history analysis of the probability of patenting an OrgTech invention shows that ideas from organizational innovations decrease the probability of patent allowance unless they describe a practical tool. We conclude that the present-day digital transformation places organizational innovations in the realm of high tech and turns the debate about organizational technologies into the challenge of designing practical organizational tools that embody big ideas about organizing. We outline an agenda for patent-based research on OrgTech as an emerging phenomenon.

Keywords: organizational innovation, organizational technology, high tech, patents, machine learning

Procedia PDF Downloads 122
2788 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows

Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono

Abstract:

A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modeling and the high number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed with a structured grid, the number of grid points can increase so much that the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
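
As a toy illustration of the interface-reconstruction idea (the production code is Fortran; the degree and stencil here are arbitrary), a one-dimensional least-squares fit evaluated at a face location looks like this in Python:

```python
# Toy least-squares high-order reconstruction at a grid interface,
# in the spirit of quasi-ENO interpolation (illustrative only).
import numpy as np

def ls_interface_value(x_cells, u_cells, x_face, degree=3):
    """Fit a polynomial to neighbouring cell values by least squares
    and evaluate it at the face location."""
    coeffs = np.polyfit(x_cells, u_cells, degree)
    return np.polyval(coeffs, x_face)

x = np.array([-2.5, -1.5, -0.5, 0.5, 1.5])   # cell centres around a face at 0
u = np.sin(x)                                 # sampled field values
print(ls_interface_value(x, u, 0.0))          # close to sin(0) = 0
```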

Keywords: LES, multi-resolution, ENO, Fortran

Procedia PDF Downloads 366
2787 Error Analysis of Students’ Freewriting: A Study of Adult English Learners’ Errors

Authors: Louella Nicole Gamao

Abstract:

Writing in English is regarded as a complex skill and process for foreign language learners, and the errors learners commit are an inevitable part of their writing. This study aims to explore and analyze the freewriting of learners of English as a foreign language (EFL) at a university in Taiwan by identifying the categories of errors that often appear in their freewriting activity and analyzing the learners' awareness of each error. It is hoped that this study will provide further information about students' errors in English writing, contributing to a better understanding of the benefits of freewriting as a powerful tool in English writing courses for EFL classes. The present study adopted the framework of error analysis proposed by Dulay, Burt, and Krashen (1982), which consists of the compilation of data, identification of errors, classification of error types, calculation of the frequency of each error, and error interpretation. Survey questionnaires regarding students' awareness of errors were also analyzed and discussed. Using quantitative and qualitative approaches, this study provides a detailed description of the errors found in the students' freewriting output, explores the similarities and differences between the students' errors in academic writing and in freewriting, and analyzes the students' perception of their errors.
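
The framework's count-and-interpret steps reduce to a simple tally; the sketch below illustrates them on invented rater annotations:

```python
# Minimal tally following the Dulay, Burt & Krashen steps
# (identify -> classify -> count); the annotations are invented examples.
from collections import Counter

# (error_text, category) pairs a rater might produce from freewriting samples
identified_errors = [
    ("he go to school", "subject-verb agreement"),
    ("a informations", "article/plural"),
    ("she is agree", "verb form"),
    ("he go home", "subject-verb agreement"),
]
frequency = Counter(category for _, category in identified_errors)
for category, n in frequency.most_common():
    print(f"{category}: {n} ({n / len(identified_errors):.0%})")
```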

Keywords: error, EFL, freewriting, Taiwan, English

Procedia PDF Downloads 108
2786 Deep Learning and Accurate Performance Measure Processes for Cyber Attack Detection among Web Logs

Authors: Noureddine Mohtaram, Jeremy Patrix, Jerome Verny

Abstract:

As an enormous number of online services have been developed into web applications, security problems in web applications are becoming more serious. Most intrusion detection systems rely on each individual request to find a cyber-attack rather than on user behavior, and these systems can only protect web applications against known vulnerabilities rather than zero-day attacks. In order to detect new attacks, we analyze the HTTP protocol traffic of web servers to divide requests into two categories: normal requests and malicious attacks. On the other hand, the quality of the results obtained by deep learning (DL) in various areas of big data provides an important motivation to apply it to cybersecurity. Deep learning for attack detection has the potential to be robust, generalizing from small transformations to new attacks, thanks to its capability to extract high-level features. This research takes a deep learning approach to cybersecurity, classifying these two categories in order to eliminate attacks and protect the web servers of the defense sector, which encounters web traffic different from that of other sectors (such as e-commerce or web apps). The results show that by using a machine learning method, a higher accuracy rate and a lower false alarm rate can be achieved.
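
As a hedged, much-simplified illustration of the classification task (not the authors' deep learning architecture), raw HTTP request lines can be vectorized as character n-grams and fed to a small neural network:

```python
# Toy normal-vs-malicious HTTP request classifier (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

requests = [
    "GET /index.html HTTP/1.1",
    "GET /search?q=books HTTP/1.1",
    "GET /item?id=1' OR '1'='1 HTTP/1.1",          # SQL injection attempt
    "GET /<script>alert(1)</script> HTTP/1.1",     # XSS attempt
]
labels = [0, 0, 1, 1]                              # 0 = normal, 1 = malicious

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
model.fit(requests, labels)
print(model.predict(["GET /profile?id=2 UNION SELECT password HTTP/1.1"]))
```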

Keywords: anomaly detection, HTTP protocol, logs, cyber attack, deep learning

Procedia PDF Downloads 212
2785 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment

Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai

Abstract:

Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, quantification of the hazard becomes important in order to assess it. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that damage to life and property is minimized. Seismic hazard analysis plays an important role in the design of earthquake-resistant structures by providing rational values for input parameters. In this paper, both the mathematical and the computational methods adopted by researchers globally in the past five years will be discussed. Mathematical approaches involving the concepts of the Poisson distribution, convex set theory, the empirical Green's function, Bayesian probability estimation applied to seismic hazard, and the FOSM (first-order second-moment) algorithm will be discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study dynamic soil-structure interaction problems, are discussed in this paper, as are the GIS-based tools predominantly used in the assessment of seismic hazards.
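
For example, under the Poisson assumption commonly used in probabilistic seismic hazard analysis, the probability of at least one exceedance during an exposure period follows directly from the mean annual rate (the values below are illustrative):

```python
# Worked example of the Poisson model in probabilistic seismic hazard
# analysis: probability of at least one exceedance in t years.
import math

annual_rate = 0.01      # assumed mean annual rate of exceedance
t = 50                  # exposure period, years
p_exceed = 1 - math.exp(-annual_rate * t)
print(f"P(at least one exceedance in {t} yr) = {p_exceed:.3f}")  # about 0.393
```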

Keywords: computational methods, MATLAB, seismic hazard, seismic measurements

Procedia PDF Downloads 340
2784 The Relationship between Social Capital and Knowledge Sharing in the Ministry of Culture and Islamic Guidance (Iran)

Authors: Narges Sadat Myrmousavy, Maryam Eslampanah

Abstract:

The aim of this study was to investigate the relationship between social capital and knowledge sharing in the Ministry of Culture and Islamic Guidance. It is a descriptive correlational study. The study population consisted of all the expert professionals at the Ministry of Culture and Islamic Guidance headquarters in Tehran in the summer of 2012, numbering 650; targeted random sampling was used, with a sample size of 400. The data collection tool was a questionnaire prepared from a standard instrument. Regression coefficients for the relationships between the variables were examined in order to test the main hypothesis. The findings suggest a direct relationship between the structural dimension and knowledge sharing, and a direct relationship between impression management and knowledge sharing. There was no significant relationship between individual pro-social motives and knowledge sharing. Both components of the cognitive dimension, open-mindedness and competence, are directly related to knowledge sharing. Finally, comparing the different dimensions of social capital, the structural dimension is the largest, while its relationship with knowledge sharing is the weakest.

Keywords: social capital, knowledge sharing, Ministry of Culture and Islamic Guidance (Iran), open-mindedness, pro-social motives

Procedia PDF Downloads 503
2783 Whole Body Cooling Hypothermia Treatment Modelling Using a Finite Element Thermoregulation Model

Authors: Ana Beatriz C. G. Silva, Luiz Carlos Wrobel, Fernando Luiz B. Ribeiro

Abstract:

This paper presents a thermoregulation model using the finite element method to perform numerical analyses of brain cooling procedures, as a contribution to the investigation of the use of therapeutic hypothermia after ischemia in adults. Computational methods can help clinicians observe body temperature under different cooling methods without the need for invasive techniques, and can thus be a valuable tool for assisting clinical trials by simulating the different cooling options that can be used for treatment. In this work, we developed an FEM package for the solution of the continuum Pennes bioheat equation. Blood temperature changes were considered using a blood-pool approach and a lumped analysis for the intravascular catheter method of blood cooling. Analyses were performed using a three-dimensional mesh based on a complex geometry obtained from computed tomography medical images, considering a cooling blanket and an intravascular catheter. The results obtained for each case are compared in terms of brain temperature reduction within a required time, maintenance of body temperature at moderate hypothermia levels, and gradual rewarming.
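
For intuition, a minimal one-dimensional finite-difference version of the Pennes bioheat equation is sketched below. The paper solves the full 3-D finite element problem; the tissue and perfusion constants here are generic literature-style values, not the study's:

```python
# Toy 1-D explicit finite-difference Pennes bioheat model:
# rho*c*dT/dt = k*d2T/dx2 + rho_b*c_b*w_b*(T_a - T) + q_m
import numpy as np

k, rho, c = 0.5, 1050.0, 3600.0          # tissue conductivity, density, heat capacity
w_b, rho_b, c_b = 0.008, 1060.0, 3770.0  # blood perfusion rate and blood properties
q_m, T_a = 400.0, 37.0                   # metabolic heat (W/m3), arterial temp (degC)

n, L, dt = 101, 0.05, 0.05               # nodes, domain length (m), time step (s)
dx = L / (n - 1)
T = np.full(n, 37.0)
T[0] = 10.0                              # cooled surface boundary (cooling blanket)

for _ in range(12000):                   # march 10 minutes of cooling
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    perf = rho_b * c_b * w_b * (T_a - T)
    T[1:-1] += dt / (rho * c) * (k * lap + perf + q_m)[1:-1]

print(f"temperature 1 cm below the cooled surface: {T[20]:.2f} degC")
```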

Keywords: brain cooling, finite element method, hypothermia treatment, thermoregulation

Procedia PDF Downloads 311
2782 Immunization Data Quality in Public Health Facilities in Pastoralist Communities: Comparative Study Evidence from Afar and Somali Regional States, Ethiopia

Authors: Melaku Tsehay

Abstract:

The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after the above interventions in health facilities in the pastoralist communities of Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. Baseline data were collected in May 2019 and endline data in August 2021. The WHO data quality self-assessment tool (DQS) was used to collect the data. A significant improvement was seen in the accuracy of pentavalent vaccine dose 1 (PT1) data at the health posts (HP) (p = 0.012), and of PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). In addition, a highly significant improvement was observed in the accuracy of tetanus toxoid dose 2 (TT2) data at the HP (p < 0.001). The level of over- or under-reporting was found to be < 8% at the HP and < 10% at the HC for PT3. Data completeness also increased from 72.09% to 88.89% at the HC. Nearly 74% of the health facilities reported their respective immunization data on time, which is much better than the baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting the improvement of immunization data quality in pastoralist communities.
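
The DQS accuracy checks rest on a simple verification factor: doses recounted from facility tally sheets divided by the figure reported upward. A toy calculation (dose counts invented) looks like this:

```python
# DQS-style verification factor; the dose counts below are invented.
def verification_factor(recounted, reported):
    """Recounted doses / reported doses, as a percentage.
    Values near 100% indicate accurate reporting; below 100% suggests
    over-reporting, above 100% under-reporting."""
    return 100.0 * recounted / reported

print(verification_factor(recounted=480, reported=500))  # 96.0
```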

Keywords: data quality, immunization, verification factor, pastoralist region

Procedia PDF Downloads 124
2781 The Acceptance of E-Assessment Considering Security Perspective: Work in Progress

Authors: Kavitha Thamadharan, Nurazean Maarop

Abstract:

The implementation of e-assessment as a tool to support the process of teaching and learning has become a popular technological means in universities. E-assessment provides many advantages to users, especially flexibility in teaching and learning, and e-assessment systems have the capability to improve the quality of education delivery. However, there still exists a drawback in terms of security, which limits user acceptance of online learning systems. Even though there are studies providing solutions for identified security threats in e-learning usage, there is no particular model that addresses the factors influencing lecturers' acceptance of e-assessment systems from a security perspective. The aim of this study is to explore the security aspects of e-assessment in regard to acceptance of the technology. As a result, a conceptual model of secure acceptance of e-assessment is proposed, in which both human and security factors are considered. In order to increase understanding of the critical issues related to the subject of this study, an interpretive approach involving a convergent mixed-methods design is proposed for executing the research. This study will be useful in providing a more insightful understanding of the factors that influence user acceptance of e-assessment systems from a security perspective.

Keywords: secure technology acceptance, e-assessment security, e-assessment, education technology

Procedia PDF Downloads 459
2780 The Role of Inventory Classification in Supply Chain Responsiveness in a Build-to-Order and Build-To-Forecast Manufacturing Environment: A Comparative Analysis

Authors: Qamar Iqbal

Abstract:

Companies strive to improve their forecasting methods to predict fluctuations in customer demand. These fluctuations and variations in demand affect manufacturing operations and can limit a company's ability to fulfill customer demand on time. Companies keep inventory buffers and maintain stocking levels to reduce the impact of demand variation. A mid-size company deals with thousands of stock keeping units (SKUs), and it is neither easy nor efficient to control and manage each SKU individually. Inventory classification provides a tool for management to increase its ability to support customer demand. The paper presents a framework that shows how inventory classification can play a role in increasing supply chain responsiveness. A case study is presented to further elaborate the method for both build-to-order and build-to-forecast manufacturing environments, and the results are compared to show which manufacturing setting has an advantage over the other under different circumstances. The outcome of this study is very useful to management because it gives insight into how inventory classification can be used to increase the ability to respond to changing customer needs.
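
As one common concrete instance of inventory classification (the paper's own framework may differ), an ABC scheme ranks SKUs by annual consumption value and assigns classes by cumulative share:

```python
# Illustrative ABC inventory classification; SKU values are invented.
def abc_classify(skus, a_cut=0.8, b_cut=0.95):
    """skus: dict of SKU -> annual consumption value."""
    ranked = sorted(skus.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(skus.values())
    classes, running = {}, 0.0
    for sku, value in ranked:
        running += value / total
        classes[sku] = 'A' if running <= a_cut else 'B' if running <= b_cut else 'C'
    return classes

demo = {'S1': 50000, 'S2': 23000, 'S3': 9000, 'S4': 4000, 'S5': 1500, 'S6': 500}
print(abc_classify(demo))
```

Class A items would then get tight control and safety stock, while class C items can be managed with looser, cheaper policies.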

Keywords: inventory classification, supply chain responsiveness, forecast, manufacturing environment

Procedia PDF Downloads 595
2779 Coping for Academic Women Departmental Heads during COVID-19: A Capabilities Approach Perspective

Authors: Juliet Ramohai

Abstract:

This paper explores how women departmental heads in higher education experience leadership in a time of the COVID-19 crisis, focusing mostly on their care and coping as they work in virtual spaces. Most scholars have looked at the effects and challenges that different employees face while working from home during a lockdown; however, very few take a dedicated focus on women in leadership and the coping mechanisms and resources they use for effective leadership during this difficult time. The paper draws on two aspects of Sen's capabilities approach, functionings and agency, to cast a closer understanding of the institutional and individual coping mechanisms that might be at these women's disposal. The qualitative approach used for this paper and a feminist lens provide a critical and in-depth understanding of the real-life stories of the women and how they make sense of their virtual leadership. Data for this paper were collected through semi-structured interviews with 10 women in the position of head of department and analysed thematically using capabilities approach concepts as an analytical tool. The findings indicate that functionings and freedoms are tightly linked to institutional ethnographies, which might support or hamper coping for women leaders, especially during times of crisis.

Keywords: capability approach, women leaders, higher education, COVID-19

Procedia PDF Downloads 186
2778 Information Pollution: Exploratory Analysis of Sub-Saharan African Media's Capabilities to Combat Misinformation and Disinformation

Authors: Muhammed Jamiu Mustapha, Jamiu Folarin, Stephen Obiri Agyei, Rasheed Ademola Adebiyi, Mutiu Iyanda Lasisi

Abstract:

The role of information in societal development and growth cannot be over-emphasized. Information flow has long been used as a strategy for building an egalitarian society; it has equally become a tool for throwing society into chaos and anarchy, adopted as a weapon of war and a veritable instrument of psychological warfare with a variety of uses. That is why some scholars posit that information can be deployed as a weapon to wreak "mass destruction" or promote "mass development". When used as a tool for destruction, its effect on society is like an atomic bomb, which, when released, pollutes the air and suffocates the people. Technological advancement has further exposed the latent power of information, and many societies seem to be overwhelmed by its negative effects. While information remains one of the bedrocks of democracy, the information ecosystem across the world is currently facing a more difficult battle than ever before due to information pluralism and technological advancement; the more the agents involved try to combat its menace, the more difficult and complex it proves to curb. In a region like Africa, with fragile democracies enfolded in the complexities of multiple religions and cultures, inter-tribal tensions, and ongoing unresolved issues, it is important to pay critical attention to information disorder and find appropriate ways to curb or mitigate its effects. The media, being the middleman in the distribution of information, need to build capacities and capabilities to separate the chaff of misinformation and disinformation from the grains of truthful data. It has been observed that efforts aimed at fighting information pollution have not considered the built resilience of media organisations against this disorder; the efforts, resources and technologies adopted for the conception, production and spread of information pollution are much more sophisticated than the approaches to suppress or even reduce its effects on society. Thus, this study interrogates the phenomenon of information pollution and the capabilities of selected media organisations in Sub-Saharan Africa. In doing this, the following questions are probed: what are the media's actions to curb the menace of information pollution? Which of these actions are working, and how effective are they? And which of the actions are not working, and why? Adopting quantitative and qualitative approaches and anchored in Dynamic Capability Theory, the study aims to dig up insights to further understand the complexities of information pollution, media capabilities, and strategic resources for managing misinformation and disinformation in the region. The quantitative approach involves surveys and the use of questionnaires to get data from journalists on their understanding of misinformation and disinformation and their capabilities to gate-keep. Case analyses of selected media and content analysis of their strategic resources for managing misinformation and disinformation are adopted, while the qualitative approach involves in-depth interviews for a more robust analysis. The study is critical in the fight against information pollution for a number of reasons. One, it is a novel attempt to document the level of media capabilities to fight the phenomenon of information disorder. Two, the study will enable the region to have a clear understanding of the capabilities of existing media organizations to combat misinformation and disinformation in the countries that make up the region. Recommendations emanating from the study could be used to initiate, intensify or review existing approaches to combat the menace of information pollution in the region.

Keywords: disinformation, information pollution, misinformation, media capabilities, sub-Saharan Africa

Procedia PDF Downloads 161
2777 Corpus-Based Description of Core English Nouns of Pakistani English, an EFL Learner Perspective at Secondary Level

Authors: Abrar Hussain Qureshi

Abstract:

Vocabulary has been highlighted as a key indicator in any foreign language learning program, especially English as a foreign language (EFL). It is often considered a potential tool in the foreign language curriculum, and its deficiency impedes successful communication in the target language. Knowledge of the lexicon is very significant in attaining communicative competence and performance. Nouns constitute a considerable bulk of English vocabulary; indeed, they are the backbone of the English language and the main semantic carriers in spoken and written discourse. As nouns dominate the bulk of the English lexicon, their role becomes all the more important. The undertaken research is a systematic effort in this regard to work out a list of highly frequent Pakistani English nouns for EFL learners at the secondary level. It will encourage autonomy among EFL learners as well as save their time. The corpus used for the research was developed locally from leading English newspapers of Pakistan. WordSmith Tools was used to process the research data and to retrieve a word list of frequent Pakistani English nouns. The retrieved list of core Pakistani English nouns should be useful for English language learners at the secondary level, as it covers a wide range of speech events.
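
The study used WordSmith Tools; an equivalent noun-frequency pipeline can be sketched in Python with NLTK (tokenizer and tagger models are assumed to be downloaded, and the sample sentence is invented):

```python
# Hedged sketch of building a noun frequency list from newspaper text.
import collections
import nltk  # assumes 'punkt' and 'averaged_perceptron_tagger' are available

def noun_frequency_list(text, top_n=20):
    tokens = nltk.word_tokenize(text.lower())
    tagged = nltk.pos_tag(tokens)
    nouns = [w for w, tag in tagged if tag.startswith('NN')]   # NN, NNS, ...
    return collections.Counter(nouns).most_common(top_n)

sample = "The government announced new education reforms in the provinces."
print(noun_frequency_list(sample))
```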

Keywords: corpus, EFL, frequency list, nouns

Procedia PDF Downloads 103
2776 Systematic Evaluation of Convolutional Neural Network on Land Cover Classification from Remotely Sensed Images

Authors: Eiman Kattan, Hong Wei

Abstract:

In using a Convolutional Neural Network (CNN) for classification, there is a set of hyperparameters available for configuration. This study aims to evaluate the impact of a range of parameters in a CNN architecture, namely AlexNet, on land cover classification based on four remotely sensed datasets. The evaluation tests the influence of a set of hyperparameters on classification performance: epoch values, batch size, and convolutional filter size against input image size. A set of experiments was conducted to quantify the effectiveness of the selected parameters using two implementation approaches, pretrained and fine-tuned. We first explore the number of epochs under several selected batch size values (32, 64, 128 and 200). The impact of the kernel size of the convolutional filters (1, 3, 5, 7, 10, 15, 20, 25 and 30) was then evaluated against the image sizes under testing (64, 96, 128, 180 and 224), which gave us insight into the relationship between the size of the convolutional filters and the image size. To generalise the validation, four publicly available remote sensing datasets with different land covers, AID, RSD, UCMerced and RSCCN, were used in the experiments. These datasets offer a wide diversity of input data in terms of number of classes, amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (NVIDIA DIGITS) was employed in the experiments and showed efficiency in both training and testing. The results show that increasing the number of epochs leads to a higher accuracy rate, as expected; however, the convergence state is highly dependent on the dataset. For the batch size evaluation, a larger batch size slightly decreases the classification accuracy compared to a small batch size. For example, a batch size of 32 on the RSCCN dataset achieves an accuracy rate of 90.34% at the 11th epoch, while decreasing the number of epochs to one makes the accuracy rate drop to 74%. At the other extreme, increasing the batch size to 200 reduces the accuracy rate to 86.5% at the 11th epoch, and to 63% when using one epoch only. Selecting the kernel size, on the other hand, is only loosely related to the dataset; from a practical point of view, a filter size of 20 produces 70.4286%. The final image-size experiment shows that accuracy improves with larger images, although this performance gain is computationally expensive. These conclusions open opportunities toward better classification performance in various applications such as planetary remote sensing.
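
A schematic, runnable miniature of the sweep's inner loop is given below: a small CNN trained on synthetic images while batch size and kernel size vary. The real experiments used AlexNet in NVIDIA DIGITS on the four datasets:

```python
# Toy miniature of a batch-size x kernel-size sweep (illustrative only).
import itertools
import torch
import torch.nn as nn

def make_cnn(kernel, n_classes=10):
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel, padding=kernel // 2), nn.ReLU(),
        nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        nn.Linear(16 * 4 * 4, n_classes),
    )

x = torch.randn(256, 3, 64, 64)          # stand-in for a remote-sensing dataset
y = torch.randint(0, 10, (256,))

for batch, kernel in itertools.product([32, 128], [3, 7]):
    model = make_cnn(kernel)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for i in range(0, 256, batch):       # one epoch over the toy data
        opt.zero_grad()
        loss = loss_fn(model(x[i:i + batch]), y[i:i + batch])
        loss.backward()
        opt.step()
    print(f"batch={batch} kernel={kernel} loss={loss.item():.3f}")
```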

Keywords: CNNs, hyperparameters, remote sensing, land cover, land use

Procedia PDF Downloads 169
2775 Empirical Prediction of the Effect of Raindrops on DBS Systems Operating in Ku-Band (Case Study of Abuja)

Authors: Tonga Agadi Danladi, Ajao Wasiu Bamidele, Terdue Dyeko

Abstract:

Recent advancements in microwave communications technologies, especially in telecommunications and broadcasting, have resulted in congestion on the frequencies below 10 GHz. This has forced microwave designers to look to higher frequencies. Unfortunately, for frequencies greater than 10 GHz, rain becomes one of the main causes of attenuation in signal strength: at these frequencies, raindrop sizes lead to outages that compromise the availability and quality of service, making rain a critical factor in satellite link budget design. Rain rate and rain attenuation predictions are therefore vital steps to consider when designing a microwave satellite communication link operating at Ku-band frequencies (12-18 GHz). Unreliable rain rate data for tropical regions of the world like Nigeria from the Radiocommunication Sector of the International Telecommunication Union (ITU-R) makes it difficult for microwave engineers to determine a realistic rain margin to accommodate in satellite link budget design for such regions. This work presents an empirical tool for predicting the signal attenuation due to rain on DBS signals operating in the Ku-band.
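
For illustration, specific rain attenuation is commonly modeled with the ITU-R power law gamma = k * R**alpha; the coefficients below are only indicative of Ku-band values, not the paper's fitted constants:

```python
# Power-law specific-attenuation model (ITU-R P.838 style);
# k and alpha here are illustrative assumptions, not fitted values.
def rain_specific_attenuation(rain_rate_mm_h, k=0.0188, alpha=1.217):
    """Specific attenuation in dB/km for a given rain rate (mm/h)."""
    return k * rain_rate_mm_h ** alpha

for R in (10, 50, 100):   # light rain, heavy rain, tropical downpour
    print(R, "mm/h ->", round(rain_specific_attenuation(R), 2), "dB/km")
```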

Keywords: attenuation, Ku-Band, microwave communication, rain rates

Procedia PDF Downloads 485
2774 One-Step Time Series Predictions with Recurrent Neural Networks

Authors: Vaidehi Iyer, Konstantin Borozdin

Abstract:

Time series prediction problems have many important practical applications but are notoriously difficult for statistical modeling. Recently, machine learning methods have attracted significant interest as practical tools applied to a variety of problems, even though developments in this field tend to be semi-empirical. This paper explores the application of Long Short-Term Memory based Recurrent Neural Networks to the one-step prediction of time series for both trend and stochastic components. Two types of data are analyzed: daily stock prices, often considered a typical example of a random walk, and weather patterns dominated by seasonal variations. Results from both analyses are compared, and a reinforcement learning framework is used to select the more efficient of Recurrent Neural Networks and more traditional autoregression methods. It is shown that both methods are able to follow long-term trends and seasonal variations closely but have difficulties reproducing day-to-day variability. Future research directions and potential real-world applications are briefly discussed.
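
A minimal one-step-ahead LSTM predictor of the kind described can be sketched in PyTorch as follows (the toy sine series and hyperparameters are illustrative, not the authors' exact setup):

```python
# Minimal one-step-ahead LSTM on a toy seasonal series (illustrative only).
import torch
import torch.nn as nn

series = torch.sin(torch.linspace(0, 20, 400)) + 0.05 * torch.randn(400)
window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)])
X = X.unsqueeze(-1)                      # (n_samples, window, 1)
y = series[window:].unsqueeze(-1)        # next value for each window

class OneStepLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])     # predict one step ahead

model = OneStepLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
print("final MSE:", loss.item())
```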

Keywords: long short term memory, prediction methods, recurrent neural networks, reinforcement learning

Procedia PDF Downloads 229
2773 Radiomics: Approach to Enable Early Diagnosis of Non-Specific Breast Nodules in Contrast-Enhanced Magnetic Resonance Imaging

Authors: N. D'Amico, E. Grossi, B. Colombo, F. Rigiroli, M. Buscema, D. Fazzini, G. Cornalba, S. Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of nodules considered non-specific by expert radiologists in magnetic resonance mammography (MRm) with T1-weighted (T1w) sequences with paramagnetic contrast. Material and Methods: 47 cases out of 1200 undergoing MRm, in which the MRm assessment gave an uncertain classification (non-specific nodules), were admitted to the study. The clinical outcome of the non-specific nodules was later established through follow-up or further exams (biopsy), yielding 35 benign and 12 malignant. All MR images were acquired at 1.5 T: a first basal T1w sequence and then four T1w acquisitions after the paramagnetic contrast injection. After manual segmentation of the lesions by a radiologist and the extraction of 150 radiomic features (30 features at each of 5 time points), a machine learning (ML) approach was used. An evolutionary algorithm (the TWIST system, based on the KNN algorithm) was used to subdivide the dataset into training and validation sets and to select the features yielding the maximal amount of information. After this pre-processing, different machine learning systems were applied to develop a predictive model based on a training-testing crossover procedure. 10 cases with a benign nodule (follow-up older than 5 years) and 18 with an evident malignant tumor (clear malignant histological exam) were added to the dataset in order to allow the ML system to learn better from the data. Results: A Naive Bayes algorithm working on the 79 features selected by the TWIST system proved to be the best-performing ML system, with a sensitivity of 96%, a specificity of 78%, and a global accuracy of 87% (average values of the two training-testing procedures, ab-ba). Within the subset of 47 non-specific nodules, the algorithm predicted the outcome of 45 nodules that an expert radiologist could not identify. Conclusion: In this pilot study we identified a radiomic approach that allows ML systems to perform well in the diagnosis of non-specific nodules at MR mammography. This algorithm could be a great support for the early diagnosis of malignant breast tumors in cases where the radiologist is not able to identify the kind of lesion, and it reduces the necessity for long follow-up. Clinical Relevance: This machine learning algorithm could be essential in supporting the radiologist in the early diagnosis of non-specific nodules, avoiding strenuous follow-up and painful biopsy for the patient.
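
As a hedged illustration of the training-testing crossover with Naive Bayes (TWIST feature selection is replaced here by a simple univariate filter, and the radiomic features are synthetic):

```python
# Naive Bayes on synthetic "radiomic" features with a simple filter in
# place of TWIST (illustrative only; not the study's pipeline).
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(75, 150))            # 75 lesions x 150 radiomic features
y = rng.integers(0, 2, size=75)           # 0 = benign, 1 = malignant
X[y == 1, :10] += 1.0                     # plant signal in 10 features

model = make_pipeline(SelectKBest(f_classif, k=79), GaussianNB())
print(cross_val_score(model, X, y, cv=2).mean())   # two-fold "ab-ba" style
```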

Keywords: breast, machine learning, MRI, radiomics

Procedia PDF Downloads 267
2772 Evaluation of the Quality of Care for Premature Babies in the Neonatology Unit of the Centre Hospitalier Universitaire de Kamenge

Authors: Kankurize Josiane, Nizigama Mediatrice

Abstract:

Introduction: Burundi still records a high infant mortality rate. Despite efforts to reduce it, prematurity remains the leading cause of death in the neonatal period. The objective of this study was to assess the quality of care for premature babies hospitalized in the neonatology unit of the Centre Hospitalier Universitaire de Kamenge. Method: This was a descriptive and evaluative prospective study carried out in the neonatology unit of the CHUK (Centre Hospitalier Universitaire de Kamenge) from December 1, 2016, to May 31, 2017, including 70 premature babies, 65 mothers of premature babies, and 15 providers (a pediatrician and 14 nurses). The quality of care for premature babies was assessed using a tool developed by the World Health Organization and adapted to the local context by national experts. Results: Prematurity accounted for 44.05% of hospitalizations in neonatology at the University Hospital of Kamenge. The quality of care for premature babies was found to be low, with an average global score of 2/5 (50%), indicating that considerable improvement is needed to reach the standards. Conclusion: Efforts must be made to provide infrastructure, materials, and human resources sufficient in quality and quantity so that the neonatology unit of the CHUK can be efficient and optimize the care of premature babies.

Keywords: quality of care, evaluation, premature, standards

Procedia PDF Downloads 60
2771 Optimizing Oil Production through 30-Inch Pipeline in Abu-Attifel Field

Authors: Ahmed Belgasem, Walid Ben Hussin, Emad Krekshi, Jamal Hashad

Abstract:

Waxy crude oil, characterized by its high paraffin wax content, poses significant challenges in the oil and gas industry due to its increased viscosity and semi-solid state at reduced temperatures. The wax formation process, which includes precipitation, crystallization, and deposition, becomes problematic when crude oil temperatures fall below the wax appearance temperature (WAT), or cloud point. Addressing these issues, this paper introduces a technical solution designed to mitigate wax appearance and enhance the oil production process in the Abu-Attifel Field via a 30-inch crude oil pipeline. A comprehensive flow assurance study validates the feasibility and performance of this solution across various production rates, temperatures, and operational scenarios; it includes crude oil analysis to determine the WAT, as well as the evaluation and comparison of operating options for the heating stations. The study's findings indicate that maintaining the crude oil's temperature above a minimum threshold of 63°C is achievable through the strategic placement of two heating stations along the pipeline route. This approach effectively prevents wax deposition, gelling, and subsequent mobility complications, thereby improving the overall efficiency, reliability, safety, and economic viability of the production process. Moreover, the solution significantly curtails the environmental repercussions traditionally associated with wax deposition, which can accumulate up to 7,500 kg. In conclusion, the strategic placement of two heating stations addresses the wax formation problem, improves overall operational efficiency, and contributes to environmental sustainability. Further research is suggested for field data validation and the exploration of cost-benefit analysis.
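
A back-of-envelope energy balance shows why intermediate heating stations are needed: the flowing crude's temperature decays exponentially toward ambient along the line. All parameter values below are illustrative assumptions, not field data:

```python
# Steady-state temperature profile along a pipeline from the energy balance
# m_dot * cp * dT/dx = -U * pi * D * (T - T_amb). Illustrative values only.
import math

T_in, T_amb = 80.0, 25.0        # inlet and ambient temperature, degC
U = 2.0                         # overall heat-transfer coefficient, W/m2K
D = 30 * 0.0254                 # 30-inch pipe diameter in metres
m_dot, cp = 300.0, 2100.0       # mass flow (kg/s), crude heat capacity (J/kgK)

def temp_at(x_km):
    x = x_km * 1000.0
    return T_amb + (T_in - T_amb) * math.exp(-U * math.pi * D * x / (m_dot * cp))

# Where the crude drops below the 63 degC threshold suggests heater siting.
for x_km in range(0, 301, 50):
    print(f"{x_km:3d} km: {temp_at(x_km):5.1f} degC")
```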

Keywords: oil production, wax depositions, solar cells, heating stations

Procedia PDF Downloads 73
2770 Study of Mechanical Properties of Aluminium Alloys on Normal Friction Stir Welding and Underwater Friction Stir Welding for Structural Applications

Authors: Lingaraju Dumpala, Laxmi Mohan Kumar Chintada, Devadas Deepu, Pravin Kumar Yadav

Abstract:

Friction stir welding is a novel and cutting-edge joining technique widely used in the transportation, aerospace, and defense sectors. To obtain sound welded joints and good properties in friction stir welded components, it is essential to carry out this advanced process following a prescribed systematic procedure. The Underwater Friction Stir Welding (UFSW) process is currently a field of active research interest. This study of the UFSW process seeks to understand problems encountered in the past and the means by which the mechanical properties of the welded joints can be improved, leading to acceptable and efficient joints. A detailed account is given of how to modify the experimental setup from NFSW to UFSW, and the influence of tool materials, feed rates, spindle angle, load, and rotational speeds on the mechanical properties is discerned. The achieved outcomes are validated using the DEFORM-3D simulation software.

Keywords: Underwater Friction Stir Welding (UFSW), Al alloys, mechanical properties, Normal Friction Stir Welding (NFSW)

Procedia PDF Downloads 288
2769 Centre of the Milky Way Galaxy

Authors: Svanik Garg

Abstract:

The center of our galaxy is often referred to as the 'galactic center' and has many theories associated with its true nature. Given the interstellar dust and bright stars along the line of sight, it is nearly impossible to observe it optically at its position about 24,000 light-years away. Due to this uncertainty, humans have long speculated about what could exist at the point about which the entire galaxy spirals and revolves, with theories ranging from the presence of dark matter to black holes and wormholes. Data to date are limited, and conclusions are drawn to the best of the author's knowledge, as the only way to view the galactic center is through X-ray and infrared imaging, which work around the problems mentioned earlier. This paper examines, first, the existence of a galactic center; then, the methods for identifying what it might contain; and lastly, possible conclusions along with the implications of the findings. Several secondary sources, along with a Python tool to analyze X-ray readings, were used to identify the true nature of what lies at the center of the galaxy, whether it be a void due to the existence of dark energy or a black hole. Using this roughly four-part examination, a plausible definition of the galactic center was formulated, keeping in mind the theories, data, and different ideas proposed by researchers. This paper aims to dissect the theory of a galactic center and identify its nature, to help understand what it reveals about galaxies and our universe.

Keywords: milky way, galaxy, dark energy, stars

Procedia PDF Downloads 126
2768 The Antibacterial Efficacy of Gold Nanoparticles Derived from Gomphrena celosioides and Prunus amygdalus (Almond) Leaves on Selected Bacterial Pathogens

Authors: M. E. Abalaka, S. Y. Daniyan, S. O. Adeyemo, D. Damisa

Abstract:

Gold nanoparticles (AuNPs) have gained increasing interest in recent times, largely due to their special features, which include unusual optical and electronic properties, high stability and biological compatibility, controllable morphology and size dispersion, and easy surface functionalization. In a typical synthesis, AuNPs are produced by reduction of the gold salt AuCl4 in an appropriate solvent, with a stabilizing agent added to prevent the particles from aggregating. The antibacterial activity of different sizes of gold nanoparticles was investigated against Staphylococcus aureus, Salmonella typhi and Pseudomonas pneumonia using the disk diffusion method on Mueller-Hinton agar. The AuNPs were effective against all bacteria tested. The AuNPs were successfully synthesized in suspension and used to study the antibacterial activity of the two medicinal plants against selected bacterial pathogens, suggesting that AuNPs can be employed as effective bacterial inhibitors and may be a useful tool in the medical field. The study clearly showed that the AuNPs, which exhibited inhibition of the tested pathogenic bacteria in vitro, could have the same effects in vivo and thus may be useful in the medical field if researched further.

Keywords: gold nanoparticles, Gomphrena celesioides, Prunus amygdalus, pathogens

Procedia PDF Downloads 311
2767 Corporate Governance Role of Audit Committees in the Banking Sector: Evidence from Libya

Authors: Abdulaziz Abdulsaleh

Abstract:

This study aims at identifying the practices that should be taken into consideration by audit committees as a tool of corporate governance in Libyan commercial banks, by investigating various perceptions on this topic. The study is based on a questionnaire submitted to audit committee members at Libyan commercial banks, directors of internal audit departments, and members of boards of directors at these banks, in addition to a number of external auditors and academic staff from Libyan universities. The study reveals that the role of audit committees has to be shifted from traditional areas of accounting to a broader role that includes functions related to financial reporting, audit planning, supporting the independence of internal and external auditors, acting as a channel of communication between external auditors and the board of directors, reviewing external audits, and evaluating internal control systems. Although the study is a starting point in developing a framework of good audit committee practices in Libya, it is believed that the adoption of its results can enhance corporate governance practices not only in the banking sector but also in the entire corporate sector in Libya.

Keywords: audit committees, corporate governance, commercial banks, Libya

Procedia PDF Downloads 403
2766 Italian Speech Vowels Landmark Detection through the Legacy Tool 'xkl' with Integration of Combined CNNs and RNNs

Authors: Kaleem Kashif, Tayyaba Anam, Yizhi Wu

Abstract:

This paper introduces a methodology for advancing Italian speech vowel landmark detection within the distinctive-feature-based speech recognition domain. Leveraging the legacy tool 'xkl' and integrating combined convolutional neural networks (CNNs) and recurrent neural networks (RNNs), the study presents a comprehensive enhancement of the 'xkl' legacy software. The integration incorporates re-assigned spectrogram methodologies, enabling meticulous acoustic analysis, while the proposed model, combining CNNs and RNNs, demonstrates high precision and robustness in landmark detection. The addition of re-assigned spectrogram fusion within the 'xkl' software particularly enhances the precision of vowel formant estimation, which in turn yields greater accuracy in landmark detection and a substantial performance leap compared to conventional methods. In the deep learning component, the combined CNNs and RNNs are endowed with specialized temporal embeddings, self-attention mechanisms, and positional embeddings, allowing the model to capture intricate dependencies within Italian speech vowels and rendering it highly adaptable in the distinctive-feature domain. Furthermore, the temporal modeling approach employs Bayesian temporal encoding, refining the measurement of inter-landmark intervals. Comparative analysis against state-of-the-art models reveals a substantial improvement in accuracy, highlighting the robustness and efficacy of the proposed methodology. Upon rigorous testing on a database (LaMIT) of speech recorded in a silent room by four native Italian speakers, the landmark detector demonstrates exceptional performance, achieving a 95% true detection rate and a 10% false detection rate; a majority of the missed landmarks were observed in proximity to reduced vowels. These promising results underscore the robust identifiability of landmarks within the speech waveform, establishing the feasibility of employing a landmark detector as the front end of a speech recognition system. The synergistic integration of re-assigned spectrogram fusion, CNNs, RNNs, and Bayesian temporal encoding marks a significant advancement in Italian speech vowel landmark detection. This work contributes to the broader scientific community by presenting a methodologically rigorous framework for enhancing landmark detection accuracy in Italian speech vowels, and it establishes a foundation for future advancements in speech signal processing, emphasizing the model's potential in practical applications across various domains requiring robust speech recognition systems.
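
A sketch of the re-assigned spectrogram front end, using librosa's implementation in place of the 'xkl'-embedded analysis (the synthetic chirp merely stands in for LaMIT vowel audio):

```python
# Re-assigned spectrogram front end via librosa (illustrative stand-in).
import numpy as np
import librosa

sr = 16000
t = np.linspace(0, 1.0, sr, endpoint=False)
y = np.sin(2 * np.pi * (300 * t + 200 * t**2))   # toy formant-like chirp

freqs, times, mags = librosa.reassigned_spectrogram(y=y, sr=sr, n_fft=512)
log_mag = librosa.amplitude_to_db(np.abs(mags))  # sharpened time-frequency ridges
print(freqs.shape, times.shape, log_mag.shape)
```

Reassignment relocates each spectrogram cell to its energy's true time-frequency center, which is why it sharpens formant ridges relative to a plain STFT.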

Keywords: landmark detection, acoustic analysis, convolutional neural network, recurrent neural network

Procedia PDF Downloads 63