Search results for: data recognition
25426 Estimation of Missing Values in Aggregate Level Spatial Data
Authors: Amitha Puranik, V. S. Binu, Seena Biju
Abstract:
Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur in the covariates, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of them can be applied to spatial data, since such data are autocorrelated. Hence there is a need for a method that estimates the missing values of both the response variable and the covariates in spatial data while taking the spatial autocorrelation into account. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of the covariates, and (c) correlation between the covariates and the response variable. Estimating the missing values of spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlations that exist within and between covariates and between the covariates and the response variable. Precise estimation of missing data points in spatial data will increase the precision of the estimated effects of independent variables on the response variable in spatial regression analysis.
Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis
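The abstract does not state the model's equations. As a minimal sketch of the underlying idea, the snippet below imputes a missing value from its spatial neighbours with inverse-distance weights, so that nearby (autocorrelated) locations dominate the estimate. The function name, the toy coordinates, and the distance power are all hypothetical illustrations, not taken from the study.

```python
import numpy as np

def idw_impute(coords, values, missing_idx, power=2):
    """Impute a missing value by inverse-distance weighting of observed neighbours.

    coords: (n, 2) array of location coordinates.
    values: (n,) array with np.nan at missing positions.
    """
    observed = ~np.isnan(values)
    d = np.linalg.norm(coords[observed] - coords[missing_idx], axis=1)
    w = 1.0 / d**power  # closer locations get more weight (spatial autocorrelation)
    return float(np.sum(w * values[observed]) / np.sum(w))

# Toy data: the missing location sits between two nearby observed values (10, 12),
# far from a third (30), so the estimate should stay close to 10-12.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
values = np.array([10.0, 12.0, np.nan, 30.0])
est = idw_impute(coords, values, missing_idx=2)
```

A full spatial regression model would also use the covariate correlations the abstract describes; this sketch shows only the spatial-lag component.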
Procedia PDF Downloads 380
25425 Assessing the Denitrification-Decomposition Model's Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in the Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial considering escalating global concerns about climate change and the urgent need to improve agricultural sustainability. The application of the denitrification-decomposition (DNDC) model in the context of Morocco's unique agro-climate is thoroughly investigated in this study. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, all consistent with the intricate features of Moroccan agricultural environments. To verify this hypothesis, a large body of field data was gathered covering Morocco's various agricultural regions and encompassing a range of soil types, climatic factors, and crop varieties. These experimental data sets serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. In conclusion, the prospective research findings add to the global conversation on climate-resilient agricultural practices while encouraging the promotion of sustainable agricultural models in Morocco. The prospective recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions may strengthen the ability of policy architects and agricultural actors to make informed decisions that advance not only food security but also environmental stability.
Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
Procedia PDF Downloads 63
25424 Assessment of Image Databases Used for Human Skin Detection Methods
Authors: Saleh Alshehri
Abstract:
Human skin detection is a vital step in many applications. Some of these applications are critical, especially those related to security, which underscores the importance of a high-performance detection algorithm. To validate the accuracy of an algorithm, image databases are usually used. However, the suitability of these image databases remains questionable. It is suggested that suitability can be measured mainly by the span of the color space that a database covers. This research investigates the validity of three widely used image databases.
Keywords: image databases, image processing, pattern recognition, neural networks
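The abstract proposes measuring a database's suitability by the span it covers of the color space, without giving a formula. One plausible way to operationalise that idea (our assumption, not the paper's definition) is to quantise the RGB cube into cells and report the fraction of cells that contain at least one pixel:

```python
import numpy as np

def color_space_span(pixels, bins=8):
    """Fraction of the quantised RGB colour cube occupied by a pixel set.

    pixels: (n, 3) RGB values in 0..255. The cube is split into bins**3 cells;
    the span is the share of cells containing at least one pixel.
    """
    cells = (pixels.astype(np.int64) * bins) // 256       # quantise each channel
    flat = cells[:, 0] * bins * bins + cells[:, 1] * bins + cells[:, 2]
    return len(np.unique(flat)) / bins**3

# Synthetic comparison: a narrow "skin-tone-like" range versus the whole cube.
rng = np.random.default_rng(0)
skin_like = rng.integers(120, 200, size=(1000, 3))   # narrow colour range
full = rng.integers(0, 256, size=(1000, 3))          # spread over whole cube
```

Under this metric, a database confined to a narrow colour range scores much lower than one spanning the whole cube, matching the intuition in the abstract.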
Procedia PDF Downloads 270
25423 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments
Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic
Abstract:
Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition, implemented by matching pursuit, performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in “weight space”, where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning, implemented by a sparse autoencoder, learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set, selected randomly from a single district. Each speaker has 10 sentences: two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and those from the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB, and 30 dB. The algorithm has a baseline classification accuracy of ~93% averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm.
The accuracy is not affected by AWGN and remains at ~93% at 0 dB SNR.
Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder
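The core transform the abstract describes is matching pursuit over a dictionary of Gabor-like atoms. The sketch below is a generic illustration of that greedy algorithm on a toy dictionary, not a reconstruction of the paper's learned dictionary; the atom centres, frequencies, and signal are hypothetical.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=5):
    """Greedy matching pursuit: pick the atom with the largest inner product,
    subtract its contribution, and repeat. Returns sparse (index, weight) pairs."""
    residual = signal.copy()
    selection = []
    for _ in range(n_atoms):
        scores = dictionary @ residual            # correlation with every atom
        k = int(np.argmax(np.abs(scores)))
        w = scores[k]                             # atoms are unit-norm
        selection.append((k, w))
        residual = residual - w * dictionary[k]
    return selection, residual

# Dictionary of unit-norm Gabor-like atoms (Gaussian-windowed sinusoids).
t = np.linspace(0, 1, 256)
atoms = np.array([np.exp(-((t - c) ** 2) / 0.01) * np.sin(2 * np.pi * f * t)
                  for c in (0.25, 0.5, 0.75) for f in (5, 10, 20)])
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)

signal = 3.0 * atoms[4] + 0.5 * atoms[7]   # sparse combination of two atoms
sel, res = matching_pursuit(signal, atoms, n_atoms=2)
```

The selected (index, weight) pairs are exactly the sparse "weight space" representation the abstract refers to: the indices carry the T-F arrangement used for classification, and the residual norm measures how much of the signal the atoms explain.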
Procedia PDF Downloads 288
25422 Association Rules Mining and Document-Oriented NoSQL in Big Data
Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub
Abstract:
Big Data refers to recent technologies for manipulating voluminous, unstructured data sets drawn from multiple sources, and NoSQL emerged to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases. The algorithm for finding association dependencies is well suited to MapReduce. The goal of our work is to reduce the time required to generate frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
Keywords: Apriori, association rules mining, big data, data mining, Hadoop, MapReduce, MongoDB, NoSQL
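For readers unfamiliar with the baseline, the sketch below is a minimal single-machine Apriori: it grows candidate itemsets level by level and keeps only those meeting a minimum support. The paper's contribution is parallelising this counting step with MapReduce over a document store such as MongoDB; that distribution is not shown here, and the grocery transactions are purely illustrative.

```python
def apriori(transactions, min_support):
    """Minimal level-wise Apriori. Support = fraction of transactions
    containing the itemset; only frequent k-sets generate (k+1)-candidates."""
    n = len(transactions)
    frequent = {}
    level = [frozenset([i]) for i in sorted({i for t in transactions for i in t})]
    while level:
        counts = {c: sum(1 for t in transactions if c <= t) for c in level}
        kept = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update(kept)
        # Candidate generation: unions of frequent k-itemsets differing in one item.
        level = list({a | b for a in kept for b in kept if len(a | b) == len(a) + 1})
    return frequent

transactions = [{"bread", "milk"}, {"bread", "butter"},
                {"bread", "milk", "butter"}, {"milk"}]
freq = apriori(transactions, min_support=0.5)
```

In a MapReduce setting, the `counts` dictionary would be produced by mappers emitting (candidate, 1) pairs and reducers summing them per candidate.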
Procedia PDF Downloads 158
25421 Comparative Study of Skeletonization and Radial Distance Methods for Automated Finger Enumeration
Authors: Mohammad Hossain Mohammadi, Saif Al Ameri, Sana Ziaei, Jinane Mounsef
Abstract:
Automated enumeration of the number of hand fingers is widely used in several motion-gaming and distance-control applications and is discussed in several published papers as a building block for hand recognition systems. An automated finger enumeration technique should not only be accurate but must also respond quickly to a moving-picture input: the high frame rate of video in motion games or distance control constrains the program's overall speed, since image processing software such as MATLAB must produce results at high computation speeds. Since automated finger enumeration with minimum error and processing time is desired, a comparative study between two finger enumeration techniques is presented and analyzed in this paper. In the pre-processing stage, various image processing functions were applied to a real-time video input to obtain a final cleaned, auto-cropped image of the hand to be used for the two techniques. The first technique uses the known morphological tool of skeletonization and counts the skeleton's endpoints as fingers. The second technique uses a radial distance method, which reduces the hand to a one-dimensional representation, to enumerate the fingers. For both methods, the different steps of the algorithms are explained. A comparative study then analyzes the accuracy and speed of both techniques. Through experimental testing in different background conditions, it was observed that the radial distance method was more accurate and more responsive to a real-time video input than the skeletonization method. All test results were generated in MATLAB and were based on displaying a human hand in three different orientations on top of a plain-color background. Finally, the limitations surrounding the enumeration techniques are presented.
Keywords: comparative study, hand recognition, fingertip detection, skeletonization, radial distance, MATLAB
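The radial distance method reduces the hand to a one-dimensional signature: distances from the palm centre to the contour, in which each finger appears as a peak. The sketch below counts such peaks by thresholding a synthetic signature; the bump positions, threshold ratio, and signature shape are hypothetical, not the paper's parameters.

```python
import numpy as np

def count_fingers_radial(distances, threshold_ratio=0.7):
    """Count fingers as peaks in the radial-distance signature: contiguous runs
    of centre-to-contour distances above a fraction of the maximum distance."""
    threshold = threshold_ratio * distances.max()
    above = distances > threshold
    # Count rising edges: transitions from below-threshold to above-threshold.
    rising = np.sum(above[1:] & ~above[:-1]) + int(above[0])
    return int(rising)

# Synthetic signature: three finger-like bumps over a palm baseline.
theta = np.linspace(0, np.pi, 180)
signature = 1.0 + sum(np.exp(-((theta - c) ** 2) / 0.005) for c in (0.6, 1.3, 2.0))
fingers = count_fingers_radial(signature)
```

In a real pipeline the signature would come from the segmented hand contour produced in the pre-processing stage, not from a synthetic curve.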
Procedia PDF Downloads 381
25420 Immunization Data Quality in Public Health Facilities in the Pastoralist Communities: A Comparative Study with Evidence from Afar and Somali Regional States, Ethiopia
Authors: Melaku Tsehay
Abstract:
The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after the above interventions in health facilities in the pastoralist communities in Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data was collected in May 2019 and the end-line data in August 2021. The WHO data quality self-assessment tool (DQS) was used to collect data. A significant improvement was seen in the accuracy of pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and in PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). Besides, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at HP (p < 0.001). The level of over- or under-reporting was found to be <8% at the HP and <10% at the HC for PT3. Data completeness also increased from 72.09% to 88.89% at the HC. Nearly 74% of the health facilities reported their respective immunization data on time, which is much better than the baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting improved immunization data quality in the pastoralist communities.
Keywords: data quality, immunization, verification factor, pastoralist region
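The keywords mention a verification factor, the standard DQS accuracy metric: doses recounted from source records divided by doses reported upward. A minimal sketch, with entirely hypothetical facility figures:

```python
def verification_factor(recounted, reported):
    """DQS-style verification factor: doses found in source records (tallies)
    divided by doses reported upward. 1.0 means perfectly accurate reporting;
    values below 1.0 indicate over-reporting, above 1.0 under-reporting."""
    return recounted / reported

# Hypothetical facility figures for one antigen.
vf = verification_factor(recounted=460, reported=500)
accuracy_gap_pct = abs(1 - vf) * 100   # percentage over- or under-reporting
```

Against the thresholds in the abstract, this hypothetical facility's 8% over-reporting would sit at the limit quoted for health posts (<8%) and inside the one for health centers (<10%).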
Procedia PDF Downloads 120
25419 The Comparative Study of Attitudes toward Entrepreneurial Intention between ASEAN and Europe: An Analysis Using GEM Data
Authors: Suchart Tripopsakul
Abstract:
This paper uses data from the Global Entrepreneurship Monitor (GEM) to investigate differences in attitudes towards entrepreneurial intention (EI). EI is generally assumed to be the single most relevant predictor of entrepreneurial behavior. The aim of this paper is to examine the effect of a range of attitudes on an individual's intent to start a new venture. A cross-cultural comparison between Asia and Europe is used to further investigate possible differences between potential entrepreneurs from these distinct national contexts. The empirical analysis includes a GEM data set of 10 countries (n = 10,306) collected in 2013. Logistic regression is used to investigate the effect of individuals' attitudes on EI. Independent variables include individuals' perceived capabilities, the ability to recognize business opportunities, entrepreneurial networks, and risk perceptions, as well as a range of socio-cultural attitudes. Moreover, a cross-cultural comparison of the model is conducted, including six ASEAN nations (Malaysia, Indonesia, the Philippines, Singapore, Vietnam, and Thailand) and four European nations (Spain, Sweden, Germany, and the United Kingdom). The findings support the relationship between individuals' attitudes and their entrepreneurial intention. Individual capability, opportunity recognition, networks, and a range of socio-cultural perceptions all influence EI significantly. The impact of media attention on entrepreneurship was found to influence EI in ASEAN, but not in Europe. Conversely, fear of failure was found to influence EI in Europe, but not in ASEAN. The paper thus develops and empirically tests a model of attitudes toward entrepreneurial intention in ASEAN and Europe, where, interestingly, fear of failure had no significant effect while the impact of media attention did.
Moreover, the resistance of ASEAN entrepreneurs to otherwise high rates of fear of failure, together with the high impact of media attention, is proposed as an explanation for the relatively high rates of entrepreneurial activity in ASEAN reported by GEM. The paper utilizes a representative sample of 10,306 individuals in 10 countries. A range of attitudes was found to significantly influence entrepreneurial intention. Many of these perceptions, such as the impact of media attention on entrepreneurship, can be shaped by government policy. The paper also suggests strategies by which Asian economies in particular can benefit from their apparently high impact of media attention on entrepreneurship.
Keywords: entrepreneurial intention, attitude, GEM, ASEAN and Europe
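The analysis the abstract describes is a logistic regression of EI on attitude variables. The sketch below fits such a model by plain gradient descent on synthetic data; the predictor names, coefficients, sample size, and random data are all hypothetical stand-ins for the GEM variables, not the study's estimates.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Batch gradient descent for logistic regression: P(EI=1) = sigmoid(Xw + b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# Hypothetical predictors: perceived capability, opportunity recognition, fear of failure.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
true_w = np.array([1.5, 1.0, -1.0])            # fear of failure lowers intention
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)
w, b = fit_logistic(X, y)
```

The recovered signs (positive for capability and opportunity recognition, negative for fear of failure) mirror the qualitative pattern the paper reports for Europe; in ASEAN, the fear-of-failure coefficient would be expected to be insignificant.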
Procedia PDF Downloads 311
25418 Identifying Critical Success Factors for Data Quality Management through a Delphi Study
Authors: Maria Paula Santos, Ana Lucas
Abstract:
Organizations support their operations and decision making with the data they have at their disposal, so the quality of these data is remarkably important. Data Quality (DQ) is currently a relevant issue, and the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSFs) for Data Quality Management (DQM), which were presented to a panel of experts who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSFs for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.
Keywords: critical success factors, data quality, data quality management, Delphi, Q-sort
Procedia PDF Downloads 216
25417 Information Visualization Methods Applied to Nanostructured Biosensors
Authors: Osvaldo N. Oliveira Jr.
Abstract:
The control of molecular architecture inherent in some experimental methods to produce nanostructured films has had great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides, and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy, and impedance spectroscopy. In this presentation, an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and Leishmaniasis. Optimization of biosensing may include combining another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy.
Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique
Procedia PDF Downloads 334
25416 Towards an Understanding of Social Capital in an Online Community of Filipino Music Artists
Authors: Jerome V. Cleofas
Abstract:
Cyberspace has become a viable arena for budding artists to share musical acts in digital forms. The increasing relevance of online communities has attracted scholars from various fields, demonstrating their influence on social capital. This paper extends this understanding of social capital to Filipino music artists belonging to the SoundCloud Philippines Facebook group. The study makes use of qualitative data obtained from key-informant interviews and participant observation of online and physical encounters, analyzed using the case study approach. SoundCloud Philippines has over seven hundred members, composed of Filipino singers, instrumentalists, composers, arrangers, producers, multimedia artists, and event managers. Group interactions are a mix of online encounters based on Facebook and SoundCloud and physical encounters through meet-ups and events. Benefits reaped from the community are informational, technical, instrumental, promotional, motivational, and social support. Under the guidance of online group administrators, collaborative activities such as music productions, concerts, and events transpire. Most conflicts and problems that arise are resolved peacefully. Social capital in SoundCloud Philippines is mobilized through recognition, respect, and reciprocity.
Keywords: Facebook, music artists, online communities, social capital
Procedia PDF Downloads 317
25415 Job Stress Among Nurses of the Emergency Department of a Selected Saudi Hospital
Authors: Mahmoud Abdel Hameed Shahin
Abstract:
Job demands that are incompatible with an employee's skills, resources, or needs cause unpleasant emotional and physical reactions known as job stress. Nurses offer care in hospital emergency rooms all around the world, and since they operate in such a dynamic and unpredictable setting, they are constantly under pressure. Job stress has been found to have harmful impacts on nurses' health as well as their capacity to handle the demands of their jobs. The purpose of this study was to evaluate the level of job stress experienced by the emergency department nurses at King Fahad Specialist Hospital in Buraidah City, Saudi Arabia. In October 2021, a cross-sectional descriptive study was conducted. Eighty nurses, most of them working in the emergency department of King Fahad Specialist Hospital, were conveniently selected for the study. An electronic questionnaire with a sociodemographic data sheet and a job stress scale was given to the participating nurses after ethical approval was received from the Ministry of Health's representative bodies. Using SPSS Version 26, both descriptive and inferential statistics were employed to analyze and tabulate the acquired data. According to the findings, the factors that contributed most to job stress in the clinical setting were having an excessive amount of work to do and working under arbitrary deadlines, whereas the factors that contributed the least stress related to receiving proper recognition or rewards for good work. In the emergency room of King Fahad Specialist Hospital, nurses had a moderate level of stress (M = 3.32 ± 0.567 out of 5). Levels of job stress varied greatly with experience, with nurses with less than a year of experience notably experiencing the lowest levels. The amount of job stress did not differ significantly based on the emergency nurses' age, nationality, gender, marital status, position, or level of education.
Hospitals should identify and alleviate the causes and impact of stress on emergency nurses through the implementation of interventional programs.
Keywords: emergency nurses, job pressure, Qassim, Saudi Arabia, job stress
Procedia PDF Downloads 187
25414 Data Mining in the Medicine Domain Using Decision Trees and Support Vector Machines
Authors: Djamila Benhaddouche, Abdelkader Benyettou
Abstract:
In this paper, we used data mining to extract biomedical knowledge. In general, the complex biomedical data collected in population studies are treated by statistical methods; although these methods are robust, they are not sufficient in themselves to harness the potential wealth of the data. For that purpose, we used two supervised learning algorithms: decision trees and Support Vector Machines (SVM). These classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
Keywords: biomedical data, learning, classifier, decision tree algorithms, knowledge extraction
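A decision tree is built by repeatedly choosing the feature/threshold split that best separates the classes; the sketch below shows just that basic step as a one-level tree (a decision stump). The thyroid-style data, the feature interpretation, and the error criterion are illustrative assumptions, not the paper's data or exact tree-induction method.

```python
import numpy as np

def best_stump(X, y):
    """One-level decision tree: pick the (feature, threshold) split with the
    lowest misclassification error, the basic step a tree learner repeats."""
    best = (0, 0.0, 1.0)                      # (feature, threshold, error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            pred = (X[:, j] > t).astype(int)
            err = min(np.mean(pred != y), np.mean(pred == y))  # either polarity
            if err < best[2]:
                best = (j, float(t), float(err))
    return best

# Hypothetical thyroid-style data: feature 1 (say, a hormone level) separates classes.
X = np.array([[1.0, 0.2], [1.2, 0.3], [0.9, 2.1], [1.1, 2.4], [1.0, 2.2], [1.3, 0.1]])
y = np.array([0, 0, 1, 1, 1, 0])
feat, thr, err = best_stump(X, y)
```

A full tree learner applies this search recursively to each resulting subset (usually with an impurity measure such as entropy or Gini instead of raw error).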
Procedia PDF Downloads 556
25413 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease
Authors: Usama Ahmed
Abstract:
Data mining is the process of analyzing data to extract helpful information for prediction. It is a field of research that addresses various types of problems. Within data mining, classification is an important technique for classifying different kinds of data. Diabetes is one of the most common diseases. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and determines which algorithm works best. The best classification algorithm on the diabetes data is Naïve Bayes, with an accuracy of 76.31% and a model build time of 0.06 seconds.
Keywords: data mining, classification, diabetes, WEKA
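WEKA's NaiveBayes models each numeric feature per class with a Gaussian. The self-contained sketch below reproduces that idea on tiny hypothetical diabetes-style data (glucose, BMI); it is an illustration of the algorithm, not of WEKA's implementation details or the paper's dataset.

```python
import numpy as np

def gnb_fit(X, y):
    """Gaussian Naive Bayes: per class, store feature means, variances, and prior."""
    model = {}
    for c in np.unique(y):
        Xc = X[y == c]
        model[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(y))
    return model

def gnb_predict(model, x):
    """Pick the class with the highest log posterior under feature independence."""
    def log_post(c):
        mu, var, prior = model[c]
        return np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return max(model, key=log_post)

# Hypothetical (glucose, BMI) samples: class 1 = diabetic-like profile.
X = np.array([[85, 22.0], [90, 24.0], [95, 23.0], [160, 33.0], [150, 35.0], [170, 31.0]])
y = np.array([0, 0, 0, 1, 1, 1])
model = gnb_fit(X, y)
pred = gnb_predict(model, np.array([155, 34.0]))
```

The "build" step is just computing per-class means and variances, which is why Naïve Bayes trains in fractions of a second, as the 0.06 s figure above suggests.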
Procedia PDF Downloads 145
25412 Comprehensive Study of Data Science
Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly
Abstract:
Today's generation is heavily dependent on technology that uses data as its fuel. The present study is about innovations and developments in data science and gives an idea of how to use the available data efficiently. This study will help readers understand the core concepts of data science. The concept of artificial intelligence, introduced by Alan Turing, rests on the principle of creating an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of its users. Data science comprises business understanding, data analysis, ethical concerns, programming languages, various fields and sources of data, skills, and more. The usage of data science has evolved over the years. In this review article, we cover one part of data science, namely machine learning, which builds on data science for its work. Machines learn through experience, which helps them do their work more efficiently. This article includes a comparative illustration of human understanding versus machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in human life. Since the advent of data science, we have seen its benefits: how it leads to a better understanding of people and caters to individual needs. It has improved business strategies, the services businesses provide, forecasting, and the ability to attain sustainable development. This study also focuses on a better understanding of data science, which will help us create a better world.
Keywords: data science, machine learning, data analytics, artificial intelligence
Procedia PDF Downloads 80
25411 Dual Biometrics Fusion Based Recognition System
Authors: Prakash, Vikash Kumar, Vinay Bansal, L. N. Das
Abstract:
Dual biometrics is a subpart of multimodal biometrics, which refers to the use of a variety of modalities, rather than just one, to identify and authenticate persons. Mixing several modalities limits the risk of mistakes and leaves attackers only a tiny possibility of collecting the information needed to defeat the system. Our goal is to collect the precise characteristics of the iris and palmprint, produce a fusion of both modalities, and ensure that authentication succeeds only when the biometrics match a particular user. After combining the modalities, we created an effective strategy with a mean DI of 2.41 and an Equal Error Rate (EER) of 5.21, respectively, and propose a biometric system on this basis.
Keywords: multimodal, fusion, palmprint, iris, EER, DI
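The abstract does not specify the fusion rule. A common approach in multimodal biometrics, shown here purely as an illustrative assumption, is score-level fusion: min-max normalise each matcher's score and combine with a weighted sum. The matcher outputs, score ranges, and weight below are hypothetical.

```python
def fuse_scores(iris_score, palm_score, iris_range, palm_range, w_iris=0.5):
    """Score-level fusion: min-max normalise each matcher's score to [0, 1],
    then take a weighted sum. A higher fused score means a better match."""
    norm = lambda s, rng: (s - rng[0]) / (rng[1] - rng[0])
    return w_iris * norm(iris_score, iris_range) + (1 - w_iris) * norm(palm_score, palm_range)

# Hypothetical matcher outputs on their observed score ranges.
genuine = fuse_scores(iris_score=0.9, palm_score=85, iris_range=(0, 1), palm_range=(0, 100))
impostor = fuse_scores(iris_score=0.3, palm_score=30, iris_range=(0, 1), palm_range=(0, 100))
```

The EER is then found by sweeping a threshold over fused scores until the false-accept and false-reject rates are equal; widening the gap between genuine and impostor fused scores is what drives the EER down.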
Procedia PDF Downloads 145
25410 Current Issues of Cross-Border Enforcement
Authors: Gábor Kocsmárik
Abstract:
The topic of this paper is coercive enforcement measures against assets in proceedings that contain a foreign element. We speak of cross-border enforcement when the debtor, the party requesting enforcement, or the property subject to enforcement is not located in the country concerned. Given that the jurisdiction of a country cannot extend beyond its borders, the cooperation of nations and the mutual recognition of their decisions are necessary to overcome this limitation. In addition, it is essential to create framework rules that are binding and enforceable for each country participating in a convention. The study presents some conventions between countries that are still in force, which can serve as a starting point for dealing with existing problems.
Keywords: law, enforcement, civil procedure law, international law
Procedia PDF Downloads 34
25409 Pupil Size: A Measure of Identification Memory in Target-Present Lineups
Authors: Camilla Elphick, Graham Hole, Samuel Hutton, Graham Pike
Abstract:
Pupil size has been found to change irrespective of luminosity, suggesting that it can be used to make inferences about cognitive processes such as cognitive load. To see whether identifying a target imposes a different cognitive load than rejecting distractors, the effect of viewing a target (compared with viewing distractors) on pupil size was investigated using a sequential video lineup procedure with two lineup sessions. Forty-one participants were recruited at random through the university. Pupil sizes were recorded when viewing pre-target distractors and post-target distractors and compared with pupil size when viewing the target. Overall, pupil size was significantly larger when viewing the target than when viewing distractors. In the first session, pupil size changes differed significantly between participants who identified the target (Hits) and those who did not. Specifically, the pupil size of Hits reduced significantly after viewing the target (by 26%), suggesting that cognitive load reduced following identification. The pupil sizes of Misses (who made no identification) and False Alarms (who misidentified a distractor) did not reduce, suggesting that cognitive load remained high in participants who failed to make the correct identification. In the second session, pupil sizes were smaller overall, suggesting that cognitive load was lower in this session, and there was no significant difference between Hits, Misses, and False Alarms. Furthermore, while the frequency of Hits increased, so did that of False Alarms. These two findings suggest that the benefits of including a second session remain uncertain, as it provided neither greater accuracy nor a reliable way to measure it. It is concluded that pupil size is a measure of face recognition strength in the first session of a target-present lineup procedure.
However, it is still not known whether cognitive load adequately explains this effect, or whether cognitive engagement might describe it more appropriately. If cognitive load and cognitive engagement can be teased apart by further investigation, this would have positive implications for understanding eyewitness identification. Nevertheless, this research has the potential to provide a tool for improving the reliability of lineup procedures.
Keywords: cognitive load, eyewitness identification, face recognition, pupillometry
Procedia PDF Downloads 403
25408 A CORDIC-Based Design Technique for Efficient Computation of DCT
Authors: Deboraj Muchahary, Amlan Deep Borah, Abir J. Mondal, Alak Majumder
Abstract:
A discrete cosine transform (DCT) is described, and a technique to compute it using the fast Fourier transform (FFT) is developed. In this work, the DCT of a finite-length sequence is obtained by incorporating the CORDIC methodology into a radix-2 FFT algorithm. The proposed methodology is simple to comprehend and maintains a regular structure, thereby reducing computational complexity. DCTs are used extensively in digital signal processing for pattern recognition, so efficient computation of the DCT while maintaining a transparent design flow is highly desirable.
Keywords: DCT, DFT, CORDIC, FFT
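The DCT-from-FFT mapping the abstract relies on can be sketched as follows (the standard even/odd reordering: even-index samples followed by reversed odd-index samples, one N-point FFT, then a per-bin phase rotation). The paper's contribution, replacing the FFT butterfly multipliers with CORDIC rotations, is a hardware detail not shown here; this sketch only verifies the DCT-II route against a direct O(N²) computation.

```python
import numpy as np

def dct_via_fft(x):
    """Unnormalised DCT-II of length N via a single N-point FFT: reorder the
    input (even indices, then reversed odd indices), FFT, and rotate each bin
    by exp(-j*pi*k/(2N))."""
    N = len(x)
    v = np.concatenate([x[::2], x[1::2][::-1]])
    k = np.arange(N)
    return 2.0 * np.real(np.exp(-1j * np.pi * k / (2 * N)) * np.fft.fft(v))

def dct_direct(x):
    """Textbook O(N^2) DCT-II, used here only to check the FFT route."""
    N = len(x)
    n, k = np.arange(N), np.arange(N)[:, None]
    return 2.0 * np.sum(x * np.cos(np.pi * k * (2 * n + 1) / (2 * N)), axis=1)

x = np.array([1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 0.0])
```

The per-bin complex rotation is exactly the kind of operation a CORDIC stage can implement with shifts and adds instead of multipliers.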
Procedia PDF Downloads 477
25407 Mesalazine-Induced Myopericarditis in a Professional Athlete
Authors: Tristan R. Fraser, Christopher D. Steadman, Christopher J. Boos
Abstract:
Myopericarditis is an inflammatory syndrome characterised by clinical diagnostic criteria for pericarditis, such as chest pain, combined with evidence of myocardial involvement, such as elevation of biomarkers of myocardial damage, e.g., troponins. It can rarely be a complication of therapeutics used for dysregulated immune-mediated diseases such as inflammatory bowel disease (IBD), for example, mesalazine. The infrequency of mesalazine-induced myopericarditis adds to the challenge of recognising it; rapid diagnosis and the early introduction of treatment are crucial. This case report follows a 24-year-old professional footballer with a past medical history of ulcerative colitis, recently started on mesalazine for disease control. Three weeks after mesalazine was initiated, he was admitted with fever, shortness of breath, chest pain worse whilst supine and on deep inspiration, and an elevated venous blood cardiac troponin T level (cTnT, 288 ng/L; normal: <13 ng/L). Myocarditis was confirmed on initial inpatient cardiac MRI, which revealed florid myocarditis with preserved left ventricular systolic function and an ejection fraction of 67%. This longitudinal case study followed the progress of a single individual with myopericarditis across four acute hospital admissions over nine weeks, with admissions ranging from two to five days. Parameters examined included clinical signs and symptoms, serum troponin, transthoracic echocardiography, and cardiac MRI. Serial measurements of cardiac function, including cardiac MRI and transthoracic echocardiography, showed progressive deterioration of cardiac function whilst mesalazine was continued. Prior to cessation of mesalazine, transthoracic echocardiography revealed a small global pericardial effusion of <1 cm and worsening left ventricular systolic function with an ejection fraction of 45%.
After recognition of mesalazine as a potential cause and consequent cessation of the drug, symptoms resolved, with outpatient cardiac MRI showing resolution of the myocardial oedema. The patient plans to return to competitive sport. Patients suffering from myopericarditis are advised to refrain from competitive sport for at least six months in order to reduce the risk of cardiac remodelling and sudden cardiac death. Additional considerations must be made for individuals for whom competitive sport is an essential component of their livelihood, such as professional athletes. Myopericarditis is an uncommon but potentially serious medical condition with a wide variety of aetiologies, including viral, autoimmune, and drug-related causes. Management is mainly supportive and relies on prompt recognition and removal of the aetiological agent. Mesalazine-induced myopericarditis is rare; as such, increasing awareness of mesalazine as a precipitant of myopericarditis is vital for optimising the management of these patients.
Keywords: myopericarditis, mesalazine, inflammatory bowel disease, professional athlete
Procedia PDF Downloads 135

25406 Sorting Fish by Hu Moments
Authors: J. M. Hernández-Ontiveros, E. E. García-Guerrero, E. Inzunza-González, O. R. López-Bonilla
Abstract:
This paper presents the implementation of an algorithm that identifies and counts different fish species: catfish, sea bream, sawfish, tilapia, and totoaba. The main contribution of the method is the fusion of the position, rotation, and scale invariance of the Hu moments with the proper counting of fish. The identification and counting are performed on an image under different noise conditions. The experimental results indicate the potential of the proposed algorithm to be applied in different scenarios of aquaculture production.
Keywords: counting fish, digital image processing, invariant moments, pattern recognition
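The invariance property this method builds on can be sketched as follows (a minimal NumPy illustration, not the authors' implementation; the `hu_moments` helper and the toy silhouette are assumptions for demonstration). The first four Hu moments are computed from scale-normalized central moments and checked for invariance under translation and 90° rotation:

```python
import numpy as np

def hu_moments(img):
    """First four Hu invariant moments of a 2-D intensity array."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]].astype(float)
    m00 = img.sum()
    xb, yb = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):  # central moment (translation-invariant)
        return (((x - xb) ** p) * ((y - yb) ** q) * img).sum()

    def eta(p, q):  # scale-normalized central moment
        return mu(p, q) / m00 ** ((p + q) / 2 + 1)

    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    e30, e03, e21, e12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = e20 + e02
    h2 = (e20 - e02) ** 2 + 4 * e11 ** 2
    h3 = (e30 - 3 * e12) ** 2 + (3 * e21 - e03) ** 2
    h4 = (e30 + e12) ** 2 + (e21 + e03) ** 2
    return np.array([h1, h2, h3, h4])

# An asymmetric binary "fish" silhouette; rotating it must not change the invariants.
shape = np.zeros((16, 16))
shape[4:12, 3:13] = 1
shape[6:10, 13:15] = 1  # tail
print(np.allclose(hu_moments(shape), hu_moments(np.rot90(shape))))  # True
```

In a full pipeline, each segmented fish blob would be reduced to such a moment vector and matched against per-species references, which is what makes the recognition robust to position, rotation, and scale.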
Procedia PDF Downloads 406

25405 Francophone University Students' Attitudes Towards English Accents in Cameroon
Authors: Eric Agrie Ambele
Abstract:
The norms and models for learning pronunciation are key issues nowadays in English Language Teaching in ESL contexts. This paper discusses these issues based on a study of the attitudes of some Francophone university students in Cameroon towards three English accents spoken in Cameroon: Cameroon Francophone English (CamFE), Cameroon English (CamE), and Hyperlectal Cameroon English (near standard British English). With the desire to know more about the treatment these English accents receive among these students, an aspect that had hitherto received little attention in the literature, a language attitude questionnaire and the matched-guise technique were used to investigate this phenomenon. Two methods of data analysis were employed: (1) the percentage count procedure and (2) the semantic differential scale. The findings reveal that the participants' attitudes towards the selected accents vary in degree. Though Hyperlectal CamE emerged first, CamE second, and CamFE third, no accent, on average, received a negative evaluation. It can be deduced from these findings, first, that CamE is gaining more and more recognition and can stand as an autonomous accent; second, that the participants all rated Hyperlectal CamE higher than CamE implies that they would be less motivated in a context where CamE is the learning model. By implication, in the teaching of English pronunciation to Francophone learners in Cameroon, Hyperlectal Cameroon English should be the model.
Keywords: teaching pronunciation, English accents, Francophone learners, attitudes
Procedia PDF Downloads 192

25404 Employee Engagement
Authors: Jai Bakliya, Palak Dhamecha
Abstract:
Today, customer satisfaction is given utmost priority in every industry, but this applies even more to the hospitality industry, where employees come into direct contact with customers while providing services. Employee engagement is a relatively new concept adopted by Human Resource departments that impacts customer satisfaction. To satisfy your customers, it is necessary to ensure that the employees in the organisation are satisfied and engaged enough in their work to meet the company's expectations and contribute to achieving the company's goals and objectives; after all, employees are the human capital of the organisation. Employee engagement has become a top business priority for every organisation. In this fast-moving economy, business leaders know that having a high-performing human resource is important for growth and survival. They recognize that a highly engaged workforce can increase innovation, productivity, and performance, while reducing costs related to retention and hiring in highly competitive talent markets. But while most executives see a clear need to improve employee engagement, many have yet to develop tangible ways to measure and tackle this goal. Employee engagement is an approach applied to establish an emotional connection between an employee and the organisation, ensuring the employee's commitment towards his or her work, which affects the productivity and overall performance of the organisation. The study was conducted in the hospitality industry, with a popular branded hotel chosen as the sample unit. Both qualitative and quantitative data were collected from respondents. It was found that the employee engagement level of the organisation (hotel) is quite low.
This means that employees are not emotionally connected with the organisation, which may, in turn, affect their performance. It is important to note that in the hospitality industry, an individual employee's performance, specifically in terms of emotional engagement, is critical; a low engagement level may therefore contribute to low organisational performance. This study attempted to identify the employee engagement level, to explore the factors impeding employee engagement, and to explore how employee engagement can be facilitated. In the hospitality industry, where people tend to work for as long as 16 to 18 hours, concepts like employee engagement are essential: employees get tired of their routine jobs, and where job rotation cannot be done, employee engagement acts as a solution. The study was conducted at the Trident Hotel, Udaipur, on a sample of 30 in-house employees from six departments: Accounts and General, Front Office, Food & Beverage Service, Housekeeping, Food & Beverage Production, and Engineering. The research instrument was a questionnaire, and the data collection source was primary. Trident Udaipur is one of the busiest hotels in Udaipur, with a guest occupancy rate of nearly 80%. Due to the high occupancy rate, the hotel's staff remain very busy and occupied in their work all the time, working for their remuneration only. As a result, they have no enthusiasm for their work, nor are they interested in going the extra mile for the organisation. The study results show that working environment factors, including recognition and appreciation, the opinions of the employee, counselling, feedback from superiors, treatment by managers, and respect from the organisation, are capable of increasing the employee engagement level in the hotel.
The above results encouraged us to explore the factors contributing to low employee engagement. It was found that factors such as recognition and appreciation, feedback from supervisors, the opinion of the employee, counselling, and treatment by managers contributed negatively to the employee engagement level; a probable reason is that a number of employees gave negative feedback on these factors. It seems that the structure of the organisation itself is responsible for the low employee engagement. The scope of this study is limited to the Trident Hotel in Udaipur, and its limitation is that the results and findings are based only on the responses of respondents at Trident, Udaipur; the recommendations are therefore applicable to Trident, Udaipur, and not to all similar organisations across the country. The data collected were further analysed and interpreted, and conclusions were drawn. On the basis of the findings, suggestions were provided to the hotel for improvement.
Keywords: human resource, employee engagement, research, study
Procedia PDF Downloads 306

25403 Application of Artificial Neural Network Technique for Diagnosing Asthma
Authors: Azadeh Bashiri
Abstract:
Introduction: Lack of proper diagnosis and inadequate treatment of asthma leads to physical and financial complications. This study aimed to use data mining techniques to create a neural network intelligent system for the diagnosis of asthma. Methods: The study population comprised patients who had visited one of the lung clinics in Tehran. Data were analyzed using SPSS, and Pearson's chi-square coefficient was the basis of decision making for data ranking. The neural network considered was trained using the backpropagation learning technique. Results: According to the SPSS analysis performed to select the top factors, 13 effective factors were selected. Across different runs, the data were mixed in various forms, so different models were made for training and testing the networks; in all modes, the network was able to predict 100% of the cases correctly. Conclusion: Using data mining methods before designing the structure of the system, with the aim of reducing the data dimension and choosing the optimal data, will lead to a more accurate system. Therefore, considering data mining approaches is necessary given the nature of medical data.
Keywords: asthma, data mining, Artificial Neural Network, intelligent system
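The training scheme described (a feed-forward network updated by backpropagation) can be sketched as below. The 13 input features mirror the 13 selected factors, but the synthetic data, layer sizes, and learning rate are illustrative assumptions, not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the clinic data: 13 selected factors, binary asthma label.
X = rng.normal(size=(200, 13))
y = (X @ rng.normal(size=13) > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; weights updated by plain backpropagation of the squared error.
W1 = rng.normal(scale=0.5, size=(13, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1));  b2 = np.zeros(1)
lr, losses = 0.5, []

for _ in range(300):
    H = sigmoid(X @ W1 + b1)                 # forward pass: hidden layer
    P = sigmoid(H @ W2 + b2)                 # forward pass: output probability
    losses.append(float(np.mean((P - y) ** 2)))
    dP = 2 * (P - y) * P * (1 - P) / len(X)  # output-layer error signal
    dH = (dP @ W2.T) * H * (1 - H)           # error propagated back to hidden layer
    W2 -= lr * (H.T @ dP); b2 -= lr * dP.sum(0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(0)

accuracy = float(((P > 0.5) == (y > 0.5)).mean())
print(losses[0], losses[-1], accuracy)
```

On this toy, linearly separable data the training loss falls steadily; the study's reported 100% accuracy would additionally depend on its feature ranking and data-mixing steps.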
Procedia PDF Downloads 273

25402 Interpreting Privacy Harms from a Non-Economic Perspective
Authors: Christopher Muhawe, Masooda Bashir
Abstract:
With increased Internet Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone hand in hand with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms in economic or physical terms negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical perspective negates the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.
Keywords: data breach and misuse, economic harms, privacy harms, psychological harms
Procedia PDF Downloads 195

25401 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of web morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also encompasses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which provide justification for updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. Therefore, it is necessary to have an automatic analysis tool taking into account the word sense and not only the morpho-syntactic category. Moreover, they are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc. 
As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage has been very low for dialectal Arabic, it is important to investigate how dialectal data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that accounting for dialectal variability can help improve analysis.
Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
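The n-gram continuation probabilities mentioned above can be illustrated with a minimal maximum-likelihood bigram estimator (a toy sketch with English placeholder tokens standing in for Arabic segments; not the BP-LSTM-CRF system itself):

```python
from collections import Counter

def bigram_probs(tokens):
    """MLE continuation probabilities P(next | current) from a token sequence."""
    pair_counts = Counter(zip(tokens, tokens[1:]))
    context_counts = Counter(tokens[:-1])  # each token except the last starts a bigram
    return {(a, b): n / context_counts[a] for (a, b), n in pair_counts.items()}

corpus = "the cat sat on the mat the cat ran".split()
probs = bigram_probs(corpus)
print(probs[("the", "cat")])  # "the" is followed by "cat" in 2 of its 3 occurrences
```

A trigram variant conditions on the previous two tokens instead of one; the "arbitrary choice" the abstract refers to is which order of context to condition on and how to smooth unseen continuations.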
Procedia PDF Downloads 41

25400 Exploring the Role of Immune-Modulators in Pathogen Recognition Receptor NOD2 Mediated Protection against Visceral Leishmaniasis
Authors: Junaid Jibran Jawed, Prasanta Saini, Subrata Majumdar
Abstract:
Background: Leishmania donovani infection causes severe host immune suppression through the modulation of pathogen recognition receptors. Apart from TLRs (Toll-Like Receptors), recent studies focus on the important contribution of the NLR (NOD-Like Receptor) family members NOD1 and NOD2, as these receptors are capable of triggering host innate immunity. The aim of this study was to decipher the role of the NOD1/NOD2 receptors during experimental visceral leishmaniasis (VL) and the important link between host failure and the parasite's evasion strategy. Method: The status of the NOD1 and NOD2 receptors was analysed in uninfected and infected cells through western blotting and RT-PCR. The active contribution of these receptors to reducing parasite burden was confirmed by siRNA-mediated silencing and over-expression studies, and parasite numbers were calculated through microscopic examination of Giemsa-stained slides. In-vivo studies were done using non-toxic doses of Mw (Mycobacterium indicus pranii) and Ara-LAM (arabinosylated lipoarabinomannan) along with MDP (muramyl dipeptide) administration. Result: Leishmania donovani infection of macrophages reduced the expression of the NOD2 receptor, whereas NOD1 remained unaffected. Treatment with the NOD2 ligand MDP during over-expression of NOD2 reduced the parasite burden effectively, which was associated with increased pro-inflammatory cytokine generation and NO production. In an experimental mouse model, Ara-LAM treatment increased the expression of NOD2, and in combination with MDP it showed active therapeutic potential against VL, proving more effective than Mw, which had already been reported to be involved in NOD2 modulation.
Conclusion: This work explores the essential contribution of NOD2 during experimental VL and provides a mechanistic understanding of how the Ara-LAM + MDP combination therapy works against this disease, highlighting NOD2 as an essential therapeutic target.
Keywords: Ara-LAM (arabinosylated lipoarabinomannan), NOD2 (nucleotide-binding oligomerization domain 2), MDP (muramyl dipeptide), visceral leishmaniasis
Procedia PDF Downloads 174

25399 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course
Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu
Abstract:
This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. A dataset of 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models using the PyCaret package. For the CTGAN data, the AdaBoost Classifier (ADA) was found to be the best-fitting ML model, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy against the real data. ROC-AUC analysis was performed for all ten classes of the target variable (grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, while the LR model with the Gaussian Copula data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each grade value separately. The LR model with Gaussian Copula data showed consistently better AUC scores than the ADA model with CTGAN data, except for two grade values, C- and A-.
Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, Gaussian Copula CTGAN
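The per-class ROC-AUC analysis can be sketched with a small one-vs-rest helper (an illustrative implementation using the rank definition of AUC; the grade labels and scores below are made up for demonstration, not the paper's data):

```python
def auc(scores, labels):
    """AUC as the probability a random positive outranks a random negative (ties count half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def one_vs_rest_auc(scores_by_class, true_labels):
    """Per-class AUC, treating each grade as the positive class in turn."""
    return {cls: auc(s, [1 if t == cls else 0 for t in true_labels])
            for cls, s in scores_by_class.items()}

print(auc([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1]))  # 0.75

true = ["A", "B", "B", "F"]
scores_by_class = {
    "A": [0.7, 0.2, 0.1, 0.0],
    "B": [0.4, 0.6, 0.2, 0.3],
    "F": [0.1, 0.2, 0.2, 0.7],
}
print(one_vs_rest_auc(scores_by_class, true))
```

A mean over the per-class values gives the single "mean AUC score" the abstract reports; 0.5 is chance level, which puts the 0.4377 CTGAN/ADA result below chance on average.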
Procedia PDF Downloads 43

25398 Fabrication of Highly Stable Low-Density Self-Assembled Monolayers by Thiol-yne Click Reaction
Authors: Leila Safazadeh, Brad Berron
Abstract:
Self-assembled monolayers have tremendous impact in interfacial science, due to the unique opportunity they offer to tailor surface properties. Low-density self-assembled monolayers are an emerging class of monolayers where the environment-interfacing portion of the adsorbate has a greater level of conformational freedom when compared to traditional monolayer chemistries. This greater range of motion and increased spacing between surface-bound molecules offers new opportunities for tailoring adsorption phenomena in sensing systems. In particular, we expect low-density surfaces to offer a unique opportunity to intercalate surface-bound ligands into the secondary structure of proteins and other macromolecules. Additionally, as many conventional sensing surfaces are built upon gold (SPR or QCM), these surfaces must be compatible with gold substrates. Here, we present the first stable method of generating low-density self-assembled monolayer surfaces on gold for the analysis of their interactions with protein targets. Our approach is based on the 2:1 addition of thiol-yne chemistry to develop new classes of y-shaped adsorbates on gold, where the environment-interfacing group is spaced laterally from neighboring chemical groups. This technique involves an initial deposition of a crystalline monolayer of 1,10-decanedithiol on the gold substrate, followed by grafting of a loosely packed monolayer through a photoinitiated thiol-yne reaction in the presence of light. Orthogonality of the thiol-yne chemistry (commonly referred to as click chemistry) allows for the preparation of low-density monolayers with a variety of functional groups. To date, carboxyl-, amine-, alcohol-, and alkyl-terminated monolayers have been prepared using this core technology. Results from surface characterization techniques such as FTIR, contact angle goniometry, and electrochemical impedance spectroscopy confirm the proposed low chain-chain interactions of the environment-interfacing groups.
Reductive desorption measurements suggest higher stability for the click-LDMs compared to traditional SAMs, along with an equivalent packing density at the substrate interface, which confirms the proposed stability of the monolayer-gold interface. In addition, contact angle measurements change in the presence of an applied potential, supporting our description of a surface structure that allows the alkyl chains to freely orient themselves in response to different environments. We are currently studying the differences in protein adsorption phenomena between well-packed and our loosely packed surfaces. This work aims to contribute to biotechnology in the following manner: molecularly imprinted polymers are a promising recognition mode with several advantages over natural antibodies in the recognition of small molecules; however, because of their bulk polymer structure, they are poorly suited for the rapid diffusion desired for the recognition of proteins and other macromolecules. Molecularly imprinted monolayers are an emerging class of materials in which the surface is imprinted and there is no bulk material to impede mass transfer. Further, the short distance between the binding site and the signal transduction material improves many modes of detection. Our project is to develop a new chemistry for protein-imprinted self-assembled monolayers on gold, for incorporation into SPR sensors. Our unique contribution is the spatial imprinting of not only physical cues (seen in current imprinted monolayer techniques) but also complementary chemical cues, accomplished through photo-click grafting of preassembled ligands around a protein template.
Keywords: low-density self-assembled monolayers, thiol-yne click reaction, molecular imprinting
Procedia PDF Downloads 224

25397 Data Access, AI Intensity, and Scale Advantages
Authors: Chuping Lo
Abstract:
This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries, which inherently have greater data resources, tend to have higher incomes than smaller countries, so the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.
Keywords: digital intensity, digital divide, international trade, economies of scale
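The scale argument can be given a stylized numerical illustration (our own toy functional form, income = A·Dᶿ with θ standing in for AI intensity; this is an assumption for exposition, not the paper's model):

```python
def income(data, theta, A=1.0):
    """Stylized income as a function of accessible data D and AI intensity theta (toy assumption)."""
    return A * data ** theta

small, large = 10.0, 1000.0  # data accessible to a small vs. a large country
for theta in (0.2, 0.6):
    # The same 100x data advantage is worth proportionally more when theta is higher.
    print(theta, income(large, theta) / income(small, theta))
```

Under this toy form, the relative income advantage of the data-rich country grows with θ, which matches the abstract's claim that AI-intensive economies benefit more from data aggregation.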
Procedia PDF Downloads 66