Search results for: search algorithms
1036 Modeling of Large Elasto-Plastic Deformations by the Coupled FE-EFGM
Authors: Azher Jameel, Ghulam Ashraf Harmain
Abstract:
In recent years, enriched techniques like the extended finite element method (XFEM), the element free Galerkin method (EFGM), and the coupled finite element-element free Galerkin method have found wide application in modeling different types of discontinuities produced by cracks, contact surfaces, and bi-material interfaces. The extended finite element method faces severe mesh distortion issues when modeling large deformation problems. The element free Galerkin method does not have mesh distortion issues, but it is computationally more demanding than the finite element method. The coupled FE-EFGM proves to be an efficient numerical tool for modeling large deformation problems, as it exploits the advantages of both FEM and EFGM. The present paper employs the coupled FE-EFGM to model large elastoplastic deformations in bi-material engineering components. The large deformation occurring in the domain is modeled using the total Lagrangian approach. The non-linear elastoplastic behavior of the material is represented by the Ramberg-Osgood model. Elastic predictor-plastic corrector algorithms are used for the evaluation of stresses during large deformation. Finally, several numerical problems are solved with the coupled FE-EFGM to illustrate its applicability, efficiency, and accuracy in modeling large elastoplastic deformations in bi-material samples. The results obtained by the proposed technique are compared with those obtained by XFEM and EFGM, and a remarkable agreement was observed between the three techniques.
Keywords: XFEM, EFGM, coupled FE-EFGM, level sets, large deformation
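The elastic predictor-plastic corrector idea mentioned in the abstract can be sketched in a few lines. Below is a minimal one-dimensional return-mapping step with linear isotropic hardening, a deliberate simplification of the Ramberg-Osgood law used in the paper; the material constants are illustrative, not taken from the study.

```python
def return_map(strain_inc, sigma, alpha, E=200e3, H=10e3, sigma_y=250.0):
    """One elastic predictor-plastic corrector step (1D, units of MPa).

    strain_inc : total strain increment for the step
    sigma      : stress at the start of the step
    alpha      : accumulated plastic strain (hardening variable)
    """
    # Elastic predictor: assume the whole increment is elastic.
    sigma_trial = sigma + E * strain_inc
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha)  # yield function
    if f_trial <= 0.0:
        return sigma_trial, alpha                       # purely elastic step
    # Plastic corrector: return the trial stress to the yield surface.
    dgamma = f_trial / (E + H)                          # plastic multiplier
    sign = 1.0 if sigma_trial > 0 else -1.0
    return sigma_trial - E * dgamma * sign, alpha + dgamma
```

After a plastic step the updated stress sits exactly on the updated yield surface, which is the consistency condition the corrector enforces.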
Procedia PDF Downloads 447
1035 A Design of Elliptic Curve Cryptography Processor Based on SM2 over GF(p)
Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang
Abstract:
Data encryption is the foundation of today's communication, and improving the speed of encryption and decryption is a problem scholars continue to work on. In this paper, we propose an elliptic curve crypto processor architecture based on the SM2 prime field. In terms of hardware implementation, we optimized the algorithms at different stages of the structure. For finite field modular operations, we proposed an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shortened the critical path through a pipelined structure in the algorithm implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. For the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-operations to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation took 0.275 ms. The performance for handling scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication
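The scalar multiplication at the heart of such a processor is built from repeated point additions and point doublings. The sketch below shows the textbook double-and-add schedule in affine coordinates over the tiny curve y² = x³ + 2x + 2 over GF(17); the actual design uses the 256-bit SM2 curve, Jacobian coordinates, and parallel scheduling, so this toy curve is only for readability.

```python
# Toy double-and-add scalar multiplication over a small prime field.
P = 17   # field modulus (illustrative, not the SM2 prime)
A = 2    # curve coefficient a in y^2 = x^3 + a*x + b

def ec_add(p1, p2):
    """Add two affine points; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                   # inverse points
    if p1 == p2:                                      # point doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                                             # point addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return x3, (lam * (x1 - x3) - y1) % P

def scalar_mult(k, point):
    """Left-to-right double-and-add: one doubling per bit, an add per 1-bit."""
    result = None
    for bit in bin(k)[2:]:
        result = ec_add(result, result)               # double
        if bit == "1":
            result = ec_add(result, point)            # add
    return result
```

The generator (5, 1) on this curve has order 19, so 19 scalar-multiplications of it return the point at infinity.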
Procedia PDF Downloads 98
1034 Application of the Critical Decision Method for Monitoring and Improving Safety in the Construction Industry
Authors: Juan Carlos Rubio Romero, Francico Salguero Caparros, Virginia Herrera-Pérez
Abstract:
No one is in the slightest doubt about the high levels of risk involved in construction work, and they are even higher in structural construction work. The Critical Decision Method (CDM) is a semi-structured interview technique that uses cognitive tests to identify the different disturbances that workers have to deal with in their work activity. At present, the vision of safety focused on daily performance and on the things that go well in safety and health management is facing the new paradigm known as Resilience Engineering. The aim of this study has been to describe the variability in formwork labour on concrete structures in the construction industry and, from there, to find out the resilient attitude of workers to unexpected events that they have experienced during their working lives. For this purpose, a series of semi-structured interviews were carried out with construction employees with extensive experience in formwork labour in Spain by applying the Critical Decision Method. This work is the first application of the Critical Decision Method in the field of construction and, more specifically, in the execution of structures. The results obtained show that situations categorised as unthought-of are identified to a greater extent than potentially unexpected situations. The identification during these interviews of both expected and unexpected events provides insight into the critical decisions made and actions taken to improve resilience in daily practice in this construction work. From this study, it is clear that it is essential to gain more knowledge about the nature of the human cognitive process in work situations within complex socio-technical systems such as construction sites. This could lead to a more effective design of workplaces in the search for improved human performance.
Keywords: resilience engineering, construction industry, unthought-of situations, critical decision method
Procedia PDF Downloads 148
1033 Second Generation Mozambican Migrant Youth’s Identity and Sense of Belonging: The Case of Hluvukani Village in Bushbuckridge, Mpumalanga
Authors: Betty Chiyangwa
Abstract:
This work-in-progress project explores the complexities surrounding second-generation Mozambican migrant youth’s experiences of constructing their identity and developing a sense of belonging in post-apartheid Bushbuckridge, South Africa. Established in 1884, Bushbuckridge is one of the earliest districts to accommodate Mozambicans who migrated to South Africa in the 1970s. Bushbuckridge as a destination for Mozambican migrants is crucial to their search for social freedom and a space to “belong to”; the action of deliberately seeking freedom is known as an act of agency. Four major objectives govern the paper. The first observes how second-generation Mozambican migrant youth living in South Africa negotiate and construct their own identities. The second explores their narratives regarding their sense of belonging in South Africa. The third seeks to understand how social processes of identity and belonging influence their experiences and future aspirations in South Africa. The last examines how Sen’s Capability Approach is relevant to understanding second-generation Mozambican migrant youth identity and belonging in South Africa. This is a single case study informed by data from semi-structured interviews and narratives with youth between the ages of 18 and 34 born and raised in South Africa to at least one former Mozambican refugee parent living in Bushbuckridge. Drawing on Crenshaw’s intersectionality and Sen’s Capability Approach, this study contributes to the existing body of knowledge on South-South migration by demonstrating how both approaches can be operationalized to understand the complex experiences and capabilities of a disadvantaged group simultaneously. The subject of second-generation migrants is under-researched in South African migration studies; thus, their perspectives have been marginalized in social science research.
Keywords: second-generation, Mozambican, migrant, youth, Bushbuckridge
Procedia PDF Downloads 220
1032 Health Information Technology in Developing Countries: A Structured Literature Review with Reference to the Case of Libya
Authors: Haythem A. Nakkas, Philip J. Scott, Jim S. Briggs
Abstract:
This paper reports a structured literature review of the application of health information technology in developing countries, defined as the World Bank categories of low-income, lower-middle-income, and upper-middle-income countries. The aim was to identify and classify the various applications of health information technology, to assess its current state in developing countries, and to explore potential areas of research. We offer a specific analysis of the application of HIT in Libya as one of the developing countries. Method: A structured literature review was conducted using the following online databases: IEEE, Science Direct, PubMed, and Google Scholar. Publication dates were set to 2000-2013. For the PubMed search, publications in English, French, and Arabic were specified. Using a content analysis approach, 159 papers were analyzed and a total of 26 factors were identified that affect the adoption of health information technology. Results: Of the 2681 retrieved articles, 159 met the inclusion criteria and were carefully analyzed and classified. Conclusion: The implementation of health information technology across developing countries is varied. Whilst it was initially expected that financial constraints would have severely limited health information technology implementation, some developing countries like India have nevertheless dominated the literature and taken the lead in conducting scientific research. Comparing the number of studies to the number of countries in each category, we found that low-income and lower-middle-income countries had more studies carried out than upper-middle-income countries. However, whilst IT has been used in various sectors of the economy, the healthcare sector in developing countries is still failing to benefit fully from the potential advantages that IT can offer.
Keywords: developing countries, developed countries, factors, failure, health information technology, implementation, Libya, success
Procedia PDF Downloads 359
1031 A Comparative Study for Various Techniques Using WEKA for Red Blood Cells Classification
Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy
Abstract:
Red blood cells (RBCs) are the most common type of blood cell and are the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as “anemia”. Abnormalities in RBCs will affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying red blood cells as normal or abnormal (anemic) using WEKA. WEKA is an open-source suite of machine learning algorithms for data mining applications. The algorithms tested are the radial basis function neural network, the support vector machine, and the k-nearest neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell was spherical or non-spherical. The second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the support vector machine, the radial basis function neural network, and the k-nearest neighbors algorithm, respectively.
Keywords: red blood cells, classification, radial basis function neural networks, support vector machine, k-nearest neighbors algorithm
Procedia PDF Downloads 480
1030 A Proteomic Approach for Discovery of Microbial Cellulolytic Enzymes
Authors: M. S. Matlala, I. Ignatious
Abstract:
Environmental sustainability has taken center stage in human life all over the world. Energy is the most essential component of our lives. The conventional sources of energy are non-renewable and have a detrimental environmental impact; therefore, there is a need to move from conventional to non-conventional, renewable energy sources to satisfy the world’s energy demands. The study aimed at screening for microbial cellulolytic enzymes using a proteomic approach. The objectives were to screen for microbial cellulases with high specific activity and to separate the cellulolytic enzymes using a combination of zymography and two-dimensional (2-D) gel electrophoresis, followed by tryptic digestion, matrix-assisted laser desorption ionisation-time of flight (MALDI-TOF) analysis, and bioinformatics analysis. Fungal and bacterial isolates were cultured in M9 minimal and Mandel media for a period of 168 hours at 60°C and 30°C with cellobiose and Avicel as carbon sources. Microbial cells were separated from supernatants through centrifugation, and the crude enzyme from the cultures was used for the determination of cellulase activity, zymography, SDS-PAGE, and two-dimensional gel electrophoresis. Five isolates with lytic action on the carbon sources studied were a bacterial strain (BARK) and fungal strains (VCFF1, VCFF14, VCFF17, and VCFF18). Peak cellulase production by the selected isolates was found to be 3.8 U/ml, 2.09 U/ml, 3.38 U/ml, 3.18 U/ml, and 1.95 U/ml, respectively. Two-dimensional gel protein maps resulted in the separation and quantitative expression of different proteins by the microbial isolates. MALDI-TOF analysis and a database search showed that the proteins expressed in this study closely relate to different glycoside hydrolases produced by other microbial species, with an acceptable confidence level of 100%.
Keywords: cellulases, energy, two-dimensional gel electrophoresis, matrix-assisted laser desorption ionisation-time of flight, MALDI-TOF MS
Procedia PDF Downloads 134
1029 A Novel Methodology for Browser Forensics to Retrieve Searched Keywords from Windows 10 Physical Memory Dump
Authors: Dija Sulekha
Abstract:
Nowadays, a good percentage of reported cybercrimes involve the usage of the Internet, directly or indirectly, for committing the crime. Usually, web browsers leave traces of browsing activities on the host computer’s hard disk, which can be used by investigators to identify internet-based activities of the suspect. But criminals involved in organized crime disable the browser’s file generation features to hide the evidence while carrying out illegal activities through the Internet. In such cases, even though browser files are not generated in the storage media of the system, traces of recent and ongoing activities are generated in the physical memory of the system. As a result, the analysis of a physical memory dump collected from the suspect’s machine retrieves a wealth of forensically crucial information related to the browsing history of the suspect. This information enables cyber forensic investigators to concentrate on a few highly relevant selected artefacts while doing the offline forensic analysis of storage media. This paper addresses the reconstruction of web browsing activities by conducting live forensics to identify searched terms, downloaded files, visited sites, email headers, email ids, etc., from the physical memory dump collected from Windows 10 systems. Well-known entry points are available for retrieving all the above artefacts except searched terms. The paper describes a novel methodology to retrieve the searched terms from Windows 10 physical memory. The searched terms retrieved in this way can be used for advanced file and keyword searches in the storage media files reconstructed from the file system recovery in offline forensics.
Keywords: browser forensics, digital forensics, live forensics, physical memory forensics
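The core of such memory-dump analysis is carving printable strings out of raw bytes and filtering them for browser-related patterns. The sketch below extracts ASCII and UTF-16LE strings (the encoding Windows commonly uses in memory) and harvests query-string fragments; the "?q=" marker is an illustrative heuristic, not the entry point the paper actually identifies.

```python
import re

def find_strings(dump: bytes, min_len: int = 4):
    """Carve printable strings from a raw memory dump, in both plain
    ASCII and UTF-16LE (ASCII characters interleaved with NUL bytes)."""
    ascii_re = re.compile(rb"[ -~]{%d,}" % min_len)
    utf16_re = re.compile(rb"(?:[ -~]\x00){%d,}" % min_len)
    strings = [m.group().decode("ascii") for m in ascii_re.finditer(dump)]
    strings += [m.group().decode("utf-16-le") for m in utf16_re.finditer(dump)]
    return strings

def searched_terms(dump: bytes, marker: str = "?q="):
    """Heuristic: harvest query-string fragments such as '?q=term'."""
    return [s.split(marker, 1)[1].split("&")[0]
            for s in find_strings(dump) if marker in s]
```

A real workflow would run this over a full dump (e.g., acquired with a memory imaging tool) and feed the recovered terms into the offline keyword search described above.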
Procedia PDF Downloads 116
1028 Insight into the Visual Attentional Correlates Underpinning Autistic-Like Traits in Fragile X and Down Syndrome
Authors: Jennifer M. Glennon, Hana D'Souza, Luke Mason, Annette Karmiloff-Smith, Michael S. C. Thomas
Abstract:
Genetic syndrome groups that feature high rates of autism comorbidity, like Down syndrome (DS) and fragile X syndrome (FXS), have been presented as useful models for understanding risk and protective factors involved in the emergence of autistic traits. Yet despite reaching clinical thresholds, these ‘syndromic’ forms of autism appear to differ in important ways from the idiopathic or ‘non-syndromic’ autism phenotype. To uncover the true nature of these comorbidities, it is necessary to extend definitions of autism to include the cognitive characteristics of the disorder and to then apply this broadened conceptualisation to the study of syndromic autism profiles. The current study employs a variety of well-established eye-tracking paradigms to assess visual attentional performance in children with DS and FXS who reach thresholds for autism on the Social Communication Questionnaire. It investigates whether autism profiles in these children are accompanied by visual orienting difficulties (‘sticky attention’), decreased social attention, and enhanced visual search performance, all of which are characteristic of the idiopathic autism phenotype. Data is collected from children with DS and FXS aged between 6 and 10 years, in addition to two control groups matched on age and intellectual ability (i.e., children with idiopathic autism and neurotypical controls). Cross-sectional developmental trajectory analyses are conducted to enable visuo-attentional profile comparisons. Significant differences in the visuo-attentional processes underpinning autism presentations in children with FXS and DS are hypothesised, supporting notions of syndrome specificity. The study provides insight into the complex heterogeneity associated with syndromic autism presentations and autism per se, with clinical implications for the utility of autism intervention programmes in DS and FXS populations.
Keywords: autism, Down syndrome, fragile X syndrome, eye tracking
Procedia PDF Downloads 239
1027 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
Detecting email spam is an important task in the era of digital technology, one that needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable, and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific classifications of emails to help users understand the model’s decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that will improve users’ understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian naive Bayes, multinomial naive Bayes, Bernoulli naive Bayes, the support vector classifier, k-nearest neighbors, decision tree, and logistic regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.
Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression
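The LIME idea of explaining one prediction by perturbing the input can be illustrated without any learning machinery. The sketch below scores an email with a toy word-weight classifier and attributes the spam score to each token by occluding it in turn; the weights are invented for illustration, and the paper's actual pipeline uses the LIME package on trained classifiers rather than this hand-built surrogate.

```python
# Toy word-weight "classifier": positive weights push towards spam.
# These weights are illustrative assumptions, not learned values.
SPAM_WEIGHTS = {"free": 2.0, "winner": 1.5, "click": 1.0, "meeting": -1.5}

def spam_score(tokens):
    """Sum of the per-token spam weights (unknown tokens score 0)."""
    return sum(SPAM_WEIGHTS.get(t, 0.0) for t in tokens)

def explain(email: str):
    """Occlusion-style local explanation: re-score the email with each
    token removed; the score drop is that token's local importance
    for the 'spam' decision. Returns tokens sorted by importance."""
    tokens = email.lower().split()
    base = spam_score(tokens)
    importance = {}
    for i, tok in enumerate(tokens):
        perturbed = tokens[:i] + tokens[i + 1:]   # remove one token
        importance[tok] = base - spam_score(perturbed)
    return sorted(importance.items(), key=lambda kv: -kv[1])
```

Real LIME fits a sparse linear model over many random perturbations rather than single-token occlusions, but the attribution it produces has the same shape: a ranked list of influential terms.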
Procedia PDF Downloads 47
1026 In Search for the 'Bilingual Advantage' in Immersion Education
Authors: M. E. Joret, F. Germeys, P. Van de Craen
Abstract:
Background: Previous studies have shown that ‘full’ bilingualism seems to enhance the executive functions in children, young adults, and elderly people. Executive functions refer to a complex cognitive system responsible for self-controlled and planned behavior and seem to predict academic achievement. The present study aimed at investigating whether similar effects could be found in children learning their second language at school in immersion education programs. Methods: In this study, 44 children involved in immersion education for 4 to 5 years were compared to 48 children in traditional schools. All children were between 9 and 11 years old. Executive functions were assessed with the Simon task, a neuropsychological measure based on reaction times and accuracy on congruent and incongruent trials. To control for background measures, all children underwent Raven’s Coloured Progressive Matrices, measuring non-verbal intelligence, and the Echelle de Vocabulaire en Images Peabody (EVIP), assessing verbal intelligence. In addition, a questionnaire was given to the parents to control for other confounding variables, such as socio-economic status (SES), home language, developmental disorders, etc. Results: There were no differences between groups in non-verbal or verbal intelligence. Furthermore, the immersion learners showed overall faster reaction times on both congruent and incongruent trials compared to the traditional learners, but only after 5 years of training, not before. Conclusion: These results show that the cognitive benefits found in ‘full’ bilinguals also appear in children involved in immersion education, but only after sufficient exposure to the second language. Our results suggest that the amount of second language training needs to be sufficient before these cognitive effects emerge.
Keywords: bilingualism, executive functions, immersion education, Simon task
Procedia PDF Downloads 441
1025 Determinants of Child Malnutrition in Sub-Saharan Africa
Authors: Habtamu Fufa, Yemane Berhane
Abstract:
Child undernutrition has long-term consequences for intellectual ability, economic productivity, reproductive performance, and susceptibility to metabolic and cardiovascular disease. The unacceptably high prevalence of malnutrition in young children of the region has not changed much over the last decades, which could make the achievement of the corresponding Millennium Development Goals very unlikely. Despite the well-documented problems of child malnutrition in Sub-Saharan Africa, there are few systematic reviews of the evidence on its determinants. The currently available evidence on determinants of child undernutrition in Sub-Saharan Africa is systematically reviewed here. Relevant literature was searched in the biomedical databases PubMed and Google Scholar and on the website of the World Health Organization, using the following key words: "determinants", "child malnutrition", and "Sub-Saharan Africa". The search was limited to articles published in or after 1995. In all the reviewed articles, the data were analyzed using multivariate regression analysis and/or odds ratios to assess the significance of determinants of child malnutrition. A synthesis of 40 published articles from various countries of the region showed that household economic status, maternal education, disease, breastfeeding practices, age and sex of the child, birth interval, and residential area were determinants of child undernutrition. Poverty remains the main factor behind malnutrition in Sub-Saharan Africa, and poor parental education aggravates malnutrition through the perpetuation of poor nutrition practices. Male children under five years are the most affected. Understanding these determinants of poor nutritional attainment would provide insights for designing interventions to reduce the high levels of child malnutrition in this region. Large-scale, multi-sectoral, community-based interventions are urgently needed for a sustainable improvement of child nutritional and health status in Sub-Saharan Africa.
Keywords: child malnutrition, determinants, Sub-Saharan Africa, health status
Procedia PDF Downloads 479
1024 Socio-Cultural Representations through Lived Religions in Dalrymple’s Nine Lives
Authors: Suman
Abstract:
In the continuous interaction between the past and the present that historiography is, each time history gets re/written, a new representation emerges. This new representation is a reflection of the earlier archives and their interpretations, fragmented remembrances of the past, as well as reactions to the present. Memory, or the lack thereof, and stereotyping generally play a major role in this representation. William Dalrymple’s Nine Lives: In Search of the Sacred in Modern India (2009) is one such written account that sets out to narrate the representations of religion and culture in India and contemporary reactions to them. Dalrymple’s nine saints belong to different castes, sects, religions, and regions. By dealing with their religions and expressions of those religions, and through the lived mysticism of these nine individuals, the book engages with some important issues like class, caste, and gender in the contexts provided by historical as well as present-day India. The paper studies the development of religion and the accompanying feeling of religiosity in modern as well as historical contexts through a study of these elements in the book. Since the language used in the creation of texts, and the literary texts thus produced, create a new reality that questions the stereotypes of the past, and in turn often end up creating new stereotypes or stereotypical representations, the paper seeks to actively engage with the text in order to identify and study such stereotypes, along with their changing representations. Through a detailed examination of the book, the paper seeks to unravel whether certain socio-cultural stereotypes existed earlier, and whether new stereotypes develop from Dalrymple’s point of view as an outsider writing on issues that are deeply rooted in the cultural milieu of the country. For this analysis, the paper draws on psycho-literary theories of stereotyping and representation.
Keywords: stereotyping, representation, William Dalrymple, religion
Procedia PDF Downloads 310
1023 A Protein-Wave Alignment Tool for Frequency Related Homologies Identification in Polypeptide Sequences
Authors: Victor Prevost, Solene Landerneau, Michel Duhamel, Joel Sternheimer, Olivier Gallet, Pedro Ferrandiz, Marwa Mokni
Abstract:
The search for homologous proteins is one of the ongoing challenges in biology and bioinformatics. Traditionally, a pair of proteins is thought to be homologous when they originate from the same ancestral protein, in which case their sequences share similarities, and considerable scientific research effort is spent investigating this question. On this basis, we propose the Protein-Wave Alignment Tool (”P-WAT”), developed within the framework of the France Relance 2030 plan. Our work takes into consideration the mass-related wave aspect of protein biosynthesis by associating a specific frequency with each amino acid according to its mass. Amino acids are then regrouped within their mass category. This way, our algorithm produces specific alignments in addition to those obtained with a common amino acid coding system. For this purpose, we develop the original ”P-WAT” algorithm, able to address large protein databases with different attributes such as species, protein names, etc., which allows us to align user requests with a set of specific protein sequences. The primary intent of this algorithm is to achieve efficient alignments, in this specific conceptual frame, by minimizing execution costs and information loss. Our algorithm identifies sequence similarities by searching for matches of sub-sequences of different sizes, referred to as primers. It relies on Boolean operations on a dot-plot matrix to identify primer amino acids common to both proteins that are likely to be part of a significant alignment of peptides. From those primers, dynamic programming-like traceback operations generate alignments and alignment scores based on an adjusted PAM250 matrix.
Keywords: protein, alignment, homologous, Genodic
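The primer search on a dot-plot matrix can be sketched in a few lines. Below, amino acids are regrouped into three coarse mass classes (an illustrative assumption; the actual tool derives frequencies from individual residue masses) and length-k primers common to two sequences are reported as dot-plot hits.

```python
# Illustrative regrouping of the 20 amino acids into coarse mass classes.
# The class boundaries here are assumptions for the sketch, not P-WAT's
# actual frequency coding.
MASS_CLASS = {aa: "S" for aa in "GASPVTC"}          # smaller residues
MASS_CLASS.update({aa: "M" for aa in "LINDQKEMH"})  # medium residues
MASS_CLASS.update({aa: "L" for aa in "FRYW"})       # larger residues

def encode(seq: str) -> str:
    """Recode a protein sequence into its mass-class alphabet."""
    return "".join(MASS_CLASS.get(aa, "?") for aa in seq)

def shared_primers(seq_a: str, seq_b: str, k: int = 3):
    """Return (i, j, primer) dot-plot hits: positions where a length-k
    sub-sequence of seq_a matches seq_b under the mass-class coding."""
    coded_a, coded_b = encode(seq_a), encode(seq_b)
    hits = []
    for i in range(len(coded_a) - k + 1):
        word = coded_a[i:i + k]
        j = coded_b.find(word)
        while j != -1:                    # all occurrences in seq_b
            hits.append((i, j, seq_a[i:i + k]))
            j = coded_b.find(word, j + 1)
    return hits
```

In the full tool, such hits would seed dynamic programming-like traceback against the adjusted PAM250 matrix; here they simply mark candidate diagonals in the dot plot.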
Procedia PDF Downloads 113
1022 Stochastic Multicast Routing Protocol for Flying Ad-Hoc Networks
Authors: Hyunsun Lee, Yi Zhu
Abstract:
A wireless ad-hoc network is a decentralized type of temporary machine-to-machine connection that is spontaneous or impromptu, so that it does not rely on any fixed infrastructure or centralized administration. Unmanned aerial vehicles (UAVs), also called drones, have recently become more accessible and widely utilized in military and civilian domains such as surveillance, search and detection missions, traffic monitoring, remote filming, and product delivery, to name a few. Communication between these UAVs becomes possible and is materialized through Flying Ad-hoc Networks (FANETs). However, because the high mobility of UAVs may cause different types of transmission interference, it is vital to design robust routing protocols for FANETs. In this talk, a multicast routing method based on a modified stochastic branching process is proposed. The stochastic branching process is often used to describe an early stage of an infectious disease outbreak, and the reproductive number in the process is used to classify the outbreak as major or minor. The reproductive number regulating the local transmission rate is adapted and modified for flying ad-hoc network communication. The performance of the proposed routing method is compared with other well-known methods, such as the flooding method and the gossip method, based on three measures: average reachability, average node usage, and average branching factor. The proposed routing method achieves average reachability very close to the flooding method, average node usage close to the gossip method, and an outstanding average branching factor among the methods. It can be concluded that the proposed multicast routing scheme is more efficient than well-known routing schemes such as flooding and gossip, while maintaining high performance.
Keywords: flying ad-hoc networks, multicast routing, stochastic branching process, unmanned aerial vehicles
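The branching-process view of multicast can be illustrated with a small simulation: each node that receives the message forwards it to a random number of targets whose mean plays the role of the reproductive number r0, and reachability and transmission count fall out of the run. The uniform-random topology and the parameters are illustrative assumptions, not the paper's FANET model.

```python
import random

def branching_multicast(n_nodes: int, r0: float = 2.0, seed: int = 1):
    """Simulate multicast spread as a branching process: every node in
    the current frontier forwards the message to a random number of
    uniformly chosen nodes, with mean roughly r0 (the adapted
    reproductive number). Returns (reachability, transmissions)."""
    rng = random.Random(seed)
    reached, frontier, transmissions = {0}, [0], 0
    while frontier:
        nxt = []
        for _ in frontier:
            copies = rng.randint(0, int(2 * r0))   # branching factor, mean ~ r0
            for _ in range(copies):
                transmissions += 1
                target = rng.randrange(n_nodes)
                if target not in reached:          # new node covered
                    reached.add(target)
                    nxt.append(target)
        frontier = nxt                             # next generation
    return len(reached) / n_nodes, transmissions
```

Comparing runs at different r0 values against a flooding baseline (every node forwards to all nodes once) reproduces, in miniature, the reachability-versus-node-usage trade-off the abstract describes.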
Procedia PDF Downloads 123
1021 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms
Authors: Francisco M. Silva
Abstract:
Technology is evolving, creating an impact on our everyday lives and the telehealth industry. Telehealth encapsulates the provision of healthcare services and information via a technological approach. There are several benefits of using web-based methods to provide healthcare help; nonetheless, few health and psychological help approaches combine this method with wearable sensors. This paper aims to create an online platform for users to receive self-care help and information using wearable sensors. In addition, researchers developing a similar project obtain a solid foundation as a reference. This study provides descriptions and analyses of the software and hardware architecture. It exhibits and explains a dynamic and efficient heart rate algorithm that continuously calculates the desired sensor values. It presents diagrams that illustrate the website deployment process and the web server’s means of handling the sensors’ data. The goal is to create a working project using Arduino-compatible hardware. Heart rate sensors send their data values to an online platform. A microcontroller board uses an algorithm to calculate the sensor heart rate values and outputs them to a web server. The platform visualizes the sensor data, summarizes it in a report, and creates alerts for the user. Results showed a solid project structure and communication between the hardware and software. The web server displays the conveyed heart rate sensor data on the online platform, presenting observations and evaluations.
Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare
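The continuous BPM calculation described above can be sketched as a small rolling-average routine: the sensor reports inter-beat intervals (IBIs), implausible readings are rejected, and BPM is derived from the mean of a sliding window. The window size and plausibility bounds below are assumptions for the sketch, not the paper's values, and the firmware itself would be Arduino C++ rather than Python.

```python
from collections import deque

class HeartRateMonitor:
    """Rolling-average BPM from inter-beat intervals (IBIs) in ms."""

    def __init__(self, window: int = 5):
        self.intervals = deque(maxlen=window)   # most recent IBIs

    def on_beat(self, ibi_ms: float) -> None:
        """Called whenever the sensor detects a beat; reject IBIs
        outside a plausible 30-240 BPM range (2000-250 ms)."""
        if 250 <= ibi_ms <= 2000:
            self.intervals.append(ibi_ms)

    @property
    def bpm(self):
        """Current smoothed BPM, or None before the first valid beat."""
        if not self.intervals:
            return None
        avg = sum(self.intervals) / len(self.intervals)
        return 60000.0 / avg                    # ms per minute / mean IBI
```

On the platform side, each computed BPM value would be posted to the web server for visualization, reporting, and alerting as described above.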
Procedia PDF Downloads 1261020 Pilot-free Image Transmission System of Joint Source Channel Based on Multi-Level Semantic Information
Authors: Linyu Wang, Liguo Qiao, Jianhong Xiang, Hao Xu
Abstract:
In semantic communication, existing pilot-free joint source-channel coding (JSCC) wireless communication systems have unstable transmission performance and cannot effectively capture the global and positional information of images. In this paper, a pilot-free image transmission system of joint source-channel coding based on multi-level semantic information (multi-level JSCC) is proposed. The transmitter of the system is composed of two networks: a feature extraction network that extracts the high-level semantic features of the image, compressing the transmitted information and improving bandwidth utilization, and a feature retention network that preserves low-level semantic features and image details to improve communication quality. The receiver is likewise composed of two networks: the received high-level semantic features are fused, in the same dimension, with the low-level semantic features after a feature enhancement network, and the image dimensions are then restored through a feature recovery network, so that image location information is effectively used for reconstruction. This paper verifies that the proposed multi-level JSCC algorithm can effectively transmit and recover image information in both AWGN and Rayleigh fading channels, and the peak signal-to-noise ratio (PSNR) is improved by 1-2 dB compared with other algorithms under the same simulation conditions.Keywords: deep learning, JSCC, pilot-free picture transmission, multilevel semantic information, robustness
Procedia PDF Downloads 1201019 An Inquiry about Perception of Autonomous Academe and Accountable Leadership on University Governance: A Case of Bangladesh
Authors: Monjur E-Khoda Tarafdar
Abstract:
Institutional autonomy and academic freedom, corresponding to accountability, are seen as core concepts of university governance. Universities are crucial actors in the search for truth, producing and disseminating knowledge, and academic leaders are the pivots who progressively influence university governance. Therefore, within the continuing debate about autonomy and accountability, academic leadership has been studied through the lens of perception. The researcher's longstanding acquaintance with the field was instrumental in gaining the lived experiences of the informants in this qualitative study. Case studies are useful for understanding the complexities of a particular site while preserving a sense of the wholeness of the site being investigated. A multiple case study approach was therefore employed, with a sample size of seventy-one; such a large number of informants was interviewed in order to capture the wide range of views that exist in Bangladesh. This qualitative multiple case study engaged in-depth interviews with academic leaders and policy makers of three case universities. Open-ended semi-structured questionnaires were used to develop a comprehensive understanding of how academic leaders' perceptions of autonomy and accountability have affected university governance in the context of Bangladesh. The paper interprets the voices of the informants and distinguishes both transformational and transactional styles of academic leadership in local university settings against the globally changed higher education demography. The study finds contextual dissimilarity in perspectives on the autonomy and accountability of academic leadership towards university governance: unaccountability results in a loss of autonomous power and a collapse of academic excellence. Since accountability fosters competitiveness and competence, the paper also examines how academic leaders abuse the premise of academic loyalty to universities.Keywords: academic loyalty, accountability, autonomy, leadership, perception, university governance
Procedia PDF Downloads 3151018 Climate Change and Its Impacts: The Case of Coastal Fishing Communities of the Meghna River in South-Central Bangladesh
Authors: Md. Royhanur Islam, Thomas Cansse, Md. Sahidul Islam, Atiqur Rahman Sunny
Abstract:
The geographical location of Bangladesh makes it one of the countries most vulnerable to climate change. Climate-induced phenomena mainly affect the south-central region of Bangladesh (Laxmipur district), where they have begun to occur more frequently. The aim of the study was to identify the hydro-climatic factors that lead to weather-related disasters in the coastal areas and to analyse the consequences of these factors on coastal livelihoods, along with possible adaptation options, using participatory rural appraisal (PRA) tools. The present study identified several disasters, such as land erosion, depressions and cyclones, coastal flooding, storm surge, and heavy precipitation, occurring at a noticeable frequency. Surveys also found that land erosion is ongoing: tidal water is being introduced directly into the mainland, and as a result of salt intrusion, production capacity is declining. The coastal belt is an important area for fishing activities, but due to changed fishing times and a lack of alternative income generating activities (AIGAs), people have been forced to search for alternative livelihood options, taking both short-term and long-term adaptation measures. Therefore, in order to increase awareness and minimize the losses, vulnerable communities must be fully incorporated into disaster response strategies. The government, as well as national and international donor organizations, should come forward to resolve the present situation of these vulnerable groups, who will otherwise endure endless and miserable suffering from the effects of climate change ahead in their lives.Keywords: adaptation, community, fishery development, livelihood
Procedia PDF Downloads 1211017 Symmetric Key Encryption Algorithm Using Indian Traditional Musical Scale for Information Security
Authors: Aishwarya Talapuru, Sri Silpa Padmanabhuni, B. Jyoshna
Abstract:
Cryptography helps prevent threats to information security by providing various algorithms. This study introduces a new symmetric key encryption algorithm for information security that is linked with "raagas", the scales and patterns of notes of Indian traditional music. The algorithm takes the plain text as input and begins its encryption process by randomly selecting a raaga from a list of raagas assumed to be shared by both sender and receiver. The plain text is associated with the selected raaga, and an intermediate cipher text is formed as the algorithm converts the plain text characters into other characters according to its rules. This intermediate cipher text is then arranged in various patterns over three rounds of encryption. The total number of rounds in the algorithm is a multiple of 3: the output of each sequence of three rounds is passed again as input to the same sequence, recursively, until all rounds of encryption have been performed. The raaga selected by the algorithm and the number of rounds performed are specified at an arbitrary location in the key, together with other important information about the rounds of encryption embedded in the key, known to the sender and interpreted only by the receiver, thereby making the algorithm resistant to attack. The key can be constructed of any number of bits, with no restriction on size. A software application was also developed to demonstrate this encryption process: it dynamically takes the plain text as input and readily generates the cipher text as output. The algorithm therefore stands as a strong tool for information security.Keywords: cipher text, cryptography, plaintext, raaga
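As a toy illustration of the idea, the substitution step can be driven by a note-offset pattern standing in for a raaga, followed by a per-round rearrangement. The patterns and the reversal used here are hypothetical stand-ins, not the paper's actual rules:

```python
# Hypothetical note-offset patterns standing in for raagas; the real
# algorithm derives its substitution rules from the shared raaga list.
RAAGAS = {
    "bilawal": [2, 2, 1, 2, 2, 2, 1],
    "bhairav": [1, 3, 1, 2, 1, 3, 1],
}

def raaga_encrypt(plain, raaga, rounds=3):
    offsets = RAAGAS[raaga]
    text = plain
    for _ in range(rounds):
        # substitution: shift each character by the cycling note offset
        text = "".join(chr((ord(c) + offsets[i % len(offsets)]) % 256)
                       for i, c in enumerate(text))
        # rearrangement: a simple reversal stands in for the round pattern
        text = text[::-1]
    return text

def raaga_decrypt(cipher, raaga, rounds=3):
    offsets = RAAGAS[raaga]
    text = cipher
    for _ in range(rounds):
        text = text[::-1]
        text = "".join(chr((ord(c) - offsets[i % len(offsets)]) % 256)
                       for i, c in enumerate(text))
    return text
```

Because each round is a shift followed by a rearrangement, decryption simply undoes the two steps in reverse order for the same number of rounds.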
Procedia PDF Downloads 2891016 Evaluation of Machine Learning Algorithms and Ensemble Methods for Prediction of Students’ Graduation
Authors: Soha A. Bahanshal, Vaibhav Verdhan, Bayong Kim
Abstract:
Graduation rates at six-year colleges are becoming an essential indicator for incoming students and for university rankings. Predicting student graduation is extremely beneficial to schools and has huge potential for targeted intervention; it is important for educational institutions because it enables the development of strategic plans that help students achieve their degrees on time (graduate on time, GOT). Machine learning techniques offer a first step and a helping hand in extracting useful information from such data and gaining insights into the prediction of students' progress and performance. Data analysis and visualization techniques are applied to understand and interpret the data, which cover science-major students who graduated within six years as of the academic year 2017-2018; the analysis can be used to predict the graduation of students in subsequent academic years. Several predictive models, namely logistic regression, decision trees, support vector machines, random forest, naive Bayes, and KNeighborsClassifier, are applied to predict whether a student will graduate. The classifiers were evaluated with 5-fold cross-validation, and their performance was compared on accuracy. The results indicated that an ensemble classifier achieves the best accuracy, about 91.12%. This GOT prediction model would hopefully be useful to university administration and academics in developing measures for assisting and boosting students' academic performance and ensuring they graduate on time.Keywords: prediction, decision trees, machine learning, support vector machine, ensemble model, student graduation, GOT graduate on time
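The ensemble step can be sketched as a majority vote over individual classifiers. The threshold rules below are invented stand-ins for the trained models named in the abstract, and the feature names are assumptions:

```python
def majority_vote(classifiers, student):
    """Predict 1 (graduates on time) when more than half of the base
    classifiers say so: the combination rule behind simple voting
    ensembles."""
    votes = sum(clf(student) for clf in classifiers)
    return 1 if votes * 2 > len(classifiers) else 0

# Hypothetical base classifiers: threshold rules on assumed features,
# standing in for the fitted models (logistic regression, trees, ...).
def by_gpa(s):
    return 1 if s["gpa"] >= 2.5 else 0

def by_credits(s):
    return 1 if s["credits_per_term"] >= 12 else 0

def by_failures(s):
    return 1 if s["failed_courses"] <= 2 else 0
```

In practice each base model would be trained and evaluated with 5-fold cross-validation before being combined, which is where the reported 91.12% ensemble accuracy comes from.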
Procedia PDF Downloads 721015 DLtrace: Toward Understanding and Testing Deep Learning Information Flow in Deep Learning-Based Android Apps
Authors: Jie Zhang, Qianyu Guo, Tieyi Zhang, Zhiyong Feng, Xiaohong Li
Abstract:
With the widespread popularity of mobile devices and the development of artificial intelligence (AI), deep learning (DL) has been extensively applied in Android apps. Compared with traditional Android apps (traditional apps), deep learning-based Android apps (DL-based apps) need more third-party application programming interfaces (APIs) to complete complex DL inference tasks. However, existing methods for detecting sensitive information leakage in Android apps (e.g., FlowDroid) cannot be used directly on DL-based apps because they have difficulty detecting third-party APIs. To solve this problem, we design DLtrace, a new static information flow analysis tool that can effectively recognize third-party APIs. With our proposed trace and detection algorithms, DLtrace can also efficiently detect privacy leaks caused by sensitive APIs in DL-based apps. Moreover, using DLtrace, we summarize the non-sequential characteristics of DL inference tasks in DL-based apps and the specific functionalities DL models provide for such apps. We propose two formal definitions to deal with the common polymorphism and anonymous inner-class problems in Android static analyzers. We conducted an empirical assessment with DLtrace on 208 popular DL-based apps in the wild and found that 26.0% of the apps suffered from sensitive information leakage. Furthermore, DLtrace performs more robustly than FlowDroid in detecting and identifying third-party APIs. The experimental results demonstrate that DLtrace extends FlowDroid in understanding DL-based apps and detecting security issues therein.Keywords: mobile computing, deep learning apps, sensitive information, static analysis
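The core of such a trace algorithm is forward propagation from sensitive sources to network sinks over an inter-procedural flow graph. A bare-bones sketch, with hypothetical method names and none of DLtrace's actual handling of polymorphism or anonymous inner classes:

```python
def trace_leaks(flows, sources, sinks):
    """Toy forward taint propagation: 'flows' maps a method to the
    methods it passes data to; report every sink reachable from a
    sensitive source as a (source, sink) leak pair."""
    leaks = set()
    for src in sources:
        stack, seen = [src], {src}
        while stack:
            m = stack.pop()
            if m in sinks:
                leaks.add((src, m))
            for nxt in flows.get(m, []):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
    return leaks
```

The difficulty the abstract points to is building `flows` in the first place when data passes through third-party DL inference APIs that a traditional analyzer does not model.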
Procedia PDF Downloads 1791014 Alpha Lipoic Acid: An Antioxidant for Infertility
Authors: Chiara Di Tucci, Giulia Galati, Giulia Mattei, Valentina Bonanni, Oriana Capri, Renzo D'Amelio, Ludovico Muzii, Pierluigi Benedetti Panici
Abstract:
Objective: Infertility is an increasingly frequent health condition, which may depend on female or male factors. Oxidative stress (OS), resulting from a disrupted balance between reactive oxygen species (ROS) and protective antioxidants, affects the reproductive lifespan of men and women. In this review, we examine whether alpha lipoic acid (ALA), among the oral supplements currently in use, has an evidence-based beneficial role in the context of female and male infertility. Methods: We searched the English literature in the PubMed database with the following keywords: 'female infertility', 'male infertility', 'semen', 'sperm', 'sub-fertile man', 'alpha-lipoic acid', 'alpha lipoic acid', 'lipoic acid', 'endometriosis', 'chronic pelvic pain', 'follicular fluid', and 'oocytes'. We included clinical trials, multicentric studies, and reviews. The total number of references found after automatically and manually excluding duplicates was 180. After primary and secondary screening, 28 articles were selected. Results: The available literature demonstrates positive effects of ALA on multiple processes, from oocyte maturation (0.87 ± 0.9% of oocytes in MII vs. 0.81 ± 3.9%; p < .05) to fertilization, embryo development (57.7% vs. 75.7% grade 1 embryos; p < .05), and reproductive outcomes. Its regular administration in both sub-fertile women and men has been shown to reduce pelvic pain in endometriosis (p < .05), regularize menstrual flow and metabolic disorders (p < .01), and improve sperm quality (p < .001). Conclusions: ALA represents a promising new molecule in the field of couple infertility. More clinical studies are needed in order to support its use in clinical practice.Keywords: alpha lipoic acid, endometriosis, infertility, male factor, polycystic ovary syndrome
Procedia PDF Downloads 861013 A Machine Learning-Based Model to Screen Antituberculosis Compound Targeted against LprG Lipoprotein of Mycobacterium tuberculosis
Authors: Syed Asif Hassan, Syed Atif Hassan
Abstract:
Multidrug-resistant tuberculosis (MDR-TB) is an infection caused by resistant strains of Mycobacterium tuberculosis (MTB) that respond to neither isoniazid nor rifampicin, the most important anti-TB drugs. The increasing occurrence of drug-resistant strains of MTB calls for an intensive search for novel target-based therapeutics. In this context, LprG (Rv1411c), a lipoprotein from MTB, plays a pivotal role in the immune evasion that allows the bacterium to survive and propagate within the host cell. Therefore, a machine learning method will be developed to generate a computational model that can predict the potential anti-LprG activity of novel antituberculosis compounds. The present study will utilize a dataset from the PubChem database maintained by the National Center for Biotechnology Information (NCBI). The dataset comprises compounds screened against MTB, categorized as active or inactive based on the PubChem activity score. PowerMV, a molecular descriptor generator and visualization tool, will be used to generate 2D molecular descriptors for the active and inactive compounds in the dataset. These 2D molecular descriptors will be used as features and fed into three different classifiers, namely a random forest, a deep neural network, and a recurrent neural network, to build separate predictive models, with the best-performing model chosen by its accuracy in predicting novel antituberculosis compounds with anti-LprG activity. Additionally, predicted active compounds will be screened with a SMARTS filter to choose molecules with drug-like features.Keywords: antituberculosis drug, classifier, machine learning, molecular descriptors, prediction
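A descriptor-based screen of the kind outlined here can be illustrated with a similarity search over binary fingerprints. This nearest-neighbour stand-in is far simpler than the random forest and neural networks the study proposes, and the fingerprints below are invented for the example:

```python
def tanimoto(a, b):
    """Tanimoto similarity between two binary fingerprint vectors:
    shared on-bits divided by the union of on-bits."""
    both = sum(x & y for x, y in zip(a, b))
    either = sum(x | y for x, y in zip(a, b))
    return both / either if either else 0.0

def predict_activity(query, labelled):
    """1-nearest-neighbour screen: label the query compound with the
    activity of the most similar labelled fingerprint."""
    best = max(labelled, key=lambda item: tanimoto(query, item[0]))
    return best[1]
```

In the study's workflow the PowerMV descriptors would play the role of these fingerprints, with the classifiers learning the activity boundary rather than relying on raw similarity.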
Procedia PDF Downloads 3911012 The Effect of Pregabalin on Postoperative Pain after Anterior Cruciate Ligament Reconstruction: A Systematic Review of Randomized Clinical Trials
Authors: Emad Kouhestani
Abstract:
Background: Despite the enormous success of anterior cruciate ligament (ACL) reconstruction, acute neuropathic pain can develop postoperatively and is both distressing and difficult to treat once established. Pregabalin, an anticonvulsant agent that selectively affects the nociceptive process, has been used as a pain relief agent. The purpose of this systematic review of randomized controlled trials (RCTs) was to evaluate the pain control effect of pregabalin versus placebo after ACL reconstruction. Method: The literature was searched from inception to June 2022 using PubMed, Scopus, Google Scholar, Web of Science, Cochrane, and EBSCO. Studies considered for inclusion were RCTs that reported relevant outcomes (postoperative pain scores, cumulative opioid consumption, or adverse events) following the administration of pregabalin in patients undergoing ACL reconstruction. Results: Five placebo-controlled RCTs involving 272 participants met the inclusion criteria. The included trials used oral pregabalin doses of 75 mg and 150 mg. Two studies used a single dose of pregabalin one hour before anesthesia induction; two used pregabalin one hour before induction and 12 hours after; and one used daily pregabalin for 7 days before and 7 days after surgery. Of the five trials, three found significantly lower pain intensity and cumulative opioid consumption in the pregabalin group than in the placebo group; a decrease in pain scores, however, was found in all trials. Pregabalin administration was associated with dizziness and nausea. Conclusion: Pregabalin may be a valuable asset in pain management after ACL reconstruction, but future studies with larger sample sizes and longer follow-up periods are required.Keywords: pregabalin, anterior cruciate ligament, postoperative pain, clinical trial
Procedia PDF Downloads 931011 Performance Evaluation and Comparison between the Empirical Mode Decomposition, Wavelet Analysis, and Singular Spectrum Analysis Applied to the Time Series Analysis in Atmospheric Science
Authors: Olivier Delage, Hassan Bencherif, Alain Bourdier
Abstract:
Signal decomposition approaches represent an important step in time series analysis, providing useful knowledge of and insight into the data and the characteristics of the underlying dynamics, while also facilitating tasks such as noise removal and feature extraction. Because most observational time series are nonlinear and nonstationary, resulting from the interaction of several physical processes at different time scales, experimental time series fluctuate at all time scales and require the development of specific signal decomposition techniques. The most commonly used techniques are data-driven, yielding well-behaved signal components without any prior assumptions about the input data. Among the most popular time series decomposition techniques cited in the literature are the empirical mode decomposition and its variants, the empirical wavelet transform, and singular spectrum analysis. With the increasing popularity and utility of these methods in wide-ranging applications, it is imperative to gain a good understanding of and insight into how these algorithms operate. In this work, we describe each of the techniques mentioned above, as well as their ability to denoise signals, capture trends, identify components corresponding to the physical processes involved in the evolution of the observed system, and deduce the dimensionality of the underlying dynamics. Results obtained with all of these methods on experimental total ozone column and rainfall time series are discussed and compared.Keywords: denoising, empirical mode decomposition, singular spectrum analysis, time series, underlying dynamics, wavelet analysis
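Of the three techniques, singular spectrum analysis is the most compact to sketch: embed the series in a Hankel trajectory matrix, take an SVD, and diagonal-average each rank-one term back into a component series. A minimal version, assuming NumPy is available:

```python
import numpy as np

def ssa(series, window, n_components):
    """Minimal singular spectrum analysis: Hankel embedding, SVD
    decomposition, then diagonal averaging of the leading rank-one
    terms back into component series."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    k = n - window + 1
    # trajectory (Hankel) matrix: column i holds x[i : i + window]
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for j in range(n_components):
        Xj = s[j] * np.outer(U[:, j], Vt[j])
        # anti-diagonal r + c = i of Xj collects all estimates of x[i]
        comp = np.array([np.mean(Xj[::-1].diagonal(i - window + 1))
                         for i in range(n)])
        comps.append(comp)
    return comps
```

Grouping the leading components recovers trend and oscillatory parts, while the trailing ones carry noise, which is exactly the denoising and trend-capture role discussed above.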
Procedia PDF Downloads 1161010 Experiences of Homophobia, Machismo and Misogyny in Tourist Destinations: A Netnography in a Facebook Community of LGBT Backpackers
Authors: Renan De Caldas Honorato, Ana Augusta Ferreira De Freitas
Abstract:
Homosexuality is still criminalized in a large number of countries; in some, being gay or lesbian can even be punished by death. Added to this context, the experiences of social discrimination faced by the LGBT population, including homophobia, machismo, and misogyny, cause numerous restrictions throughout their lives. Having to confront these challenges in moments that should be pleasant, such as on a trip or on vacation, is distressing, to say the least. In the current scenario of intensified use of social network sites (SNSs) to search for information, including in tourism, this work analyzes how tourists share experiences of confronting and perceiving homophobia, machismo, and misogyny, and the restrictions suffered in tourist destinations. The fieldwork is a community of LGBT backpackers hosted on Facebook, and netnography was the core method adopted. A qualitative approach was conducted: 463 publications posted from January to December 2020 were assessed through computer-mediated discourse analysis (CMDA). The results suggest that these publications serve to identify potential exposure to offensive behavior while traveling. Members affirm that laws, favorable or not, toward the LGBT public are not the only factors that make a place safe or unsafe for gay travelers: the social situation of a country and its laws can be quite different, and the former is the main focus of these publications. The perception of others about a chosen destination matters more than knowing one's rights and the legal status of each country, and it lessens uncertainty, even though members are never totally confident when choosing a travel destination. In certain circumstances, sexual orientation also needs to be protected from the judgment of hosts and residents. The systemic treatment of homophobic behavior and the construction of a more inclusive society are urgent.Keywords: homophobia, hospitality, machismo, misogyny
Procedia PDF Downloads 1881009 Using Geo-Statistical Techniques and Machine Learning Algorithms to Model the Spatiotemporal Heterogeneity of Land Surface Temperature and its Relationship with Land Use Land Cover
Authors: Javed Mallick
Abstract:
In metropolitan areas, rapid changes in land use and land cover (LULC) have ecological and environmental consequences. Saudi Arabia's cities have experienced tremendous urban growth since the 1990s, resulting in urban heat islands, groundwater depletion, air pollution, loss of ecosystem services, and so on. This study examines the variance and heterogeneity in land surface temperature (LST) caused by LULC changes in Abha-Khamis Mushyet, Saudi Arabia, from 1990 to 2020. LULC was mapped using a support vector machine (SVM), and LST was calculated using the mono-window algorithm. To identify LST clusters, the local indicator of spatial association (LISA) model was applied to the spatiotemporal LST maps. In addition, the parallel coordinate plot (PCP) method was used to investigate the relationship between LST clusters and urban biophysical variables as proxies for LULC. According to the LULC maps, urban areas increased by more than 330% between 1990 and 2018, and built-up areas had an 83.6% transitional probability over that period. Furthermore, between 1990 and 2020, vegetation and agricultural land were converted into built-up areas at rates of 17.9% and 21.8%, respectively. Uneven LULC changes in built-up areas result in more LST hotspots, and LST hotspots were associated with high NDBI but not with NDWI or NDVI. This study could assist policymakers in developing mitigation strategies for urban heat islands.Keywords: land use land cover mapping, land surface temperature, support vector machine, LISA model, parallel coordinate plot
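The LISA clustering step rests on the local Moran statistic. A bare-bones version for a handful of sites, with a row-standardised spatial weight matrix supplied by the caller, might look like this (a sketch, not the GIS implementation used in the study):

```python
def local_morans_i(values, weights):
    """Local Moran's I per site: z_i * sum_j w_ij * z_j, where z are the
    standardised values and w is a row-standardised spatial weight
    matrix.  Large positive scores flag hot/cold-spot clustering."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    z = [(v - mean) / var ** 0.5 for v in values]
    return [z[i] * sum(weights[i][j] * z[j] for j in range(n))
            for i in range(n)]
```

Sites with high values surrounded by high values (or low by low) score positively and form the LST hotspot and coldspot clusters; dissimilar neighbourhoods score near zero or negatively.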
Procedia PDF Downloads 781008 Information Literacy Skills of Legal Practitioners in Khyber Pakhtunkhwa-Pakistan: An Empirical Study
Authors: Saeed Ullah Jan, Shaukat Ullah
Abstract:
Purpose of the study: The main theme of this study is to explore the information literacy skills of law practitioners in Khyber Pakhtunkhwa, Pakistan. Research method and procedure: To conduct this quantitative study, a simple random sampling approach was used. An adapted questionnaire was distributed among 254 lawyers of Dera Ismail Khan through personal visits and electronic means, and the data collected were analyzed with SPSS (Statistical Package for the Social Sciences) software. Delimitations of the study: The study is delimited to a southern district of Khyber Pakhtunkhwa, Dera Ismail Khan. Key findings: Most of the lawyers of district Dera Ismail Khan can recognize and understand the information they need, and a large number are capable of presenting information in both written and electronic forms. However, they are not comfortable with the various legal databases or with different search and keyword techniques, and they have little knowledge of Boolean operators for locating online information. Conclusion and recommendations: Efforts should be made to arrange refresher courses and training workshops on the use of legal databases and on search techniques for information retrieval. This practice will enhance the information literacy skills of lawyers, which will ultimately result in a better legal system in Pakistan. Practical implications: The findings of the study should motivate policymakers and the authorities of legal forums to restructure information literacy programs to fulfill lawyers' information needs. Contribution to knowledge: No significant work has been done on lawyers' information literacy skills in Khyber Pakhtunkhwa; this study brings a clear picture of the information literacy skills of law practitioners and addresses the problems they face during the information-seeking process.Keywords: information literacy-Pakistan, information literacy-lawyers, information literacy-lawyers-KP, law practitioners-Pakistan
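The Boolean operators the survey asks about (AND, OR, NOT) can be made concrete with a tiny query evaluator over a document's word set, the model behind Boolean retrieval in legal databases. The query shape here is an illustrative assumption, not any particular database's syntax:

```python
def matches(document, query):
    """Evaluate a nested AND/OR/NOT query, given as tuples of terms,
    against the set of words in a document."""
    if isinstance(query, tuple):
        op = query[0]
        if op == "AND":
            return all(matches(document, q) for q in query[1:])
        if op == "OR":
            return any(matches(document, q) for q in query[1:])
        if op == "NOT":
            return not matches(document, query[1])
        raise ValueError("unknown operator: %r" % (op,))
    # base case: a bare term matches if it appears in the document
    return query.lower() in set(document.lower().split())
```

For example, the query `("AND", "breach", ("OR", "damages", "injunction"))` expresses "breach AND (damages OR injunction)", the kind of precise search the recommended training workshops would cover.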
Procedia PDF Downloads 1491007 The Effect of Colloidal Metals Nanoparticles on Quarantine Bacterium - Clavibacter michiganensis Ssp. sepedonicus
Authors: Włodzimierz Przewodowski, Agnieszka Przewodowska
Abstract:
Colloidal metal nanoparticles have drawn increasing attention in phytopathology because of their unique properties and possible applications. Their antibacterial activity, lack of induction of pathogen resistance, and ability to penetrate most biological barriers make them potentially useful in the fight against dangerous pathogens. These properties are very important for protecting the world's strategic crops, such as potato, the fourth-largest crop in the world, which is host to numerous pathogenic microorganisms causing serious diseases that significantly affect yield and cause economic losses. One of the most important and difficult-to-control pathogens of the potato plant is the quarantine bacterium Clavibacter michiganensis ssp. sepedonicus (Cms), responsible for ring rot disease. Control and detection of this pathogen are very complicated. Using healthy, certified seed material and maintaining hygiene in potato production and storage are the most efficient ways of preventing ring rot disease. Currently used disinfectants and pesticides have many disadvantages, such as toxicity, low efficiency, selectivity, corrosiveness, and the inability to eliminate the pathogen within potato tissue. In this situation, it becomes important to search for new formulations based on components that are not harmful to health, yet are efficient, stable over prolonged periods, and have a wide range of biocidal activity. Such capabilities are offered by the latest generation of biocidal nanoparticles, such as colloidal metals. The aim of the presented research was therefore to develop a new antibacterial preparation based on colloidal metal nanoparticles and to assess its influence on Cms bacteria. Our preliminary results confirmed the high efficacy of the nano-colloids in controlling this selected pathogen.Keywords: clavibacter michiganensis ssp. sepedonicus, colloidal metal nanoparticles, phytopathology, bacteria
Procedia PDF Downloads 272