Search results for: Privacy and Data Protection Law
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26334

22404 Energy Service Companies as a Facilitator for Implementation of Energy-Environment Conventions

Authors: Bahareh Arghand

Abstract:

The establishment of rules and regulations for more effective energy-environment interactions is essential to achieving sustainable development. Sustainable development requires mechanisms that can promote compliance with energy-environment conventions. There are many binding agreements and non-binding instruments at the regional and international levels on energy and the environment. These conventions try to reduce conflicts among energy, environmental, and economic interests through legal principles and practical mechanisms. The central concern of such conventions is their implementation, because poor implementation and weak enforcement undermine their success. In this regard, the main goal of this study is to propose effective implementation mechanisms. The activities of energy service companies (ESCOs) can improve energy efficiency and decrease environmental degradation. Therefore, ESCO performance can be proposed and assessed as a facilitating mechanism for implementing energy-environment conventions. An assessment of ESCO performance, including its potentials, problems, and limitations, as a facilitator for effective implementation of energy-environment conventions, is included. This study is oriented towards the effective development and application of laws and the function of ESCOs as appropriate economic instruments and facilitators for the implementation of energy-environment conventions. The resulting system of close cooperation between the energy-environment conventions and ESCOs is geared toward advancing environmental protection and economic factors through the transfer of environmentally sound technologies that meet sustainable development objectives.

Keywords: energy-environment conventions, energy service company, facilitator mechanism, sustainable development

Procedia PDF Downloads 165
22403 Annexing the Strength of Information and Communication Technology (ICT) for Real-time TB Reporting Using TB Situation Room (TSR) in Nigeria: Kano State Experience

Authors: Ibrahim Umar, Ashiru Rajab, Sumayya Chindo, Emmanuel Olashore

Abstract:

INTRODUCTION: Kano is the most populous state in Nigeria and one of the two states with the highest TB burden in the country. The state notifies an average of more than 8,000 TB cases quarterly and had the highest yearly notification of all the states in Nigeria from 2020 to 2022. The contribution of the state TB program to national TB notification varied from 9% to 10% quarterly between the first quarter of 2022 and the second quarter of 2023. The Kano State TB Situation Room is an innovative platform for timely data collection, collation, and analysis for informed decision-making in the health system. During the 2023 second National TB Testing Week (NTBTW), the Kano TB program aimed at early TB detection, prevention, and treatment. The state TB Situation Room provided an avenue for coordination and surveillance through real-time data reporting, review, analysis, and use during the NTBTW. OBJECTIVES: To assess the role of an innovative information and communication technology platform for real-time TB reporting during the second National TB Testing Week in Nigeria, 2023. To showcase the NTBTW data cascade analysis using the TSR as an innovative ICT platform. METHODOLOGY: The state TB program deployed a real-time virtual dashboard for NTBTW reporting, analysis, and feedback. A data room team was set up to receive real-time data through a Google link. The data received were analyzed using the Power BI analytics tool with a statistical significance level of <0.05. RESULTS: At the end of the week-long activity, and using the real-time dashboard with onsite mentorship of the field workers, the state TB program was able to screen a total of 52,054 people for TB out of 72,112 individuals eligible for screening (72% screening rate). A total of 9,910 presumptive TB clients were identified and evaluated for TB, leading to the diagnosis of 445 patients with TB (5% yield from presumptives) and the placement of 435 TB patients on treatment (98% enrolment). CONCLUSION: The TB Situation Room (TSR) has been a great asset to the Kano State TB Control Program in meeting the growing demand for timely data reporting in TB and other global health responses. The use of real-time surveillance data during the 2023 NTBTW has in no small measure improved the TB response and feedback in Kano State. Scaling up this intervention to other disease areas, states, and nations is a positive step towards global TB eradication.
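
The screening-to-treatment cascade reported above reduces to simple ratio arithmetic; the sketch below reproduces the indicators from the figures given in the abstract (illustrative only; the actual dashboard computed these in Power BI).

```python
# Minimal sketch of the NTBTW screening-to-treatment cascade indicators,
# using the figures reported in the abstract.
eligible = 72_112      # individuals eligible for screening
screened = 52_054      # people screened for TB
presumptive = 9_910    # presumptive TB clients identified and evaluated
diagnosed = 445        # patients diagnosed with TB
enrolled = 435         # patients placed on treatment

print(f"Screening rate: {screened / eligible:.0%}")               # ~72%
print(f"Yield from presumptives: {diagnosed / presumptive:.1%}")  # ~4.5%, rounded to 5% in the abstract
print(f"Enrolment rate: {enrolled / diagnosed:.0%}")              # ~98%
```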

Keywords: tuberculosis (tb), national tb testing week (ntbtw), tb situation room (tsr), information communication technology (ict)

Procedia PDF Downloads 51
22402 Effect of Coronary Insulators in Increasing the Lifespan of Electrolytic Cells: Short-circuit and Heat Resistance

Authors: Robert P. Dufresne, Hamid Arabzadeh

Abstract:

The current study investigates the effectiveness of a new form of permanent baseboard insulator with an umbrella action, hereinafter referred to as a coronary insulator, in supporting and protecting the assembly of electrodes immersed in an electrolytic cell and in increasing the lifespan of the lateral sides of the cell, in both the electrowinning and electrorefining methods. The advantages of using a coronary insulator in addition to the top capping board (equipotential insulator) were studied in comparison with the conventional assembly of an electrolytic cell. A thermal imaging technique was then utilized during high-temperature heat transfer tests on sample cell walls with and without coronary insulators in their assembly to show the effectiveness of coronary insulators in protecting the cell wall under extreme conditions. It was shown that, unlike the conventional assembly, which is highly prone to cell wall damage under thermal shocks, the presence of coronary insulators can significantly increase the level of protection of the cell due to their ultra-high thermal and chemical resistance, as well as reduce the replacement frequency of insulators to almost zero. In addition, the results showed that the test assembly with the coronary insulator provides better consistency in positioning and, subsequently, better contact than the conventional method, which reduces the chance of an electric short-circuit in the system.

Keywords: capping board, coronary insulator, electrolytic cell, thermal shock

Procedia PDF Downloads 174
22401 Density Measurement of Mixed Refrigerants R32+R1234yf and R125+R290 from 0°C to 100°C and at Pressures up to 10 MPa

Authors: Xiaoci Li, Yonghua Huang, Hui Lin

Abstract:

Optimization of the concentration of components in mixed refrigerants leads to potential improvement of either the thermodynamic cycle performance or the safety performance of heat pumps and refrigerators. R32+R1234yf and R125+R290 are two promising binary mixed refrigerants for heat pumps working in cold areas. The p-ρ-T data of these mixtures are among the fundamental and necessary properties for the design and performance evaluation of heat pumps. Although the property data of mixtures can be predicted by mixing models based on the pure substances incorporated in programs such as the NIST Refprop database, direct property measurement is still helpful to reveal the true state behaviors and verify the models. Densities of the mixtures R32+R1234yf and R125+R290 were measured with an Anton Paar DMA-4500 U-shaped oscillating-tube digital densimeter in the range of temperatures from 0 °C to 100 °C and pressures up to 10 MPa. The accuracy of the measurement reaches 0.00005 g/cm³. The experimental data are compared with the predictions by Refprop in the corresponding range of pressure and temperature.
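
For readers who want to reproduce Refprop-style mixture predictions of the kind the measurements are compared against, the open-source CoolProp library exposes comparable Helmholtz-energy mixture models. A minimal sketch, assuming an equimolar R32/R1234yf blend (the compositions actually studied are not given in the abstract):

```python
# Minimal sketch: predicted density of an R32/R1234yf mixture via CoolProp.
# The 50/50 mole-fraction split is an assumption for illustration only.
from CoolProp.CoolProp import PropsSI

T = 293.15   # temperature [K] (20 degC, inside the 0-100 degC range studied)
p = 5e6      # pressure [Pa] (5 MPa, inside the 0-10 MPa range studied)

rho = PropsSI('D', 'T', T, 'P', p, 'HEOS::R32[0.5]&R1234yf[0.5]')
print(f"Predicted density: {rho:.1f} kg/m^3")
```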

Keywords: mixed refrigerant, density measurement, densimeter, thermodynamic property

Procedia PDF Downloads 282
22400 Protein Stabilized Foam Structures as Protective Carrier Systems during Microwave Drying of Probiotics

Authors: Jannika Dombrowski, Sabine Ambros, Ulrich Kulozik

Abstract:

Due to the increasing popularity of healthy products, probiotics are of growing importance in food manufacturing. With the aim of extending probiotic application to non-chilled products, the cultures have to be preserved by drying. Microwave drying has proved to be a suitable technique for achieving relatively high survival rates, resulting, among other factors, from drying at gentle temperatures. However, diffusion limitation due to compaction of the cell suspension during drying can prolong drying times as well as deteriorate product properties (grindability, rehydration performance). Therefore, we aimed to embed probiotics in an aerated matrix of whey proteins (surfactants) and di-/polysaccharides (foam stabilization, probiotic protection) during drying. As a result of the greatly increased inner surface of the cell suspension, drying performance was enhanced significantly compared to non-foamed suspensions. This work comprises investigations of suitable foam matrices that remain stable under vacuum (variation of protein concentration, type and concentration of di-/polysaccharide), as well as the development of an applicable microwave drying process in terms of microwave power, chamber pressure, and maximum product temperature. The performed analyses included foam characteristics (overrun, drainage, firmness, bubble sizes) and properties of the dried cultures (survival, activity). In addition, the efficiency of the drying process was evaluated.

Keywords: foam structure, microwave drying, polysaccharides, probiotics

Procedia PDF Downloads 249
22399 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the under-assessment set. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Combined, such intervals define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.
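
A minimal sketch of the grid idea, assuming one input and one output and hypothetical unit data: each unit is assigned to a hyper-rectangle (grid cell), and a strict cell-dominance check flags units as inefficient without running any DEA model. The segmentation and dominance rule here are deliberately simplified.

```python
import numpy as np

# Hypothetical units: (input x, output y); lower input and higher output is better.
units = np.array([[2.0, 9.0], [5.0, 7.5], [8.0, 3.0], [3.5, 6.0], [7.0, 8.0]])

x_edges = np.linspace(0, 10, 6)   # interval segmentation of the input range
y_edges = np.linspace(0, 10, 6)   # interval segmentation of the output range

def cell_of(u):
    """Map a unit to its hyper-rectangle (grid cell) indices."""
    i = np.searchsorted(x_edges, u[0], side='right') - 1
    j = np.searchsorted(y_edges, u[1], side='right') - 1
    return i, j

cells = [cell_of(u) for u in units]

# Cell (i1, j1) dominates (i2, j2) if its input interval is strictly lower and
# its output interval strictly higher; units in dominated cells cannot be efficient.
def dominated(c, others):
    return any(o[0] < c[0] and o[1] > c[1] for o in others if o != c)

for u, c in zip(units, cells):
    status = "dominated (inefficient)" if dominated(c, cells) else "potentially efficient"
    print(u, "-> cell", c, "->", status)
```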

Keywords: data envelopment analysis, interval DEA, efficiency classification, efficiency prediction

Procedia PDF Downloads 157
22398 Analysis of Spatial Form and Gene of Historical and Cultural Settlements in Mountainous Areas: Illustrated by the Example of Anju Ancient Town

Authors: Sun Gang

Abstract:

A variety of functional spaces are distributed across the vast mountain waterfront. Their functional positioning presents a spontaneous form of settlement space, and the construction features show a passive adaptation to the natural environment. As precious heritage inheriting human civilization and promoting historical culture, the traditional settlement space in mountainous areas is also the local expression of the landscape pattern gene. Under the impact of rapid urban construction and shifting social consumption demand, the original texture, scale, and ecology of traditional mountain settlement space, especially historical and cultural settlement space, have been affected, and the decline of characteristics hinders development. This paper selects Anju Ancient Town, the fourth-largest ancient city in China, located in a city of mountains and waters, as the research object. Combining spatial analysis with other methods, it studies the characteristics and causes of the town's spatial form, analyzes the internal logic of its formation and development, builds a genetic analysis map, and explores the possibilities of settlement inheritance and development, providing a reference for the construction, protection, and inheritance of traditional mountain settlements.

Keywords: mountain traditional settlement, historical and cultural settlement space, spatial form, spatial gene

Procedia PDF Downloads 77
22397 Mapping of Traffic Noise in Riyadh City-Saudi Arabia

Authors: Khaled A. Alsaif, Mosaad A. Foda

Abstract:

The present work aims at the development of traffic noise maps for Riyadh City using the LimA software. Road traffic data were estimated or measured as accurately as possible in order to obtain consistent noise maps. The predicted noise levels at selected sites are validated by actual field measurements, obtained by a system consisting of a sound level meter, a GPS receiver, and a database to manage the measured data. The maps show that noise levels remain over 50 dBA and can exceed 70 dBA at the nearside of major roads and highways.

Keywords: noise pollution, road traffic noise, LimA predictor, GPS

Procedia PDF Downloads 368
22396 Lockit: A Logic Locking Automation Software

Authors: Nemanja Kajtez, Yue Zhan, Basel Halak

Abstract:

The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. This multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these, researchers have proposed logic locking techniques to protect the intellectual property of the design and increase the difficulty of malicious modification of its functionality. However, the adoption of logic locking approaches has been rather slow due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process through software, developed in Python, that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. Analysis of the latest logic locking algorithms demonstrated that SFLL-HD is one of the most secure and versatile in trading off levels of protection against different types of attacks, and it was thus selected for implementation. The presented tool can also be expanded to incorporate the latest locking mechanisms to keep up with the fast-paced development in this field. The paper also presents a case study to demonstrate the functionality of the tool and how it could be used to explore the design space and compare different locking solutions. The source code of this tool is freely available from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).
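
The tool itself implements SFLL-HD; as a simplified illustration of the general idea of logic locking (not of SFLL-HD specifically), the sketch below inserts key-controlled XOR/XNOR gates into a toy gate-level netlist, so that the circuit behaves correctly only under the right key. The netlist format and gate choice are assumptions for illustration.

```python
import random

# Toy gate-level netlist: (gate_type, output_wire, input_wires).
netlist = [
    ("AND", "n1", ["a", "b"]),
    ("OR",  "n2", ["n1", "c"]),
    ("XOR", "out", ["n2", "d"]),
]

def lock(netlist, n_keys, seed=0):
    """Insert key-controlled XOR/XNOR gates on randomly chosen gate outputs
    (plain random logic locking -- a far simpler scheme than SFLL-HD)."""
    rng = random.Random(seed)
    locked = list(netlist)
    key = {}
    # Process target positions from the end so insertions don't shift them.
    for t in sorted(rng.sample(range(len(locked)), n_keys), reverse=True):
        gtype, out, ins = locked[t]
        kbit = rng.randint(0, 1)
        kname = f"key{t}"
        key[kname] = kbit
        # Route the original output through a key gate: an XOR gate passes the
        # signal when its key bit is 0, an XNOR when it is 1; a wrong key
        # inverts the wire and corrupts the circuit's function.
        locked[t] = (gtype, out + "_pre", ins)
        locked.insert(t + 1, ("XOR" if kbit == 0 else "XNOR", out, [out + "_pre", kname]))
    return locked, key

locked_netlist, correct_key = lock(netlist, n_keys=2)
for gate in locked_netlist:
    print(gate)
print("correct key:", correct_key)
```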

Keywords: design automation, hardware security, IP piracy, logic locking

Procedia PDF Downloads 164
22395 Polish Authorities Towards Refugee Crises

Authors: Klaudia Gołębiowska

Abstract:

This article analyzes the actions of Poland's ruling party in the face of two refugee crises, which emerged almost one after the other within a few months. The first concerned irregular migrants from various countries, including the Middle East, seeking to cross the Polish border from the territory of Belarus. The second was caused by Russia's full-scale invasion of Ukraine. I aim to show the evolution of the discourse and law towards immigrants and refugees under the party Prawo i Sprawiedliwość (PiS, Eng. Law and Justice), which has been in power in Poland since 2015. The authorities radically changed their anti-immigrant discourse in response to the exodus of civilians from Ukraine. The research questions are as follows: What were the roots of the refugee crises in Poland in 2021 and 2022? What legal or illegal measures were taken in Poland to deal with the refugee crises? The methods used are qualitative source analysis and process tracing. From the first days of the war in Ukraine, not only was aid organised for Ukrainians, but they were also given access to public services and education. All refugees were granted temporary international protection. At the same time, the basic physiological needs of those on the Polish-Belarusian border were ignored. Moreover, illegal pushbacks were used against those coming mainly from the Middle East, forcing them back into the territory of Belarus, where they were often subjected to torture and inhumane treatment. The Polish government justified such treatment on the grounds that these people were part of a 'hybrid war' waged by Russia and Belarus using migrants. Only Ukrainians were treated as 'real' refugees in the analyzed crises at the Polish borders.

Keywords: refugee, irregular migrants, hybrid war, migrants

Procedia PDF Downloads 50
22394 The Introduction of a Tourniquet Checklist to Identify and Record Tourniquet Related Complications

Authors: Akash Soogumbur

Abstract:

Tourniquets are commonly used in orthopaedic surgery to provide hemostasis during procedures on the upper and lower limbs. However, there is a risk of complications associated with tourniquet use, such as nerve damage, skin necrosis, and compartment syndrome. The British Orthopaedic Association (BOAST) guidelines recommend the use of tourniquets at a pressure of 300 mmHg or less for a maximum of 2 hours. Research Aim: The aim of this study was to evaluate the effectiveness of a tourniquet checklist in improving compliance with the BOAST guidelines. Methodology: This was a retrospective study of all orthopaedic procedures performed at a single institution over a 12-month period. The study population included patients who had a tourniquet applied during surgery. Data were collected from the patients' medical records, including the duration of tourniquet use, the pressure used, and the method of exsanguination. Findings: The results showed that the use of the tourniquet checklist significantly improved compliance with the BOAST guidelines. Prior to the introduction of the checklist, compliance with the guidelines was 83% for the duration of tourniquet use and 73% for pressure used. After the introduction of the checklist, compliance increased to 100% for both duration of tourniquet use and pressure used. Theoretical Importance: The findings of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use. Data Collection: Data were collected from the patients' medical records. The data included the following information: Patient demographics, procedure performed, duration of tourniquet use, pressure used, method of exsanguination. Analysis Procedures: The data were analyzed using descriptive statistics. The compliance with the BOAST guidelines was calculated as the percentage of patients who met the guidelines for the duration of tourniquet use and pressure used. Question Addressed: The question addressed by this study was whether the use of a tourniquet checklist could improve compliance with the BOAST guidelines. Conclusion: The results of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use.
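
The compliance figures above follow directly from checking each audited record against the BOAST limits; a minimal sketch with hypothetical records (field names and values are invented for illustration):

```python
# Minimal sketch of the compliance calculation against the BOAST limits
# (pressure <= 300 mmHg, duration <= 2 hours); the records are hypothetical.
records = [
    {"pressure_mmHg": 250, "duration_min": 90},
    {"pressure_mmHg": 300, "duration_min": 130},  # exceeds the 120-minute limit
    {"pressure_mmHg": 320, "duration_min": 60},   # exceeds the 300 mmHg limit
]

pressure_ok = sum(r["pressure_mmHg"] <= 300 for r in records)
duration_ok = sum(r["duration_min"] <= 120 for r in records)

print(f"Pressure compliance: {pressure_ok / len(records):.0%}")
print(f"Duration compliance: {duration_ok / len(records):.0%}")
```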

Keywords: tourniquet, pressure, duration, complications, surgery

Procedia PDF Downloads 52
22393 Data Analysis for Taxonomy Prediction and Annotation of 16S rRNA Gene Sequences from Metagenome Data

Authors: Suchithra V., Shreedhanya, Kavya Menon, Vidya Niranjan

Abstract:

Skin metagenomics has a wide range of applications with direct relevance to the health of the organism. It gives us insight into the diverse community of microorganisms (the microbiome) harbored on the skin. In recent years, it has become increasingly apparent that the interaction between the skin microbiome and the human body plays a prominent role in immune system development, cancer development, disease pathology, and many other biological processes. Next Generation Sequencing has led to a faster and better understanding of environmental organisms and their mutual interactions. This project studies the human skin microbiome of different individuals with varied skin conditions. Bacterial 16S rRNA data of the skin microbiome were downloaded using the SRA Toolkit provided by NCBI to perform the metagenomics analysis. Twelve samples were selected, with two controls and three different categories, i.e., sex (male/female), skin type (moist/intermittently moist/sebaceous), and occlusion (occluded/intermittently occluded/exposed). Data quality was improved using Cutadapt, and quality analysis was done using FastQC. USEARCH, a tool for analyzing NGS data, provides a suitable platform to obtain taxonomy classification and the abundance of bacteria from metagenome data. The statistical tool used for analyzing the USEARCH results was METAGENassist. The results revealed that the top three abundant organisms were Prevotella, Corynebacterium, and Anaerococcus. Prevotella is known to be an infectious bacterium found in wounds, tooth cavities, etc. Corynebacterium and Anaerococcus are opportunistic bacteria responsible for skin odor. This result suggests that Prevotella thrives easily in sebaceous skin conditions. It is therefore better to use intermittently occluded treatment, such as applying ointments or creams, to treat wounds on sebaceous skin types. Exposing the wound should be avoided, as it leads to an increase in Prevotella abundance. Individuals with moist skin types can opt for occluded or intermittently occluded treatment, as these have been shown to decrease the abundance of bacteria during treatment.
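
Once USEARCH has produced per-sample taxon counts, the abundance comparison reduces to relative frequencies; a minimal sketch with hypothetical counts for the three genera reported (the real analysis was done in METAGENassist):

```python
import pandas as pd

# Hypothetical genus-level count table (rows: samples, columns: genera),
# standing in for the USEARCH taxonomy output.
counts = pd.DataFrame(
    {"Prevotella": [120, 340, 80],
     "Corynebacterium": [200, 150, 310],
     "Anaerococcus": [90, 60, 170]},
    index=["sebaceous_1", "sebaceous_2", "moist_1"],
)

# Convert to relative abundance per sample, then rank the genera overall.
rel = counts.div(counts.sum(axis=1), axis=0)
print(rel.round(3))
print("Most abundant overall:", rel.mean().sort_values(ascending=False).index.tolist())
```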

Keywords: bacterial 16S rRNA, next generation sequencing, skin metagenomics, skin microbiome, taxonomy

Procedia PDF Downloads 159
22392 Development of a Predictive Model to Prevent Financial Crisis

Authors: Tengqin Han

Abstract:

Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played a crucial role in causing the most recent financial crisis, in 2008. In each case, a delinquency is a sign of the borrower being unable to pay off the debt, and thus may cause a loss of property in the end. Individually, one case of delinquency seems unimportant compared to the entire credit system. In China, an emerging economic entity, national and economic strength have grown rapidly, and the gross domestic product (GDP) growth rate has remained as high as 8% in the past decades. However, potential risks exist behind the appearance of prosperity, and among these risks, the credit system is the most significant one. Due to the long terms and large balances of mortgages, it is critical to monitor risk during the performance period. In this project, data on about 300,000 mortgage accounts are analyzed in order to develop a predictive model for the probability of delinquency. Through univariate analysis the data are cleaned up, and through bivariate analysis the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the 2005 analysis data are split into two parts: 60% for model development and 40% for in-time model validation. The KS for model development is 31, and the KS for in-time validation is 31, indicating that the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data, with a KS of 33. This indicates that the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, consumer price index, unemployment rate, inflation rate, etc. The data from 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), KS is increased from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
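
The KS statistic used throughout for validation measures the maximum gap between the cumulative score distributions of delinquent and non-delinquent accounts. A minimal sketch, with scores and labels mocked as random data:

```python
import numpy as np

def ks_statistic(scores, is_delinquent):
    """KS = max gap between the cumulative distributions of delinquent
    and non-delinquent accounts, ranked by model score (reported on a 0-100 scale)."""
    order = np.argsort(scores)
    y = np.asarray(is_delinquent)[order]
    cum_bad = np.cumsum(y) / y.sum()
    cum_good = np.cumsum(1 - y) / (1 - y).sum()
    return 100 * np.max(np.abs(cum_bad - cum_good))

# Hypothetical scores (higher = riskier) and labels (1 = delinquent).
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.6, 0.2, 500), rng.normal(0.4, 0.2, 5000)])
labels = np.concatenate([np.ones(500), np.zeros(5000)])
print(f"KS = {ks_statistic(scores, labels):.0f}")  # a KS around 30 indicates usable separation
```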

Keywords: delinquency, mortgage, model development, model validation

Procedia PDF Downloads 212
22391 Self-Supervised Learning for Hate-Speech Identification

Authors: Shrabani Ghosh

Abstract:

Automatic offensive language detection in social media has become a challenging task in today's NLP. Manual offensive language detection is tedious and laborious work, so automatic methods based on machine learning are the only practical alternative. Previous works have done sentiment analysis over social media in different ways, such as in supervised, semi-supervised, and unsupervised manners. Domain adaptation in a semi-supervised way has also been explored in NLP, where the source domain and the target domain are different. In domain adaptation, the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers such as BERT and RoBERTa are further pre-trained on masked language modeling (MLM) tasks and fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free-speech platform described as hosting extremism in varying degrees in online social media. In the domain adaptation process, Twitter data is used as the source domain, and Gab data is used as the target domain. The performance of domain adaptation also depends on cross-domain similarity. Different distance measures, such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL, have been used to estimate domain similarity. Naturally, in-domain distances are small, and between-domain distances are expected to be large. Previous findings show that a pretrained masked language model (MLM) fine-tuned with a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, and the out-of-domain accuracy of the hate classifier on Gab data goes down to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce. A few works have already explored applying self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs. A self-supervised attention learning approach shows better performance, as it exploits extracted context words in the training process. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. This framework initially classifies the Gab dataset in an attention-based self-supervised manner. In the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier, as well as with optimized outcomes obtained from different optimization techniques.
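
Of the domain-distance measures listed, MMD is easy to sketch. The snippet below estimates a squared RBF-kernel MMD between source (Twitter) and target (Gab) sentence embeddings, with the embeddings mocked as random vectors; real embeddings would come from the fine-tuned transformer.

```python
import numpy as np

def mmd_rbf(X, Y, gamma=1.0):
    """Biased estimate of squared Maximum Mean Discrepancy with an RBF kernel:
    MMD^2 = E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]."""
    def k(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, (200, 32))   # mock Twitter embeddings
target = rng.normal(0.5, 1.0, (200, 32))   # mock Gab embeddings (shifted domain)
print(f"MMD^2 source vs target: {mmd_rbf(source, target):.4f}")        # between-domain: large
print(f"MMD^2 source vs source: {mmd_rbf(source[:100], source[100:]):.4f}")  # in-domain: ~0
```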

Keywords: attention learning, language model, offensive language detection, self-supervised learning

Procedia PDF Downloads 92
22390 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark

Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos

Abstract:

This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between factors and performance and to develop linear predictor models for time and cost. Methods: The solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results about the small influence of cores and the neutrality of memory and disks on total execution time, and the non-significant impact of input data scale on costs, although it notably impacts execution time.
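
A minimal sketch of the distributed Naïve Bayes text-classification pipeline whose runtime the experiments measure, using standard pyspark.ml components; the two-row corpus is a placeholder for the large multilingual corpus used in the study.

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import Tokenizer, HashingTF
from pyspark.ml.classification import NaiveBayes

spark = SparkSession.builder.appName("lang-classify").getOrCreate()

# Placeholder corpus: text with a numeric language label.
df = spark.createDataFrame(
    [("the quick brown fox", 0.0), ("o rato roeu a roupa", 1.0)],
    ["text", "label"],
)

pipeline = Pipeline(stages=[
    Tokenizer(inputCol="text", outputCol="words"),       # split into tokens
    HashingTF(inputCol="words", outputCol="features"),   # hashed term frequencies
    NaiveBayes(featuresCol="features", labelCol="label"),
])
model = pipeline.fit(df)
model.transform(df).select("text", "prediction").show()
```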

Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark

Procedia PDF Downloads 98
22389 The Developing of Teaching Materials Online for Students in Thailand

Authors: Pitimanus Bunlue

Abstract:

The objectives of this study were to identify the unique characteristics of the Salaya Old Market, Phutthamonthon, Nakhon Pathom, and to develop effective video media to promote homeland awareness among local people. The characteristic features of this community were collectively summarized based on historical data, community observation, and interviews with people. The acquired data were used to develop a media product describing the prominent features of the community. The quality of the media was later assessed by interviewing local people in the old market in terms of content accuracy, video and narration quality, and the sense of homeland awareness after watching the video. The result was a 6-minute video containing historical data and the outstanding features of this community. Based on the interviews, the content accuracy was good, and the picture quality and narration were very good. Most people also developed a sense of homeland awareness after watching the video.

Keywords: audio-visual, creating homeland awareness, Phutthamonthon Nakhon Pathom, research and development

Procedia PDF Downloads 281
22388 A Decision Support System for the Detection of Illicit Substance Production Sites

Authors: Krystian Chachula, Robert Nowak

Abstract:

Manufacturing home-made explosives and synthetic drugs is an increasing problem in Europe. To combat that, a data fusion system is proposed for the detection and localization of production sites in urban environments. The data consists of measurements of properties of wastewater performed by various sensors installed in a sewage network. A four-stage fusion strategy allows detecting sources of waste products from known chemical reactions. First, suspicious measurements are used to compute the amount and position of discharged compounds. Then, this information is propagated through the sewage network to account for missing sensors. The next step is clustering and the formation of tracks. Eventually, tracks are used to reconstruct discharge events. Sensor measurements are simulated by a subsystem based on real-world data. In this paper, different discharge scenarios are considered to show how the parameters of used algorithms affect the effectiveness of the proposed system. This research is a part of the SYSTEM project (SYnergy of integrated Sensors and Technologies for urban sEcured environMent).
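
The "clustering and the formation of tracks" stage can be illustrated with a standard density-based clusterer over (time, position) detections; a minimal sketch with hypothetical sensor alerts (the actual system uses its own four-stage fusion strategy):

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical detections: (time [min], position along a sewer line [m]).
detections = np.array([
    [10, 105], [12, 110], [14, 118],   # one discharge propagating downstream
    [300, 40], [302, 47],              # a second, unrelated discharge
    [500, 900],                        # isolated spurious alert
])

# Scale so one cluster unit spans ~10 minutes / ~20 metres (illustrative choice).
scaled = detections / np.array([10.0, 20.0])
labels = DBSCAN(eps=1.5, min_samples=2).fit_predict(scaled)

for det, lab in zip(detections, labels):
    tag = f"track {lab}" if lab >= 0 else "noise"
    print(det, "->", tag)
```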

Keywords: continuous monitoring, information fusion and sensors, internet of things, multisensor fusion

Procedia PDF Downloads 103
22387 Implementation of CNV-CH Algorithm Using Map-Reduce Approach

Authors: Aishik Deb, Rituparna Sinha

Abstract:

We have developed an algorithm to detect abnormal segments (structural variations) in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm in which abnormal segments are detected. This algorithm aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, so the huge volume of sequence information gives rise to the need for big data and parallel approaches to segmentation. Therefore, we have designed a map-reduce approach for the existing CNV-CH algorithm, whereby a large amount of sequence data can be segmented and structural variations in the human genome can be detected. We have compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score. The advantages of our algorithm are that it is fast and has better accuracy. This algorithm can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders such as cancer. The defects may be caused by new mutations or changes to the DNA and generally result in abnormally high or low base coverage and quantification values.
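
The comparison between the traditional and map-reduce variants uses precision, sensitivity, and F-score; a minimal sketch of those metrics, treating segment detection as exact set overlap (real evaluations usually allow partial-overlap matching; the coordinates are hypothetical):

```python
# Hypothetical (start, end) variant segments.
true_segments = {(100, 250), (400, 480), (900, 1020)}
detected_segments = {(100, 250), (400, 480), (700, 760)}

tp = len(true_segments & detected_segments)   # correctly detected
fp = len(detected_segments - true_segments)   # spurious detections
fn = len(true_segments - detected_segments)   # missed segments

precision = tp / (tp + fp)
sensitivity = tp / (tp + fn)                  # a.k.a. recall
f_score = 2 * precision * sensitivity / (precision + sensitivity)
print(f"precision={precision:.2f} sensitivity={sensitivity:.2f} F={f_score:.2f}")
```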

Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing

Procedia PDF Downloads 118
22386 Inferring Human Mobility in India Using Machine Learning

Authors: Asra Yousuf, Ajaykumar Tannirkulum

Abstract:

Inferring rural-urban migration trends can help design effective policies that promote better urban planning and rural development. In this paper, we describe how machine learning algorithms can be applied to predict the internal migration decisions of people. We consider data collected from household surveys in Tamil Nadu to train our model. To measure the performance of the model, we use data on past migration from the National Sample Survey Organisation of India. The factors used for training the model include socioeconomic characteristics of each individual, such as age, gender, place of residence, outstanding loans, and strength of the household, as well as their past migration history. We perform a comparative analysis of the performance of a number of machine learning algorithms to determine their prediction accuracy. Our results show that machine learning algorithms provide stronger prediction accuracy than statistical models. Our goal through this research is to propose the use of data science techniques in understanding human decisions and behaviour in developing countries.
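
A minimal sketch of such a comparative analysis, assuming the survey features have already been encoded numerically; the two models, synthetic data, and split are illustrative, not the study's exact setup:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for encoded survey features (age, gender, loans, ...)
# with a binary migrate / stay label.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(random_state=0))]:
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: accuracy = {acc:.3f}")
```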

Keywords: development, migration, internal migration, machine learning, prediction

Procedia PDF Downloads 257
22385 Temperament as a Success Determinant in Formative Assessment

Authors: George Fomunyam Kehdinga

Abstract:

Assessment is a vital part of the educational process, and formative assessment is a way of ensuring that higher education achieves the desired effects. Different factors influence how students perform in assessments in general, and in formative assessment in particular, and temperament is one such determining factor. This paper, a qualitative case study of four universities in four different countries, examines how the temperamental make-up of students either empowers them to perform excellently in formative assessment or incapacitates their performance. The four universities were chosen from Cameroon, South Africa, the United Kingdom, and the United States of America, and three students were chosen from each institution, six of whom were undergraduate students and six postgraduate students. Data in this paper were generated through qualitative interviews and document analyses, which were preceded by a temperament test. From the data generated, it was discovered that cholerics, who are natural leaders and hence do not struggle to express themselves, often perform excellently in formative assessment, while sanguines, who like cholerics are extroverts, perform relatively well. Phlegmatics and melancholics performed averagely and poorly, respectively, in formative assessment because they are naturally prone to fear and dislike such activities, preferring to keep to themselves. The paper therefore suggests that temperament is a success determinant in formative assessment. It also proposes that lecturers need an understanding of temperaments to be able to fully administer formative assessment in the lecture room. It further suggests that assessment should be balanced in the classroom so that some students, because of their temperamental make-up, are not naturally disadvantaged while others perform excellently. Lastly, the paper suggests that since formative assessment is a process of generating data, it should be contextualised or given an individualised approach so as to ensure that trustworthy data are generated.

Keywords: temperament, formative assessment, academic success, students

Procedia PDF Downloads 237
22384 Effect of Measured and Calculated Static Torque on Instantaneous Torque Profile of Switched Reluctance Motor

Authors: Ali Asghar Memon

Abstract:

The simulation modeling of a switched reluctance (SR) machine often relies on three data tables identified as the static characteristics: the flux linkage characteristics, the co-energy characteristics, and the static torque characteristics. It has been noticed from the literature that the static torque data used in simulation models are usually calculated rather than measured. This paper presents a simulation model that includes measured and calculated static torque data separately, to see their effect on the instantaneous torque profile of the machine. To the best of our knowledge, this is the first time that static torque derived from co-energy information and static torque measured directly from experiments have been used separately in the same model. This research is helpful for the accurate modeling of switched reluctance drives.
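
In such models the instantaneous torque is looked up from the static torque table at the present phase current and rotor position; a minimal sketch of that interpolation step, with a made-up table (swapping a measured table for a calculated one changes only the data, not the lookup):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Made-up static torque characteristic T(i, theta): rows are phase currents,
# columns are rotor angles, as stored in the simulation's data table.
currents = np.linspace(0, 10, 6)     # phase current [A]
angles = np.linspace(0, 30, 7)       # rotor position [mechanical degrees]
torque_table = np.outer(currents**2 / 100, np.sin(np.radians(angles * 6)))

torque_lookup = RegularGridInterpolator((currents, angles), torque_table)

# During simulation, the instantaneous torque at (i, theta) is interpolated
# from the table at every time step.
print(torque_lookup([[6.3, 12.5]]))  # torque at 6.3 A, 12.5 deg
```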

Keywords: static characteristics, current chopping, flux linkage characteristics, switched reluctance motor

Procedia PDF Downloads 278
22383 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, but with that increase, computation speed becomes the critical factor. Data reduction is one solution to this problem. Removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes; none of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct contains all the attributes from the core. In this paper, a hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is the reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: (i) it generates a superreduct instead of a reduct; (ii) the additional first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not a key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented on a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. The results show an increase in the speed of data processing.
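
In software form the two-stage algorithm is short; a minimal Python sketch under a toy decision table (the FPGA realizes the same steps with comparators, a singleton detector, and adder cascades; the frequency heuristic here is one simple reading of "most common attributes"):

```python
from itertools import combinations

# Toy decision table: each row is (condition attribute values, decision).
rows = [((0, 1, 1), 0), ((1, 1, 1), 1), ((0, 0, 1), 0), ((1, 0, 0), 1)]
n_attr = 3

def discerning(a, b):
    """Attributes whose values discern two objects."""
    return {i for i in range(n_attr) if a[0][i] != b[0][i]}

# Stage 1: the core collects attributes appearing as singleton entries of the
# discernibility matrix (the FPGA's 'singleton detector' step).
core = set()
for a, b in combinations(rows, 2):
    if a[1] != b[1]:
        d = discerning(a, b)
        if len(d) == 1:
            core |= d

# Stage 2: greedily enrich the core with the most frequent attributes until
# every pair of objects with different decisions is discerned (superreduct).
def all_discerned(attrs):
    return all(discerning(a, b) & attrs
               for a, b in combinations(rows, 2) if a[1] != b[1])

reduct = set(core)
freq = sorted(range(n_attr), key=lambda i: -sum(r[0][i] for r in rows))
for i in freq:
    if all_discerned(reduct):
        break
    reduct.add(i)

print("core:", core, "superreduct:", reduct)
```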

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 205
22382 The Causality between Corruption and Economic Growth in MENA Countries: A Dynamic Panel-Data Analysis

Authors: Nour Mohamad Fayad

Abstract:

The impact of corruption on economic growth is complex and extensively researched. Many experts believe that corruption reduces economic development; however, counterarguments have suggested that corruption either promotes growth and development or has no significant impact on economic performance. Clearly, there is no consensus in the economics literature regarding the possible relationship between corruption and economic development. Corruption's complex and clandestine nature, which makes it difficult to define and measure, is one of the obstacles that must be overcome when investigating its effect on an economy. In an attempt to contribute to the ongoing debate, this study examines the impact of corruption on economic growth in the Middle East and North Africa (MENA) region between 2000 and 2021, using a Customized Corruption Index (CCI) and panel data on MENA countries. These countries were selected because they are understudied in the economic literature, and despite the World Bank's recent emphasis on corruption in the developing world, the MENA countries have received little attention. The researcher used a Cobb-Douglas functional form to test corruption in MENA, using the CCI to track corruption over almost 20 years, and then applied dynamic panel-data methods. The findings indicate a positive correlation between corruption and economic growth, but this is not consistent across all MENA nations. The study faced several limitations. First, the relative lack of recent data from MENA nations; this issue is related to the inaccessibility of data for many MENA countries, particularly regarding returns on resources, private malfeasance, and other variables in the Gulf countries. In addition, the researcher encountered several practical restrictions, such as electricity and internet outages, being from Lebanon, a country whose citizens have endured difficult living conditions since the Lebanese crisis began in 2019. The study demonstrates a customized index, the Customized Corruption Index (CCI), that suits the characteristics of MENA countries and is tailored to measuring corruption in this region; the outcome of the CCI is then compared with the Corruption Perception Index (CPI) and the Control of Corruption (CC) indicator from the World Governance Indicators (WGI).
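
The abstract names a Cobb-Douglas functional form but does not spell it out; a plausible specification, with the corruption index entering as a shift term, is shown below. The choice of capital and labor as inputs and the log-linear estimating equation are assumptions for illustration, not the study's stated model.

```latex
% Illustrative Cobb-Douglas growth specification augmented with the
% customized corruption index (CCI); the variable choice is an assumption.
\[
Y_{it} = A\, K_{it}^{\alpha} L_{it}^{\beta}
         e^{\gamma\, \mathrm{CCI}_{it} + \varepsilon_{it}}
\quad\Longrightarrow\quad
\ln Y_{it} = \ln A + \alpha \ln K_{it} + \beta \ln L_{it}
           + \gamma\, \mathrm{CCI}_{it} + \varepsilon_{it}
\]
```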

Keywords: corruption, economic growth, corruption measurements, empirical review, impact of corruption

Procedia PDF Downloads 61
22381 Evaluation of Illegal Hunting of Red Deer and Conservation Policy of Department of Environment in Iran

Authors: Tahere Fazilat

Abstract:

The Caspian red deer or maral (Cervus elaphus maral) is the largest type of deer in Iran. In the past, maral lived in the northern forests of Iran, from the Caspian Sea coast and the Alborz mountain chain to the marginal oak forests of the Zagros, from Azerbaijan down to Fars province. However, their population in the northwest and west of Iran was completely destroyed, according to reports, about 50 years ago. In the present study, data were collected from 2004 to 2014 in the Hyrcanian forests of Mazandaran state by means of environment guards and the judiciary office of the Department of Environment of Mazandaran. In this process, all recorded cases of illegal red deer hunting, along with population census and estimation data, were analyzed, and the correlation among these data was assessed. We provide a first evaluation of how suitable these methods are by comparing the results with population estimates obtained using cohort analysis, and by analyzing the within-season variation in the number of deer seen. The data indicate the future of red deer in the northern forests of Iran and the outcomes of the Department of Environment's red deer conservation policy.

Keywords: illegal hunting, red deer, census, conservation

Procedia PDF Downloads 538
22380 Research on Straightening Process Model Based on Iteration and Self-Learning

Authors: Hong Lu, Xiong Xiao

Abstract:

Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when these parts are heat treated, so they need to be straightened to meet straightness requirements. In the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Based on this model, an iterative method is used to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise, because it can adapt to changes in material performance parameters. Considering that straightening is widely used in the mass production of shaft parts, a knowledge base is used to store data from the straightening process, and a straightening stroke algorithm based on empirical data is set up. This paper thus establishes a straightening process control model that combines the iteration-based straightening stroke method with the empirical-data-based straightening stroke algorithm. Finally, an experiment is designed to verify the straightening process control model.
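
The iterative solution can be sketched as a root-finding loop: given a measured initial deflection, the stroke is updated until the predicted residual deflection vanishes. The linear springback model below is a stand-in for the paper's actual load-deflection relationship.

```python
# Minimal sketch of the iterative straightening-stroke solution. The
# springback model (residual = initial deflection minus the permanent part
# of the stroke) is a placeholder for the paper's load-deflection model.
def residual_deflection(initial_deflection, stroke, springback=0.35):
    return initial_deflection - (1 - springback) * stroke

def solve_stroke(initial_deflection, tol=1e-4, max_iter=50):
    stroke = initial_deflection           # initial guess
    for _ in range(max_iter):
        r = residual_deflection(initial_deflection, stroke)
        if abs(r) < tol:                  # straightness requirement met
            break
        stroke += r                       # simple fixed-point update
    return stroke

print(f"required stroke: {solve_stroke(0.8):.4f} mm")
```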

Keywords: straightness, straightening stroke, deflection, shaft parts

Procedia PDF Downloads 315
22379 Effects of Elastic, Plyometric and Strength Training on Selected Anaerobic Factors in Sanandaj Elite Volleyball Players

Authors: Majed Zobairy, Fardin Kalvandi, Kamal Azizbaigi

Abstract:

This research was carried out to evaluate the effects of elastic, plyometric, and resistance training on selected anaerobic factors in male volleyball players. For this purpose, 30 elite volleyball players of Sanandaj city were randomly divided into three groups: elastic training, plyometric training, and resistance training. Pre-exercise tests, which included vertical jumping, 50-yard speed running, and the scat test, were performed and the data were recorded. A specific exercise protocol was then carried out for each group, after which the post-exercise tests were repeated. Data analysis showed significant improvements in the exercise tests in each group. One-way ANOVA showed that the increases in speed records in the elastic group were significantly higher than in the other groups (p<0.05). Based on these data, it seems that elastic training can be a useful method and a new approach to improving functional tests and training regimens.

Keywords: elastic training, plyometric training, strength training, anaerobic power

Procedia PDF Downloads 509
22378 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units

Authors: Mostafa Kazemi, Zahra N. Farkhani

Abstract:

This paper proposes an integrated Data Envelopment Analysis (DEA) and Variance Inflation Factor (VIF) model for measuring the technical efficiency of decision making units. The model is validated using a set of 69 sales representatives of dairy products. The analysis is done in two stages: in the first stage, the VIF technique is used to distinguish the independent effective factors of resellers, and in the second stage, DEA is used to measure efficiency under both constant and variable returns-to-scale assumptions. DEA is further used to examine the effect of environmental factors on efficiency. The results indicate an average managerial efficiency of 83% across the sales representatives of dairy products. In addition, technical and scale efficiency were found to be 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% of the sales representatives are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good condition for sales representative efficiency.
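
The first-stage VIF screening is a standard collinearity check; a minimal sketch using statsmodels over hypothetical input factors (one factor is deliberately built to be nearly collinear so its VIF flags it for removal before the DEA stage):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical candidate input factors for the resellers; x3 is constructed
# to be nearly collinear with x1.
rng = np.random.default_rng(0)
X = pd.DataFrame({"x1": rng.normal(size=100), "x2": rng.normal(size=100)})
X["x3"] = X["x1"] * 0.95 + rng.normal(scale=0.1, size=100)

vifs = {col: variance_inflation_factor(X.values, i) for i, col in enumerate(X)}
keep = [c for c, v in vifs.items() if v < 10]   # common rule-of-thumb cutoff
print(vifs)
print("factors passed to the DEA stage:", keep)
```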

Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives’ dairy products, variance inflation factor (VIF)

Procedia PDF Downloads 546
22377 Circular Polarized and Surface Compatible Microstrip Array Antenna Design for Image and Telemetric Data Transfer in UAV and Armed UAV Systems

Authors: Kübra Taşkıran, Bahattin Türetken

Abstract:

In this paper, a microstrip array antenna with circular polarization at a frequency of 2.4 GHz has been designed in order to provide image and telemetric data transmission in Unmanned Aerial Vehicle (UAV) and Armed Unmanned Aerial Vehicle (A-UAV) systems. In addition to the antenna design, a power divider was designed and the antennas were fed in phase. As a result of the analysis, it was observed that the antenna operates at a frequency of 2.4016 GHz with a directivity gain of 12.2 dBi. This array antenna was then transformed into a form compatible with the missile surface used in A-UAV systems, and further analyses were made. These analyses showed that the antenna operates on the surface of the missile at a frequency of 2.372 GHz with a directivity gain of 10.2 dBi.
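
For an array fed in phase, the broadside behaviour can be checked with the standard uniform-array factor; a minimal sketch assuming four elements at half-wavelength spacing (the actual element count and spacing are not stated in the abstract):

```python
import numpy as np

# Array factor of an N-element uniform linear array fed in phase:
# AF(theta) = sum_n exp(j * k * d * n * sin(theta)).
f = 2.4e9                      # design frequency [Hz]
c = 3e8                        # speed of light [m/s]
lam = c / f
k = 2 * np.pi / lam
d = lam / 2                    # assumed element spacing
N = 4                          # assumed element count

theta = np.radians(np.linspace(-90, 90, 721))
af = np.abs(sum(np.exp(1j * k * d * n * np.sin(theta)) for n in range(N)))

peak = theta[np.argmax(af)]
print(f"peak at {np.degrees(peak):.1f} deg, |AF|max = {af.max():.1f}")  # broadside, |AF| = N
```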

Keywords: microstrip array antenna, circular polarization, 2.4 GHz, image and telemetric data transmission, surface compatible, UAV and armed UAV

Procedia PDF Downloads 82
22376 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos that differ only in pregnancy outcome, i.e., embryos from a single clinic that are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see if the rich spatiotemporal information embedded in these data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology: We did a retrospective analysis of time-lapse data from our IVF clinic, which utilizes the EmbryoScope 100% of the time for embryo culture to the blastocyst stage, with known clinical outcomes, including live birth vs. nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single-transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined to form a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major Findings: The performance of the model was evaluated using 100 random subsampling cross-validations (train (80%) - test (20%)). The prediction accuracy, averaged across the 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy. Conclusion: The high accuracy in the main analysis and the low accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern that is associated with pregnancy outcome, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data that contribute to successful pregnancy outcomes, regardless of already known parameters. Results on a larger sample size, with complementary analyses on the prediction of other key outcomes such as embryo euploidy and aneuploidy, will be presented at the meeting.
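
A minimal sketch of the analysis pipeline: decompose a 4th-order embryo tensor (embryo x frame x height x width here, as an assumption; the study does not state the mode layout or decomposition type) with a CP decomposition from tensorly, then classify the per-embryo factors under random subsampling cross-validation. Data are random placeholders.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, ShuffleSplit

# Placeholder 4th-order tensor: 40 embryos x 20 frames x 16 x 16 pixels.
rng = np.random.default_rng(0)
X = tl.tensor(rng.normal(size=(40, 20, 16, 16)))
y = rng.integers(0, 2, size=40)            # live birth vs. nonpregnant labels

# CP decomposition; the factor matrix along the embryo mode serves as features.
weights, factors = parafac(X, rank=5, init='random', random_state=0)
embryo_features = factors[0]               # shape (40, 5)

# 100x random subsampling cross-validation (80% train / 20% test).
cv = ShuffleSplit(n_splits=100, test_size=0.2, random_state=0)
scores = cross_val_score(LogisticRegression(), embryo_features, y, cv=cv)
print(f"mean accuracy: {scores.mean():.2f}")   # ~0.5 here, since labels are random
```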

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 82
22375 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were drawn from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years at Imam Khomeini Hospital in Iran. In order to determine the factors affecting cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the exponential and Weibull distributions, were considered for the survival time. Data analysis was performed using R software, and an error level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, the mean survival over the seven-year follow-up was 28.44 months; for deceased patients it was 19.33 months and for censored patients 31.79 months. Exponential and Weibull models with a gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the factors affecting the time of death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.
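
The gamma frailty model referred to above has a standard formulation: the hazard for patient i is the baseline hazard scaled by the covariates and by an unobserved gamma-distributed frailty term. With a Weibull baseline (the exponential case is the special case ρ = 1), it reads:

```latex
% Standard gamma frailty hazard with a Weibull baseline; the notation
% follows common survival-analysis convention.
\[
h_i(t \mid z_i) = z_i \, h_0(t)\, \exp(\mathbf{x}_i^{\top}\boldsymbol{\beta}),
\qquad
z_i \sim \mathrm{Gamma}\!\left(\tfrac{1}{\theta}, \tfrac{1}{\theta}\right),
\quad \mathbb{E}[z_i] = 1,\ \mathrm{Var}(z_i) = \theta,
\]
\[
h_0(t) = \lambda \rho\, t^{\rho - 1}
\quad \text{(Weibull baseline; exponential when } \rho = 1\text{).}
\]
```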

Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution

Procedia PDF Downloads 252