Search results for: gender specific data
24825 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting
Authors: Yiannis G. Smirlis
Abstract:
The classification and the prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large scale problems or when new units frequently enter the under-assessment set. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by Interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores, without applying any DEA models.Keywords: data envelopment analysis, interval DEA, efficiency classification, efficiency prediction
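A minimal sketch of the grid idea described above, assuming a single input and a single output whose ranges are split into equal-width intervals; the cell construction, the dominance rule, and the example units are illustrative assumptions rather than the authors' exact Interval DEA formulation.

```python
import numpy as np

# Hypothetical units: (input, output); lower input and higher output are better.
units = {"A": (2.0, 9.0), "B": (5.0, 7.5), "C": (8.0, 3.0), "D": (3.5, 8.0)}

def cell_bounds(value, lo, hi, k):
    """Return the [lower, upper] bounds of the grid interval containing value."""
    width = (hi - lo) / k
    idx = min(int((value - lo) / width), k - 1)
    return lo + idx * width, lo + (idx + 1) * width

def dominates(cell_a, cell_b):
    """Cell A dominates cell B if A's worst case is at least as good as B's best case:
    A's input upper bound <= B's input lower bound and
    A's output lower bound >= B's output upper bound."""
    (a_in, a_out), (b_in, b_out) = cell_a, cell_b
    return a_in[1] <= b_in[0] and a_out[0] >= b_out[1]

k = 4                                   # number of intervals per dimension (assumed)
xs = [u[0] for u in units.values()]
ys = [u[1] for u in units.values()]
cells = {name: (cell_bounds(x, min(xs), max(xs), k),
                cell_bounds(y, min(ys), max(ys), k))
         for name, (x, y) in units.items()}

# A unit whose cell is dominated by some occupied cell can be classified as
# inefficient without running any DEA model; the rest remain candidates.
for name, cell in cells.items():
    dominated = any(dominates(other, cell)
                    for o, other in cells.items() if o != name)
    print(name, "pre-classified inefficient" if dominated else "needs DEA / candidate efficient")
```

New units entering the assessment set would simply be assigned to a cell and checked against the occupied cells, which is the pre-processing role described in the abstract.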
24824 Relations between the Internal Employment Conditions of International Organizations and the Characteristics of the National Civil Service
Authors: Renata Hrecska
Abstract:
This research seeks to fully examine the internal employment law of international organizations by comparing it with the characteristics of the national civil service. The aim of the research is to compare the legal system that has developed over many centuries and the relatively new internal staffing regulations to find out what solution schemes can help each other through mutual legal development in order to respond effectively to the social challenges of everyday life. Generally, the rules of civil service of any country or international entity have in common that they have, in their pragmatics inherently, the characteristic that makes them serving public interests. Though behind the common base there are many differences: there is the clear fragmentation of state regulation and the unity of organizational regulation. On the other hand, however, this difference disappears to some extent: the public service regulation of international organizations can be considered uniform until we examine it within, but not outside an organization. As soon as we compare the different organizations we may find many different solutions for staffing regulations. It is clear that the national civil service is a strong model for international organizations, but the question may be whether the staffing policy of international organizations can serve the national civil service as an example, too. In this respect, the easiest way to imagine a legislative environment would be to have a single comprehensive code, the general part of which is the Civil Service Act itself, and the specific part containing specific, necessarily differentiating rules for each layer of the civil service. Would it be advantageous to follow the footsteps of the leading international organizations, or is there any speciality in national level civil service that we cannot avoid during regulating processes? In addition to the above, the personal competencies of officials working in international organizations and public administrations also show a high degree of similarity, regardless of the type of employment. Thus, the whole public service system is characterized by the fundamental and special values that a person capable of holding a public office must be able to demonstrate, in some cases, even without special qualifications. It is also interesting how we can compare the two spheres of employment in light of the theory of Lawyer Louis Brandeis, a judge at the US Supreme Court, who formulated a complex theory of profession as distinguished from other occupations. From this point of view we can examine the continuous development of research and specialized knowledge at work; the community recognition and social status; that to what extent we can see a close-knit professional organization of altruistic philosophy; that how stability grows in the working conditions due to the stability of the profession; and that how the autonomy of the profession can prevail.Keywords: civil service, comparative law, international organizations, regulatory systems
24823 Mapping of Traffic Noise in Riyadh City-Saudi Arabia
Authors: Khaled A. Alsaif, Mosaad A. Foda
Abstract:
The present work aims at development of traffic noise maps for Riyadh City using the software Lima. Road traffic data were estimated or measured as accurate as possible in order to obtain consistent noise maps. The predicted noise levels at some selected sites are validated by actual field measurements, which are obtained by a system that consists of a sound level meter, a GPS receiver and a database to manage the measured data. The maps show that noise levels remain over 50 dBA and can exceed 70 dBA at the nearside of major roads and highways.Keywords: noise pollution, road traffic noise, LimA predictor, GPS
24822 The Introduction of a Tourniquet Checklist to Identify and Record Tourniquet Related Complications
Authors: Akash Soogumbur
Abstract:
Tourniquets are commonly used in orthopaedic surgery to provide hemostasis during procedures on the upper and lower limbs. However, there is a risk of complications associated with tourniquet use, such as nerve damage, skin necrosis, and compartment syndrome. The British Orthopaedic Association (BOAST) guidelines recommend the use of tourniquets at a pressure of 300 mmHg or less for a maximum of 2 hours. Research Aim: The aim of this study was to evaluate the effectiveness of a tourniquet checklist in improving compliance with the BOAST guidelines. Methodology: This was a retrospective study of all orthopaedic procedures performed at a single institution over a 12-month period. The study population included patients who had a tourniquet applied during surgery. Data were collected from the patients' medical records, including the duration of tourniquet use, the pressure used, and the method of exsanguination. Findings: The results showed that the use of the tourniquet checklist significantly improved compliance with the BOAST guidelines. Prior to the introduction of the checklist, compliance with the guidelines was 83% for the duration of tourniquet use and 73% for pressure used. After the introduction of the checklist, compliance increased to 100% for both duration of tourniquet use and pressure used. Theoretical Importance: The findings of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use. Data Collection: Data were collected from the patients' medical records. The data included the following information: Patient demographics, procedure performed, duration of tourniquet use, pressure used, method of exsanguination. Analysis Procedures: The data were analyzed using descriptive statistics. The compliance with the BOAST guidelines was calculated as the percentage of patients who met the guidelines for the duration of tourniquet use and pressure used. Question Addressed: The question addressed by this study was whether the use of a tourniquet checklist could improve compliance with the BOAST guidelines. Conclusion: The results of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use.Keywords: tourniquet, pressure, duration, complications, surgery
24821 The Effect of Styrene-Butadiene-Rubber (SBR) Polymer Modifier on Properties of Bitumen
Authors: Seyed Abbas Tabatabaei, Alireza Kiasat, Ferdows Karimi Alkouhi
Abstract:
In order to use bitumen in hot mix asphalt, it must have specific characteristics, and there are several methods to reach these properties. Using polymer modifiers is one of the methods to modify bitumen properties. In this paper, the effect of styrene-butadiene-rubber (SBR), one of the bitumen polymer modifiers, on the rheological properties of bitumen is studied. In this regard, the rheological properties of the base bitumen and of bitumen modified with 3, 4, and 5 percent SBR were analysed. The results show that the bitumen modified with 5 percent SBR performs best among the samples tested.
Keywords: bitumen, polymer modifier, styrene-butadiene-rubber, rheological properties
24820 Data Analysis for Taxonomy Prediction and Annotation of 16S rRNA Gene Sequences from Metagenome Data
Authors: Suchithra V., Shreedhanya, Kavya Menon, Vidya Niranjan
Abstract:
Skin metagenomics has a wide range of applications with direct relevance to the health of the organism. It gives us insight to the diverse community of microorganisms (the microbiome) harbored on the skin. In the recent years, it has become increasingly apparent that the interaction between skin microbiome and the human body plays a prominent role in immune system development, cancer development, disease pathology, and many other biological implications. Next Generation Sequencing has led to faster and better understanding of environmental organisms and their mutual interactions. This project is studying the human skin microbiome of different individuals having varied skin conditions. Bacterial 16S rRNA data of skin microbiome is downloaded from SRA toolkit provided by NCBI to perform metagenomics analysis. Twelve samples are selected with two controls, and 3 different categories, i.e., sex (male/female), skin type (moist/intermittently moist/sebaceous) and occlusion (occluded/intermittently occluded/exposed). Quality of the data is increased using Cutadapt, and its analysis is done using FastQC. USearch, a tool used to analyze an NGS data, provides a suitable platform to obtain taxonomy classification and abundance of bacteria from the metagenome data. The statistical tool used for analyzing the USearch result is METAGENassist. The results revealed that the top three abundant organisms found were: Prevotella, Corynebacterium, and Anaerococcus. Prevotella is known to be an infectious bacterium found on wound, tooth cavity, etc. Corynebacterium and Anaerococcus are opportunist bacteria responsible for skin odor. This result infers that Prevotella thrives easily in sebaceous skin conditions. Therefore it is better to undergo intermittently occluded treatment such as applying ointments, creams, etc. to treat wound for sebaceous skin type. Exposing the wound should be avoided as it leads to an increase in Prevotella abundance. Moist skin type individuals can opt for occluded or intermittently occluded treatment as they have shown to decrease the abundance of bacteria during treatment.Keywords: bacterial 16S rRNA , next generation sequencing, skin metagenomics, skin microbiome, taxonomy
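The abundance comparison reported above comes down to turning per-sample taxon counts into relative abundances and ranking genera. A minimal sketch, assuming a USEARCH-style genus-level count table is already available; the sample names and counts below are invented for illustration.

```python
import pandas as pd

# Hypothetical genus-level read counts per sample (rows: genera, columns: samples).
counts = pd.DataFrame(
    {"S1": [320, 210, 150, 40], "S2": [280, 260, 120, 55], "S3": [400, 190, 170, 30]},
    index=["Prevotella", "Corynebacterium", "Anaerococcus", "Staphylococcus"],
)

# Convert to relative abundance (% of reads per sample), then rank genera by mean abundance.
rel_abundance = counts.div(counts.sum(axis=0), axis=1) * 100
top_genera = rel_abundance.mean(axis=1).sort_values(ascending=False)
print(top_genera.head(3))   # the three most abundant genera across samples
```

The same table, grouped by the sample metadata (sex, skin type, occlusion), is what a statistical layer such as METAGENassist would then compare between categories.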
24819 Development of a Predictive Model to Prevent Financial Crisis
Authors: Tengqin Han
Abstract:
Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played one of the crucial roles in causing the most recent financial crisis in 2008. In each case, a delinquency is a sign that the borrower is unable to pay off the debt, and it may ultimately cause a loss of property. Individually, one case of delinquency seems unimportant compared to the entire credit system. In China, an emerging economic entity, national and economic strength have grown rapidly, and the gross domestic product (GDP) growth rate has remained as high as 8% in the past decades. However, potential risks exist behind the appearance of prosperity, and among these risks, the credit system is the most significant one. Due to the long term and large outstanding balance of mortgages, it is critical to monitor the risk during the performance period. In this project, about 300,000 mortgage account records are analyzed in order to develop a predictive model to predict the probability of delinquency. Through univariate analysis, the data are cleaned up, and through bivariate analysis, the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the 2005 analysis data are split into two parts, 60% for model development and 40% for in-time model validation. The KS of model development is 31, and the KS for in-time validation is 31, indicating that the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data, with a KS of 33. This indicates the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, consumer price index, unemployment rate, inflation rate, etc. The data from 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), KS is increased from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
Keywords: delinquency, mortgage, model development, model validation
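The KS values quoted above measure the maximum separation between the cumulative score distributions of delinquent and non-delinquent accounts. A minimal sketch of that computation, assuming a probability-of-delinquency score and a binary delinquency flag; the toy data are placeholders.

```python
import numpy as np

def ks_statistic(score, is_delinquent):
    """Kolmogorov-Smirnov separation (0-100) between the cumulative score
    distributions of delinquent and non-delinquent accounts."""
    order = np.argsort(score)
    bad = np.asarray(is_delinquent, dtype=float)[order]
    good = 1.0 - bad
    cum_bad = np.cumsum(bad) / bad.sum()
    cum_good = np.cumsum(good) / good.sum()
    return 100.0 * np.max(np.abs(cum_bad - cum_good))

# Toy example: delinquent accounts concentrate at higher scores.
rng = np.random.default_rng(0)
score = np.concatenate([rng.normal(0.3, 0.1, 9000), rng.normal(0.6, 0.1, 1000)])
flag = np.concatenate([np.zeros(9000), np.ones(1000)])
print(round(ks_statistic(score, flag), 1))
```

Applying the same function to the development, in-time, and out-of-time samples is what allows the stability comparison reported in the abstract.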
24818 A Scheme Cooperating with Cryptography to Enhance Security in Satellite Communications
Authors: Chieh-Fu Chang, Wan-Hsin Hsieh
Abstract:
We have proposed a novel scheme— iterative word-extension (IWE) to enhance the cliff effect of Reed-Solomon codes regarding the error performance at a specific Eb/N0. The scheme can be readily extended to block codes and the important properties of IWE are further investigated here. In order to select proper block codes specifying the desired cliff Eb/N0, the associated features of IWE are explored. These properties and features grant IWE ability to enhance security regarding the received Eb/N0 in physical layer so that IWE scheme can cooperate with the traditional presentation layer approach — cryptography, to meet the secure requirements in diverse applications. The features and feasibility of IWE scheme in satellite communication are finally discussed.Keywords: security, IWE, cliff effect, space communications
24817 Self-Supervised Learning for Hate-Speech Identification
Authors: Shrabani Ghosh
Abstract:
Automatic offensive language detection in social media has become a pressing task in today's NLP. Manual offensive language detection is tedious and laborious work, so automatic methods based on machine learning are the only practical alternatives. Previous works have performed sentiment analysis over social media in different ways, such as in supervised, semi-supervised, and unsupervised manners. Domain adaptation in a semi-supervised way has also been explored in NLP, where the source domain and the target domain are different. In domain adaptation, the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers such as BERT and RoBERTa are further pre-trained on masked language modeling (MLM) tasks and fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free-speech platform described as hosting extremists of varying degrees in online social media. In the domain adaptation process, Twitter data is used as the source domain, and Gab data is used as the target domain. The performance of domain adaptation also depends on the cross-domain similarity. Different distance measures, such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL, have been used to estimate domain similarity. Certainly, in-domain distances are small, and between-domain distances are expected to be large. Previous findings show that a pretrained masked language model (MLM) fine-tuned on a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, while the out-of-domain performance of the hate classifier on Gab data goes down to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce. A few works have already explored applying self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs. A self-supervised attention learning approach shows better performance as it exploits extracted context words in the training process. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. This framework initially classifies the Gab dataset in an attention-based self-supervised manner. In the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier and also with optimized outcomes obtained from different optimization techniques.
Keywords: attention learning, language model, offensive language detection, self-supervised learning
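Two of the domain-similarity measures mentioned above are easy to state concretely. A minimal sketch, assuming fixed-length sentence embeddings (for example, mean-pooled transformer outputs) for Twitter (source) and Gab (target) posts; the embeddings here are random placeholders, not real data.

```python
import numpy as np

def cosine_distance(u, v):
    """1 - cosine similarity between two mean domain embeddings."""
    return 1.0 - (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def mmd2_linear(X, Y):
    """Squared Maximum Mean Discrepancy with a linear kernel, which reduces
    to the squared Euclidean distance between the two domain means."""
    diff = X.mean(axis=0) - Y.mean(axis=0)
    return float(diff @ diff)

rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, size=(500, 768))   # placeholder Twitter embeddings
target = rng.normal(0.2, 1.0, size=(500, 768))   # placeholder Gab embeddings

print("cosine distance:", round(cosine_distance(source.mean(axis=0), target.mean(axis=0)), 4))
print("linear-kernel MMD^2:", round(mmd2_linear(source, target), 4))
```

In line with the abstract, the same measures computed within a single domain should come out small, while cross-domain values are expected to be larger.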
24816 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark
Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos
Abstract:
This paper presents an investigation of the performance impacts regarding the variation of five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between the factors and performance and to develop linear predictor models for time and cost. Methods: The solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results, such as the small influence of cores and the neutrality of memory and disks with respect to total execution time, and the non-significant impact of data input scale on costs, although it notably impacts the execution time.
Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark
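A minimal sketch of the screening step described above: the five factors coded at two levels (-1/+1) in a small fraction of the full 2^5 design, with a main-effects linear model fitted by least squares. The factor names follow the paper, but the runs and response values are invented for illustration.

```python
import numpy as np

# Coded design matrix: columns = data size, nodes, cores, memory, disks (levels -1/+1);
# a small orthogonal fraction of the full 2^5 design, with a hypothetical execution time (s).
runs = np.array([
    [-1, -1, -1, -1, +1],
    [+1, -1, -1, +1, -1],
    [-1, +1, -1, +1, +1],
    [+1, +1, -1, -1, -1],
    [-1, -1, +1, +1, -1],
    [+1, -1, +1, -1, +1],
    [-1, +1, +1, -1, -1],
    [+1, +1, +1, +1, +1],
], dtype=float)
time_s = np.array([420.0, 790.0, 250.0, 470.0, 400.0, 760.0, 240.0, 455.0])

X = np.column_stack([np.ones(len(runs)), runs])        # intercept + main effects
coef, *_ = np.linalg.lstsq(X, time_s, rcond=None)

for name, c in zip(["intercept", "data size", "nodes", "cores", "memory", "disks"], coef):
    print(f"{name:10s} coefficient ≈ {c:8.1f} s")
# Large |coefficient| -> influential factor; near-zero coefficients (here memory and
# disks) mirror the kind of "neutral factor" result reported in the abstract.
```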
24815 The Developing of Teaching Materials Online for Students in Thailand
Authors: Pitimanus Bunlue
Abstract:
The objectives of this study were to identify the unique characteristics of the Salaya Old Market, Phutthamonthon, Nakhon Pathom, and to develop effective video media to promote homeland awareness among local people. The characteristic features of this community were collectively summarized based on historical data, community observation, and interviews with local people. The acquired data were used to develop a video describing the prominent features of the community. The quality of the video was later assessed by interviewing local people in the old market in terms of content accuracy, video and narration quality, and the sense of homeland awareness after watching it. The result was a 6-minute video containing historical data and the outstanding features of this community. Based on the interviews, the content accuracy was good, and the picture quality and narration were very good. Most people also developed a sense of homeland awareness after watching the video.
Keywords: audio-visual, creating homeland awareness, Phutthamonthon Nakhon Pathom, research and development
24814 A Decision Support System for the Detection of Illicit Substance Production Sites
Authors: Krystian Chachula, Robert Nowak
Abstract:
Manufacturing home-made explosives and synthetic drugs is an increasing problem in Europe. To combat that, a data fusion system is proposed for the detection and localization of production sites in urban environments. The data consists of measurements of properties of wastewater performed by various sensors installed in a sewage network. A four-stage fusion strategy allows detecting sources of waste products from known chemical reactions. First, suspicious measurements are used to compute the amount and position of discharged compounds. Then, this information is propagated through the sewage network to account for missing sensors. The next step is clustering and the formation of tracks. Eventually, tracks are used to reconstruct discharge events. Sensor measurements are simulated by a subsystem based on real-world data. In this paper, different discharge scenarios are considered to show how the parameters of used algorithms affect the effectiveness of the proposed system. This research is a part of the SYSTEM project (SYnergy of integrated Sensors and Technologies for urban sEcured environMent).Keywords: continuous monitoring, information fusion and sensors, internet of things, multisensor fusion
24813 Implementation of CNV-CH Algorithm Using Map-Reduce Approach
Authors: Aishik Deb, Rituparna Sinha
Abstract:
We have developed an algorithm to detect abnormal segments/structural variations in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm in which abnormal segments are detected. This algorithm aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, so the huge volume of sequence information gives rise to the need for big data and parallel approaches to segmentation. Therefore, we have designed a map-reduce approach for the existing CNV-CH algorithm, with which a large amount of sequence data can be segmented and structural variations in the human genome can be detected. We have compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score. The advantages of using our algorithm are that it is fast and has better accuracy. This algorithm can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders such as cancer. The defects may be caused by new mutations or changes to the DNA and generally result in abnormally high or low base coverage and quantification values.
Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing
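A minimal sketch of the map-reduce decomposition described above, simulated in-process: mappers emit (bin, count) pairs from read positions, reducers aggregate per-bin coverage, and a final pass merges adjacent bins with unusually high or low coverage into candidate segments. The bin size, threshold, and toy reads are illustrative assumptions; the actual CNV-CH step uses convex-hull segmentation rather than the z-score rule shown here.

```python
from itertools import groupby
import statistics

BIN = 1000  # bin width in bp (illustrative)

def mapper(read_positions):
    """Emit (bin_index, 1) for each aligned read start (as parsed from a BAM file)."""
    for pos in read_positions:
        yield pos // BIN, 1

def reducer(pairs):
    """Aggregate counts per bin (what the shuffle/reduce phase would do)."""
    pairs = sorted(pairs)
    return {b: sum(v for _, v in grp) for b, grp in groupby(pairs, key=lambda kv: kv[0])}

def segments(coverage, z=1.5):
    """Merge adjacent bins whose coverage deviates strongly from the sample mean."""
    mean = statistics.mean(coverage.values())
    sd = statistics.pstdev(coverage.values()) or 1.0
    flagged = sorted(b for b, c in coverage.items() if abs(c - mean) / sd > z)
    segs, start = [], None
    for b in flagged:
        if start is None:
            start = prev = b
        elif b == prev + 1:
            prev = b
        else:
            segs.append((start, prev)); start = prev = b
    if start is not None:
        segs.append((start, prev))
    return segs

# Toy read positions with an over-covered region around bins 5-6.
reads = list(range(0, 10_000, 50)) + list(range(5_000, 7_000, 5))
cov = reducer(mapper(reads))
print(segments(cov))
```

In a real deployment, mapper and reducer would run as separate tasks over chromosome chunks, which is what lets the segmentation scale to whole-genome data.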
24812 FAM111B Gene Dysregulation Contributes to the Malignancy in Fibrosarcoma, Poor Clinical Outcomes in POIKTMP and a Low-Cost Method for Its Mutation Screening
Authors: Cenza Rhoda, Falone Sunda, Elvis Kidzeru, Nonhlanhla P. Khumalo, Afolake Arowolo
Abstract:
Introduction: The human FAM111B gene mutations are associated with POIKTMP, a rare multi-organ fibrosing disease. Recent studies also reported the overexpression of FAM111B in specific cancers. However, the role of FAM111B in these pathologies, particularly fibrosarcoma, remains unknown. Materials and Methods: FAM111B RNA expression in some cancer cell lines was assessed in silico and validated in vitro in these cell lines and skin fibroblasts derived from the South African family member affected by POIKTMP with the heterozygous FAM111B gene mutation: NM_198947.4: c.1861T>G (p. Tyr621Asp or Y621D) by qPCR and western blot. The cellular function of FAM111B was also studied in HT1080 using various cell-based functional assays and a simple and cost-effective PCR-RFLP method for genotyping/screening FAM111B gene mutations described. Results: Expression studies showed upregulated FAM111B mRNA and protein in the cancer cells. High FAM111B expression with robust nuclear localization occurred in HT1080. Additionally, expression data and cell-based assays indicated that FAM111B led to the upregulation of cell migration and decreased cell apoptosis and cell proliferation modulation. FAM111B Y621D mutation showed similar effects on cell migration but minimal impact on cell apoptosis. FAM111B mRNA and protein expression were markedly downregulated (p ≤ 0.05) in the patient's skin-derived fibroblasts. Lastly, the PCR-RFLP method successfully genotyped FAM111B Y621D gene mutation. Discussion: FAM111B is a cancer-associated nuclear protein: Its modulation by mutations may enhance cell migration and proliferation and decrease apoptosis, as seen in cancers and POIKTMP/fibrosis, thus representing a viable therapeutic target in these disorders. Furthermore, the PCR-RFLP method could prove a valuable tool for FAM111B mutation validation or screening in resource-constrained laboratories.Keywords: FAM111B, POIKTMP, cancer, fibrosis, PCR-RFLP
24811 The Impact of Civil Disobedience on Tourist and Local Residents in Cameroon: Case Study the North West Region
Authors: Zita Fomukong Andam
Abstract:
Civil disobedience, according to John Rawls (1971), is a public, nonviolent, and conscientious breach of laws undertaken with the aim of bringing about a change in government laws and policies. Thus, individuals who engage in such an act are aware of and ready to accept the consequences of their actions. Cameroon, more precisely the Northwest and Southwest regions, which form the English-speaking part of the country, is one of the societies facing this act of civil disobedience. It has been a tormenting issue in the country, affecting its economy and the tourism sector. These regions, home to some of the best tourist sites in the country, are no longer considered a destination to be visited by tourists because of insecurity. Many commercial buildings have been burned down, leaving many young Cameroonians jobless. Education has been hindered, and youths are forced to relocate to nearby cities in order to continue their education. This crisis has created a lot of insecurity throughout the regions; thus, youths now share one common interest: to travel abroad, either to seek refuge, to continue their education, or to search for jobs. The purpose of this research is to assess the issue of civil disobedience and to understand why it affects only a specific region in a country while the others are doing fine. A deep research discourse was conducted with randomly selected individuals aged between 15 and 40 years living both in the destination and abroad. Survey questionnaires and interviews were carried out as methods to collect data. The results show that this crisis has impacted local residents psychologically and has instilled a lot of fear in tourists, who are no longer willing to visit the destination. In addition, it has had a negative impact on the country's economy, since tourism is considered a key sector of a country's economy. On the other hand, the results showed that many local residents have remained jobless, others have lost family members, and daily routine life has been affected. Understanding these results, the national government and international bodies might be able to propose possible and efficient solutions in order to attain stability and security in this region.
Keywords: civil disobedience, economic impact, local residents, tourists
24810 Leveraging Remote Sensing Information for Drought Disaster Risk Management
Authors: Israel Ropo Orimoloye, Johanes A. Belle, Olusola Adeyemi, Olusola O. Ololade
Abstract:
With more than 100,000 orbits during the past 20 years, Terra has significantly improved our knowledge of the Earth's climate and its implications on societies and ecosystems of human activity and natural disasters, including drought events. With Terra instrument's performance and the free distribution of its products, this study utilised Terra MOD13Q1 satellite data to assess drought disaster events and its spatiotemporal patterns over the Free State Province of South Africa between 2001 and 2019 for summer, autumn, winter, and spring seasons. The study also used high-resolution downscaled climate change projections under three representative concentration pathways (RCP). Three future periods comprising the short (the 2030s), medium (2040s), and long term (2050s) compared to the current period are analysed to understand the potential magnitude of projected climate change-related drought. The study revealed that the year 2001 and 2016 witnessed extreme drought conditions where the drought index is between 0 and 20% across the entire province during summer, while the year 2003, 2004, 2007, and 2015 observed severe drought conditions across the region with variation from one part to the another. The result shows that from -24.5 to -25.5 latitude, the area witnessed a decrease in precipitation (80 to 120mm) across the time slice and an increase in the latitude -26° to -28° S for summer seasons, which is more prominent in the year 2041 to 2050. This study emphasizes the strong spatio-environmental impacts within the province and highlights the associated factors that characterise high drought stress risk, especially on the environment and ecosystems. This study contributes to a disaster risk framework to identify areas for specific research and adaptation activities on drought disaster risk and for environmental planning in the study area, which is characterised by both rural and urban contexts, to address climate change-related drought impacts.Keywords: remote sensing, drought disaster, climate scenario, assessment
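A drought index scaled between 0 and 100%, as referenced above (values between 0 and 20% indicating extreme drought), can be derived from an NDVI time series as the Vegetation Condition Index (VCI). A minimal sketch, assuming a stack of MOD13Q1 NDVI composites is already loaded as a NumPy array; whether the study used the VCI specifically is an assumption, and the toy stack is a placeholder.

```python
import numpy as np

def vegetation_condition_index(ndvi_stack):
    """VCI (%) per pixel and date: 100 * (NDVI - NDVImin) / (NDVImax - NDVImin),
    where min/max are taken over the historical record at each pixel."""
    ndvi_min = ndvi_stack.min(axis=0)
    ndvi_max = ndvi_stack.max(axis=0)
    span = np.where(ndvi_max > ndvi_min, ndvi_max - ndvi_min, np.nan)
    return 100.0 * (ndvi_stack - ndvi_min) / span

# Toy stack: 10 dates of a 4x4 NDVI scene (values roughly in the vegetated range).
rng = np.random.default_rng(1)
stack = rng.uniform(0.1, 0.8, size=(10, 4, 4))
vci = vegetation_condition_index(stack)
extreme_drought = vci[-1] < 20.0          # pixels in extreme drought on the latest date
print(extreme_drought.sum(), "of", extreme_drought.size, "pixels below 20% VCI")
```

Mapping the per-pixel classes by season is what produces the spatiotemporal drought patterns the abstract reports for the Free State Province.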
24809 A Model to Assess Sustainability Using Multi-Criteria Analysis and Geographic Information Systems: A Case Study
Authors: Antonio Boggia, Luisa Paolotti, Gianluca Massei, Lucia Rocchi, Elaine Pace, Maria Attard
Abstract:
The aim of this paper is to present a methodology and a computer model for sustainability assessment based on the integration of Multi-criteria Decision Analysis (MCDA) with a Geographic Information System (GIS). It presents the result of a study for the implementation of a model for measuring sustainability to address the policy actions for the improvement of sustainability at territory level. The aim is to rank areas in order to understand the specific technical and/or financial support that is required to develop sustainable growth. Assessing sustainable development is a multidimensional problem: economic, social and environmental aspects have to be taken into account at the same time. The tool for a multidimensional representation is a proper set of indicators. The set of indicators must be integrated into a model, that is an assessment methodology, to be used for measuring sustainability. The model, developed by the Environmental Laboratory of the University of Perugia, is called GeoUmbriaSUIT. It is a calculation procedure developed as a plugin working in the open-source GIS software QuantumGIS. The multi-criteria method used within GeoUmbriaSUIT is the algorithm TOPSIS (Technique for Order Preference by Similarity to Ideal Design), which defines a ranking based on the distance from the worst point and the closeness to an ideal point, for each of the criteria used. For the sustainability assessment procedure, GeoUmbriaSUIT uses a geographic vector file where the graphic data represent the study area and the single evaluation units within it (the alternatives, e.g. the regions of a country, or the municipalities of a region), while the alphanumeric data (attribute table), describe the environmental, economic and social aspects related to the evaluation units by means of a set of indicators (criteria). The use of the algorithm available in the plugin allows to treat individually the indicators representing the three dimensions of sustainability, and to compute three different indices: environmental index, economic index and social index. The graphic output of the model allows for an integrated assessment of the three dimensions, avoiding aggregation. The presence of separate indices and graphic output make GeoUmbriaSUIT a readable and transparent tool, since it doesn’t produce an aggregate index of sustainability as final result of the calculations, which is often cryptic and difficult to interpret. In addition, it is possible to develop a “back analysis”, able to explain the positions obtained by the alternatives in the ranking, based on the criteria used. The case study presented is an assessment of the level of sustainability in the six regions of Malta, an island state in the middle of the Mediterranean Sea and the southernmost member of the European Union. The results show that the integration of MCDA-GIS is an adequate approach for sustainability assessment. In particular, the implemented model is able to provide easy to understand results. This is a very important condition for a sound decision support tool, since most of the time decision makers are not experts and need understandable output. In addition, the evaluation path is traceable and transparent.Keywords: GIS, multi-criteria analysis, sustainability assessment, sustainable development
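A minimal sketch of the TOPSIS step that GeoUmbriaSUIT applies within each sustainability dimension: vector normalisation, weighting, distances to the ideal and anti-ideal (worst) points, and a closeness score used for ranking. The indicator values, weights, and benefit/cost flags below are invented; the plugin's exact options may differ.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on criteria (columns).
    benefit[j] is True if higher values of criterion j are better."""
    X = np.asarray(matrix, dtype=float)
    W = np.asarray(weights, dtype=float) / np.sum(weights)
    N = X / np.linalg.norm(X, axis=0)           # vector normalisation
    V = N * W                                   # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)  # distance to the ideal point
    d_minus = np.linalg.norm(V - anti, axis=1)  # distance to the worst point
    return d_minus / (d_plus + d_minus)         # closeness: 1 = best, 0 = worst

# Hypothetical environmental indicators for three evaluation units (e.g. regions):
# protected area share (benefit), waste per capita (cost), emissions (cost).
indicators = [[0.62, 41.0, 12.0],
              [0.45, 30.0, 18.0],
              [0.70, 55.0, 9.0]]
closeness = topsis(indicators, weights=[0.4, 0.3, 0.3], benefit=[True, False, False])
print(np.argsort(-closeness) + 1)   # ranking of the evaluation units, best first
```

Running the same routine three times, once per set of environmental, economic, and social indicators, yields the three separate indices that keep the graphic output readable and avoid a single aggregate score.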
24808 Temperament as a Success Determinant in Formative Assessment
Authors: George Fomunyam Kehdinga
Abstract:
Assessment is a vital part of the educational process, and formative assessment is a way of ensuring that higher education achieves the desired effects. Different factors influence how students perform in assessments in general and in formative assessment in particular, and temperament is one such determining factor. This paper, a qualitative case study of four universities in four different countries, examines how the temperamental make-up of students either empowers them to perform excellently in formative assessment or undermines their performance. The four universities were chosen from Cameroon, South Africa, the United Kingdom, and the United States of America, and three students were chosen from each institution, six of whom were undergraduate students and six postgraduate students. Data were generated through qualitative interviews and document analyses, preceded by a temperament test. From the data generated, it was discovered that cholerics, who are natural leaders and hence do not struggle to express themselves, often perform excellently in formative assessment, while sanguines, who are also extroverts like cholerics, perform relatively well. Phlegmatics and melancholics performed averagely and poorly, respectively, in formative assessment because they are naturally prone to fear and dislike such activities, preferring to keep to themselves. The paper, therefore, suggests that temperament is a success determinant in formative assessment. It also proposes that lecturers need an understanding of temperaments to be able to fully administer formative assessment in the lecture room. It further suggests that assessment should be balanced in the classroom so that some students are not naturally disadvantaged because of their temperamental make-up while others perform excellently. Lastly, the paper suggests that since formative assessment is a process of generating data, it should be contextualised or given an individualised approach so as to ensure that trustworthy data are generated.
Keywords: temperament, formative assessment, academic success, students
24807 Effect of Measured and Calculated Static Torque on Instantaneous Torque Profile of Switched Reluctance Motor
Authors: Ali Asghar Memon
Abstract:
The simulation modeling of a switched reluctance (SR) machine often relies on three data tables identified as static characteristics: flux linkage characteristics, co-energy characteristics, and static torque characteristics. It has been noticed that, as far as the literature is concerned, the static torque data used in simulation models are often calculated. This paper presents a simulation model that includes measured and calculated static torque data separately to see their effect on the instantaneous torque profile of the machine. This is probably the first time, as far as the literature review is concerned, that static torque obtained from co-energy information and static torque measured directly from experiments are used separately in the model. This research is helpful for accurate modeling of switched reluctance drives.
Keywords: static characteristics, current chopping, flux linkage characteristics, switched reluctance motor
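The calculated static torque referred to above comes from the co-energy table: W'(θ, i) is the integral of flux linkage over current at fixed rotor position, and the static torque is its partial derivative with respect to position at fixed current. A minimal numerical sketch, assuming a flux-linkage table λ(θ, i) on a regular grid; the analytic λ used here is a placeholder, not machine data.

```python
import numpy as np

# Grids of rotor position (rad) and phase current (A); placeholder flux-linkage surface.
theta = np.linspace(0.0, np.pi / 6, 61)               # one rotor-pole pitch (assumed)
current = np.linspace(0.0, 20.0, 41)
TH, I = np.meshgrid(theta, current, indexing="ij")
flux_linkage = (0.05 + 0.04 * np.sin(6 * TH)) * I     # λ(θ, i), purely illustrative

# Co-energy W'(θ, i) = ∫_0^i λ(θ, i') di'  (cumulative trapezoidal integration over current).
dI = current[1] - current[0]
coenergy = np.concatenate(
    [np.zeros((len(theta), 1)),
     np.cumsum(0.5 * (flux_linkage[:, 1:] + flux_linkage[:, :-1]) * dI, axis=1)],
    axis=1,
)

# Static torque T(θ, i) = ∂W'/∂θ at constant current (numerical gradient along θ).
static_torque = np.gradient(coenergy, theta, axis=0)
print(static_torque.shape, "max torque ≈", round(float(static_torque.max()), 2), "N·m")
```

Replacing this calculated table with one measured on a test rig, as the paper does, is then only a matter of swapping the array fed into the drive simulation.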
24806 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in the intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of utilization. In all these fields, the amount of the collected data is increasing quickly, but with the increase of the data, the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in the rough sets can be achieved with the reduct. A lot of algorithms of generating the reduct were developed, but most of them are only software implementations, therefore have many limitations. Microprocessor uses the fixed word length, consumes a lot of time for either fetching as well as processing of the instruction and data; consequently, the software based implementations are relatively slow. Hardware systems don’t have these limitations and can process the data faster than a software. Reduct is the subset of the decision attributes that provides the discernibility of the objects. For the given decision table there can be more than one reduct. Core is the set of all indispensable condition attributes. None of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct consists of all the attributes from the core. In this paper, the hardware implementation of the two-stage greedy algorithm to find the one reduct is presented. The decision table is used as an input. Output of the algorithm is the superreduct which is the reduct with some additional removable attributes. First stage of the algorithm is calculating the core using the discernibility matrix. Second stage is generating the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. Described above algorithm has two disadvantages: i) generating the superreduct instead of reduct, ii) additional first stage may be unnecessary if the core is empty. But for the systems focused on the fast computation of the reduct the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block, and thus add respectively little time to the whole process. Algorithm presented in this paper was implemented in Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by the comparators connected to the block called 'singleton detector', which detects if the input word contains only single 'one'. Calculating the number of occurrences of the attribute is performed in the combinational block made up of the cascade of the adders. The superreduct generation process is iterative and thus needs the sequential circuit for controlling the calculations. For the research purpose, the algorithm was also implemented in C language and run on a PC. The times of execution of the reduct calculation in a hardware and software were considered. Results show increase in the speed of data processing.Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
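A software counterpart of the two-stage algorithm described above, written as a minimal Python sketch: stage one extracts the core from the singleton entries of the discernibility matrix, and stage two greedily enriches the core with frequently occurring attributes until every pair of objects with different decisions is discerned. The toy decision table and the exact frequency heuristic are illustrative assumptions, not the FPGA design itself.

```python
from collections import Counter
from itertools import combinations

# Toy decision table: each row = (condition attribute values, decision class).
table = [((0, 0, 0), 0), ((1, 0, 0), 1), ((0, 1, 1), 1), ((1, 1, 1), 0)]
n_attr = 3

# Discernibility matrix entries: for every pair of objects with different
# decisions, the set of condition attributes on which they differ.
entries = [
    {a for a in range(n_attr) if x[a] != y[a]}
    for (x, dx), (y, dy) in combinations(table, 2)
    if dx != dy
]

# Stage 1: the core is the union of all singleton entries (indispensable attributes).
core = set().union(*(e for e in entries if len(e) == 1))

# Stage 2: greedily enrich the core with the attribute occurring most often in the
# entries not yet covered, until every pair with different decisions is discerned
# (assumes a consistent decision table, i.e. no empty entries).
superreduct = set(core)
while not all(e & superreduct for e in entries):
    uncovered = [e for e in entries if not e & superreduct]
    freq = Counter(a for e in uncovered for a in e)
    superreduct.add(freq.most_common(1)[0][0])

print("core:", core, "superreduct:", superreduct)
```

In the hardware version, stage one maps to comparators feeding the singleton detector and stage two to the adder cascade plus the sequential control logic, so each step above corresponds to a combinational or sequential block rather than a loop.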
24805 Digitally Mapping Aboriginal Journey Ways
Authors: Paul Longley Arthur
Abstract:
This paper reports on an Australian Research Council-funded project utilising the Australian digital research infrastructure the ‘Time-Layered Cultural Map of Australia’ (TLCMap) (https://www.tlcmap.org/) [1]. This resource has been developed to help researchers create digital maps from cultural, textual, and historical data, layered with datasets registered on the platform. TLCMap is a set of online tools that allows humanities researchers to compile humanities data using spatio-temporal coordinates – to upload, gather, analyse and visualise data. It is the only purpose-designed, Australian-developed research tool for humanities and social science researchers to identify geographical clusters and parallel journeys by sight. This presentation discusses a series of Aboriginal mapping and visualisation experiments using TLCMap to show how Indigenous knowledge can reconfigure contemporary understandings of space including the urbanised landscape [2, 3]. The research data being generated – investigating the historical movements of Aboriginal people, the distribution of networks, and their relation to land – lends itself to mapping and geo-spatial visualisation and analysis. TLCMap allows researchers to create layers on a 3D map which pinpoint locations with accompanying information, and this has enabled our research team to plot out traditional historical journeys undertaken by Aboriginal people as well as to compile a gazetteer of Aboriginal place names, many of which have largely been undocumented until now [4]. The documented journeys intersect with and overlay many of today’s urban formations including main roads, municipal boundaries, and state borders. The paper questions how such data can be incorporated into a more culturally and ethically responsive understanding of contemporary urban spaces and as well as natural environments [5].Keywords: spatio-temporal mapping, visualisation, Indigenous knowledge, mobility and migration, research infrastructure
24804 Using Lesson-Based Discussion to Improve Teaching Quality: A Case of Chinese Mathematics Teachers
Authors: Jian Wang
Abstract:
Teachers' lesson-based discussions are presumed to be central to their effective learning to teach. Whether and to what extent such discussions offer opportunities for teachers to learn to teach effectively is worth a careful empirical examination. This study examines this assumption by drawing on lesson-based discussions and relevant curriculum materials from Chinese teachers in three urban schools. Their lesson-based discussions consistently focused on pedagogical content knowledge and offered specific and reasoned suggestions for teachers to refine their teaching practices. The mandated curriculum and their working language mediated their lesson-based discussions.
Keywords: Chinese teachers, curriculum materials, lesson discussion, mathematics instruction
24803 The Causality between Corruption and Economic Growth in MENA Countries: A Dynamic Panel-Data Analysis
Authors: Nour Mohamad Fayad
Abstract:
The impact of corruption on economic growth is complex and has been extensively researched. Many experts believe that corruption reduces economic development. However, counterarguments have suggested that corruption either promotes growth and development or has no significant impact on economic performance. Clearly, there is no consensus in the economics literature regarding the possible relationship between corruption and economic development. Corruption's complex and clandestine nature, which makes it difficult to define and measure, is one of the obstacles that must be overcome when investigating its effect on an economy. In an attempt to contribute to the ongoing debate, this study examines the impact of corruption on economic growth in the Middle East and North Africa (MENA) region between 2000 and 2021 using a Customized Corruption Index (CCI) and panel data on MENA countries. These countries were selected because they are understudied in the economic literature, and despite the World Bank's recent emphasis on corruption in the developing world, the MENA countries have received little attention. The researcher used a Cobb-Douglas functional form to test corruption in MENA, with the CCI tracking corruption over almost 20 years, and then applied dynamic panel-data methods. The findings indicate that there is a positive correlation between corruption and economic growth, but this is not consistent across all MENA nations. The study faced several limitations. First, there is a relative lack of recent data from MENA nations; this issue is related to the inaccessibility of data for many MENA countries, particularly regarding returns on resources, private malfeasance, and other variables in the Gulf countries. In addition, the researcher encountered several restrictions, such as electricity and internet outages, because he is from Lebanon, a country whose citizens have endured difficult living conditions since the Lebanese crisis began in 2019. The study demonstrates a customized index, the CCI, that suits the characteristics of MENA countries and specifically measures corruption in this region; the outcome of the CCI is then compared to the Corruption Perceptions Index (CPI) and the Control of Corruption indicator (CC) from the Worldwide Governance Indicators (WGI).
Keywords: corruption, economic growth, corruption measurements, empirical review, impact of corruption
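A minimal sketch of the growth-regression idea above: a log-linearised Cobb-Douglas specification, Y = A·K^α·L^β·e^(γ·CCI), estimated here by pooled OLS for brevity, whereas the paper uses dynamic panel-data estimators that handle country effects and lagged dependent variables. All numbers are simulated placeholders, not MENA data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 18 * 22                                   # e.g. 18 countries x 22 years (2000-2021)

log_K = rng.normal(10.0, 1.0, n)              # log capital stock (placeholder)
log_L = rng.normal(15.0, 0.5, n)              # log labour force (placeholder)
cci = rng.uniform(0.0, 10.0, n)               # Customized Corruption Index (placeholder scale)
log_Y = 1.0 + 0.35 * log_K + 0.55 * log_L + 0.02 * cci + rng.normal(0.0, 0.1, n)

# Pooled OLS of log(GDP) on log(K), log(L) and the corruption index.
X = np.column_stack([np.ones(n), log_K, log_L, cci])
beta, *_ = np.linalg.lstsq(X, log_Y, rcond=None)
for name, b in zip(["const", "log K", "log L", "CCI"], beta):
    print(f"{name:6s} {b: .3f}")
# A positive CCI coefficient would mirror the positive corruption-growth correlation
# reported above; here the sign and size simply follow the simulated data.
```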
24802 Geometric Morphometric Analysis of Allometric Variation in the Hand Morphology of Adults
Authors: Aleksandr S. Ermolenko
Abstract:
Allometry is an important factor of morphological integration, contributing to the organization of the phenotype and its variability. Allometric change in the shape of the hand is particularly important in primate evolution, as the hand carries important taxonomic features. Some of these features are known to vary with hand shape, especially the ratio of the lengths of the index and ring fingers (2D:4D ratio). The hand is a fairly well-studied system in the context of the evolutionary development of complex morphological structures, since it consists of various parts (basipodium, metapodium, acropodium) that form a single structure, the autopodium. In the present study, we examined the allometric variability of the acropodium. We tested the null hypothesis that there would be no difference in allometric variation between the two components. Geometric morphometrics based on a Procrustes superimposition of 16 two-dimensional (2D) landmarks was analyzed using multivariate shape-by-size regressions in samples from 100 people (50 men and 50 women). The results obtained show that men have significantly greater allometric variability for the ring finger (variability in the transverse axis prevails), while women have significantly greater allometric variability for the index finger (variability in the longitudinal axis prevails). The influence of the middle finger on the shape of the hand is typical for both men and women. No influence of the little finger on the shape of the hand was revealed, regardless of gender. The results of this study support the hypothesis that allometry contributes to the organization of variation in the human hand.
Keywords: human hand, size and shape, 2d:4d ratio, geometric morphometry
24801 Evaluation of Illegal Hunting of Red Deer and Conservation Policy of Department of Environment in Iran
Authors: Tahere Fazilat
Abstract:
Caspian red deer or maral (Cervus elaphus maral) is the largest type of deer in Iran. In the past, maral lived in the northern forests of Iran, from the Caspian Sea coast and the Alborz mountain chain to the oak forests of the Zagros margin, from Azerbaijan up to Fars province. However, the population in the northwest and west of Iran was completely destroyed; according to reports, this happened about 50 years ago, even in areas out of reach of humans. In the present study, data were collected from 2004 to 2014 in the Hyrcanian forests of Mazandaran province by means of the environmental guards and the judiciary office of the Department of Environment of Mazandaran; in this process, all recorded arrests for illegal hunting of red deer, together with population censuses and estimates, were analysed, and the correlation of these data was assessed. We provide a first evaluation of how suitable these methods are by comparing the results with population estimates obtained using cohort analysis and by analyzing the within-season variation in the number of deer seen. The data indicate the likely future of red deer in the northern forests of Iran and the outcomes of the Department of Environment's red deer conservation policy.
Keywords: illegal hunting, red deer, census, conservation
24800 Bacterial Diversity Reports Contamination around the Ichkeul Lake in Tunisia
Authors: Zeina Bourhane, Anders Lanzen, Christine Cagnon, Olfa Ben Said, Cristiana Cravo-Laureau, Robert Duran
Abstract:
The anthropogenic pressure in coastal areas increases dramatically with the exploitation of environmental resources. Biomonitoring of coastal areas is crucial to determine the impact of pollutants on bacterial communities in soils and sediments, since these communities provide important ecosystem services. However, relevant biomonitoring tools allowing fast determination of the ecological status are yet to be defined. Microbial ecology approaches provide useful information for developing such microbial monitoring tools reporting on the effect of environmental stressors. Chemical and microbial molecular approaches were combined in order to determine microbial bioindicators for assessing the ecological status of soil and river ecosystems around the Ichkeul Lake (Tunisia), an area highly impacted by human activities. Samples were collected along soil/river/lake continuums in three stations around the Ichkeul Lake influenced by different human activities, in two seasons (summer and winter). Contaminant pressure indexes (PI), including polycyclic aromatic hydrocarbon (PAH), alkane, and organochlorine pesticide (OCP) contents, showed significant differences in the contamination level between the stations, with seasonal variation. Bacterial communities were characterized by 16S ribosomal RNA (rRNA) gene metabarcoding. Although microgAMBI indexes, determined from the sequencing data, were in accordance with contaminant contents, they were not sufficient to fully explain the PI. Therefore, further microbial indicators are still to be defined. The comparison of bacterial communities revealed specific microbial assemblages for soil, river, and lake sediments, which were significantly correlated with contaminant contents and PI. Such observations offer the possibility to define a relevant set of bioindicators for reporting the effects of human activities on microbial community structure. Such bioindicators might constitute useful monitoring tools for the management of microbial communities in coastal areas.
Keywords: bacterial communities, biomonitoring, contamination, human impacts, microbial bioindicators
24799 Research on Straightening Process Model Based on Iteration and Self-Learning
Authors: Hong Lu, Xiong Xiao
Abstract:
Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, and they need to be straightened to meet the straightness requirement. In the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Based on the mathematical model, an iterative method is used to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the straightening stroke calculated by this method is much more precise because it can adapt to changes in material performance parameters. Considering that the straightening method is widely used in the mass production of shaft parts, a knowledge base is used to store the data of the straightening process, and a straightening stroke algorithm based on empirical data is set up. In this paper, a straightening process control model that combines the iteration-based straightening stroke method and the straightening stroke algorithm based on empirical data is established. Finally, an experiment is designed to verify the straightening process control model.
Keywords: straightness, straightening stroke, deflection, shaft parts
24798 Semi-Empirical Modeling of Heat Inactivation of Enterococci and Clostridia During the Hygienisation in Anaerobic Digestion Process
Authors: Jihane Saad, Thomas Lendormi, Caroline Le Marechal, Anne-marie Pourcher, Céline Druilhe, Jean-louis Lanoiselle
Abstract:
Agricultural anaerobic digestion consists of the conversion of animal slurry and manure into biogas and digestate. These inputs must, however, be treated at 70 ºC for 60 min before anaerobic digestion, according to the European regulations (EC n°1069/2009 & EU n°142/2011). The impact of such heat treatment on the fate of bacteria has been poorly studied up to now. Moreover, a recent study¹ has shown that enterococci and clostridia are still detected despite the application of such thermal treatment, questioning the relevance of this approach for the hygienisation of digestate. The aim of this study is to establish the heat inactivation kinetics of two species of enterococci (Enterococcus faecalis and Enterococcus faecium) and two species of clostridia (Clostridioides difficile and Clostridium novyi as a non-toxic model for Clostridium botulinum of group III). A pure culture of each strain was prepared in a specific sterile medium at a concentration of 10⁴–10⁷ MPN/mL (most probable number), depending on the bacterial species. Bacterial suspensions were then filled into sterilized capillary tubes and placed in a water or oil bath at the desired temperature for a specific period of time. Each bacterial suspension was enumerated using an MPN approach, and tests were repeated three times for each temperature/time couple. The inactivation kinetics of the four indicator bacteria are described using the Weibull model and the classical Bigelow model of first-order kinetics. The Weibull model takes biological variation with respect to thermal inactivation into account and is basically a statistical model of the distribution of inactivation times; the classical first-order approach is a special case of the Weibull model. The heat treatment at 70 ºC / 60 min contributes to a reduction greater than 5 log10 for E. faecium and E. faecalis. However, it results only in a reduction of about 0.7 log10 for C. difficile and an increase of 0.5 log10 for C. novyi. Application of treatments at higher temperatures is required to reach a reduction greater than or equal to 3 log10 for C. novyi (such as 30 min / 100 ºC, 13 min / 105 ºC, 3 min / 110 ºC, and 1 min / 115 ºC), raising the question of the relevance of the heat treatment at 70 ºC / 60 min for these spore-forming bacteria. To conclude, the heat treatment (70 ºC / 60 min) defined by the European regulation is sufficient to inactivate non-sporulating bacteria, whereas higher temperatures (> 100 ºC) are required for spore-forming bacteria to reach a 3 log10 reduction (sporicidal activity).
Keywords: heat treatment, enterococci, clostridia, inactivation kinetics
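A minimal sketch of the Weibull survival model used above, log10(N_t/N_0) = -(t/δ)^p, including the holding time needed to reach a target log reduction at a given temperature; the δ and p values below are placeholders, not the parameters fitted in this study.

```python
import numpy as np

def log10_reduction(t_min, delta, p):
    """Weibull model: log10(N_t / N_0) = -(t / delta)^p."""
    return -((t_min / delta) ** p)

def time_for_reduction(target_log10, delta, p):
    """Holding time (min) needed to reach a given log10 reduction."""
    return delta * target_log10 ** (1.0 / p)

# Placeholder parameters for a spore-forming strain at a fixed temperature.
delta, p = 12.0, 0.8      # delta: time for the first log10 reduction; p: curve shape

t = np.array([5.0, 15.0, 30.0, 60.0])
print("log10 reductions:", np.round(-log10_reduction(t, delta, p), 2))
print("time for a 3-log10 reduction:", round(time_for_reduction(3.0, delta, p), 1), "min")
```

Fitting δ and p separately at each test temperature is what allows temperature/time couples such as 30 min / 100 ºC to be compared against the regulatory 70 ºC / 60 min treatment.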
24797 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units
Authors: Mostafa Kazemi, Zahra N. Farkhani
Abstract:
This paper proposes an integrated Data Envelopment Analysis (DEA) and Variance Inflation Factor (VIF) model for measuring the technical efficiency of decision making units. The model is validated using a data set of 69 dairy-product sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to distinguish the independent effective factors of the resellers, and in the second stage, DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results of this paper indicate an average managerial efficiency of 83% across the dairy-product sales representatives. In addition, technical and scale efficiency were 96% and 80%, respectively. Of the sales representatives, 38% have a technical efficiency of 100%, and 72% are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good overall state of sales representative efficiency.
Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives' dairy products, variance inflation factor (VIF)
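A minimal sketch of the two-stage pipeline described above: compute VIFs to drop collinear input factors, then score each representative with an input-oriented, constant-returns-to-scale (CCR) DEA linear programme. The data are random placeholders; the DEA orientation and the VIF cut-off (commonly 5 or 10) are assumptions rather than the paper's exact settings.

```python
import numpy as np
from scipy.optimize import linprog

def vif(X):
    """Variance inflation factor of each column: 1 / (1 - R^2) when regressed on the others."""
    out = []
    for j in range(X.shape[1]):
        y, Z = X[:, j], np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

def ccr_efficiency(inputs, outputs):
    """Input-oriented CCR efficiency score for every DMU (1.0 = technically efficient)."""
    n, m = inputs.shape
    s = outputs.shape[1]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                  # minimise theta
        A_in = np.c_[-inputs[o], inputs.T]           # sum_j lambda_j x_ij <= theta * x_io
        A_out = np.c_[np.zeros(s), -outputs.T]       # sum_j lambda_j y_rj >= y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -outputs[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        scores.append(res.x[0])
    return np.array(scores)

rng = np.random.default_rng(3)
X = rng.uniform(1, 10, size=(69, 4))          # candidate input factors for 69 representatives
Y = rng.uniform(1, 10, size=(69, 2))          # outputs (e.g. sales volume, revenue)
keep = vif(X) < 10.0                          # stage 1: drop highly collinear inputs
theta = ccr_efficiency(X[:, keep], Y)         # stage 2: DEA on the retained inputs
print(f"{(theta > 0.999).sum()} of {len(theta)} units are technically efficient")
```

Re-running the second stage with a variable-returns-to-scale (BCC) formulation, i.e. adding the convexity constraint on the lambdas, would give the scale-efficiency comparison the abstract reports.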
24796 Design and Modeling of Human Middle Ear for Harmonic Response Analysis
Authors: Shende Suraj Balu, A. B. Deoghare, K. M. Pandey
Abstract:
The human middle ear (ME) is a delicate and vital organ. It has a complex structure that performs various functions, such as receiving sound pressure, producing vibrations of the eardrum, and propagating them to the inner ear. It consists of the tympanic membrane (TM), three auditory ossicles, and various ligaments and muscles. Incidents such as traumata, infections, ossification of ossicular structures, and other pathologies may damage the ME organs. These conditions can be surgically treated by employing a prosthesis. However, the suitability of the prosthesis needs to be examined prior to surgery. A few decades ago, this issue was addressed and analyzed by developing an equivalent representation, either in the form of a spring-mass system, an electrical system using an R-L-C circuit, or an approximated CAD model. Nowadays, however, a three-dimensional ME model can be constructed using micro X-ray computed tomography (μCT) scan data, and patient-specific concerns pertaining to the disease can be examined well in advance. The current research work focuses on developing the ME model from stacks of μCT images, which are used as input files to the MIMICS Research 19.0 (Materialise Interactive Medical Image Control System) software. A stack of CT images is converted into a geometrical surface model to build an accurate morphology of the ME. The work is further extended to understand the dynamic behaviour of the harmonic response of the stapes footplate and umbo for different sound pressure levels applied at the lateral side of the eardrum using a finite element approach. The pathological condition cholesteatoma of the ME is investigated to obtain the peak-to-peak displacement of the stapes footplate and umbo. Apart from this condition, other pathologies, mainly changes in the stiffness of the stapedial ligament, TM thickness, and ossicular chain separation and fixation, are also explored. The developed ME model for pathologies is validated by comparison with results available in the literature and with the results of a normal ME to calculate the percentage loss in hearing capability.
Keywords: computed tomography (μCT), human middle ear (ME), harmonic response, pathologies, tympanic membrane (TM)