Search results for: consumer data right
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25920

22740 A Grounded Theory on Marist Spirituality/Charism from the Perspective of the Lay Marists in the Philippines

Authors: Nino M. Pizarro

Abstract:

To the author’s knowledge, despite the written documents about Marist spirituality/charism, no clear theoretical framework has been developed that captures Marist spirituality/charism from the perspective, or lived experience, of the lay Marists of St. Marcellin Champagnat. The participants of the study are lay Marist educators from Marist schools in the Philippines. Since the study seeks the respondents’ own concepts and meanings of Marist spirituality/charism, a qualitative methodology is adopted; in particular, the study will use the grounded theory methods of Barney Glaser. The theory will be generated systematically through data collection, coding, and analysis by memoing, theoretical sampling, sorting, and writing, using the constant comparative method. The data collection method employed in this grounded theory research is the in-depth interview, semi-structured and participant-driven, with participants recruited through purposive snowball sampling. The study aims to produce a theoretical framework that will help lay Marists deepen their understanding of the Marist spirituality/charism and of their vocation as lay partners of the Marist Brothers of the Schools.

Keywords: grounded theory, Lay Marists, lived experience, Marist spirituality/charism

Procedia PDF Downloads 311
22739 Active Packaging Films Based on Chitosan Incorporated with Thyme Essential Oil and Cross Linkers and Its Effect on the Quality Shelf Life of Food

Authors: Aiman Zehra, Sajad Mohd Wani

Abstract:

Packaging has a vital role as it contains and protects food as it moves through the supply chain to the consumer. Among the plentiful natural macromolecules, including the whole polysaccharide class, chitosan (CH) has been extensively used in food packaging applications owing to its easy film-forming capacity, biodegradability, good oxygen and water vapour barrier ability, and good mechanical strength. Compared to synthetic films, however, films produced from chitosan present poorer barrier and mechanical properties, and a number of modification procedures are required to enhance them. Various additives such as plasticizers (e.g., glycerol and sorbitol), crosslinkers (e.g., CaCl₂, ZnO), fillers (nanoclay), antimicrobial agents (e.g., thyme essential oil), and emulsifying agents have been used to improve the mechanical, thermal, morphological, and antimicrobial properties and the stability and elasticity of chitosan-based biodegradable films. Different novel biocomposite films based on chitosan incorporated with thyme essential oil and different additives (ZnO, CaCl₂, NC, and PEG) were successfully prepared and used as packaging material for carrot candy. The chitosan film incorporated with crosslinkers formed a protective barrier on the surface of the candy that maintained moisture content, water activity, TSS, total sugars, and titratable acidity. The combination ZnO + PEG + NC + CaCl₂ had a remarkable synergistic effect on the barrier properties of the film and was the most effective in preventing moisture gain in the candies; the lowest water activity (a𝓌 = 0.624) was also observed for candies stored in this treatment. The color values L*, a*, b* of the candies were likewise retained in the film containing all the additives through the sixth month of storage; the L*, a*, and b* values observed for this treatment were 42.72, 9.89, and 10.84, respectively. The candies packaged in the film retained TSS and acidity, and the packaging significantly (p ≤ 0.05) conserved sensory qualities and inhibited microbial activity during storage. Carrot candy was found microbiologically safe for human consumption even after six months of storage in all the packaging materials.

Keywords: chitosan, biodegradable films, antimicrobial activity, thyme essential oil, crosslinkers

Procedia PDF Downloads 95
22738 Annexing the Strength of Information and Communication Technology (ICT) for Real-time TB Reporting Using TB Situation Room (TSR) in Nigeria: Kano State Experience

Authors: Ibrahim Umar, Ashiru Rajab, Sumayya Chindo, Emmanuel Olashore

Abstract:

INTRODUCTION: Kano is the most populous state in Nigeria and one of the two states with the highest TB burden in the country. The state notifies an average of more than 8,000 TB cases quarterly and recorded the highest yearly notification of all Nigerian states from 2020 to 2022. The contribution of the state TB program to national TB notification varied between 9% and 10% per quarter from the first quarter of 2022 to the second quarter of 2023. The Kano State TB Situation Room is an innovative platform for timely data collection, collation, and analysis for informed decision-making in the health system. During the 2023 second National TB Testing Week (NTBTW), the Kano TB program aimed at early TB detection, prevention, and treatment, and the State TB Situation Room provided the state an avenue for coordination and surveillance through real-time data reporting, review, analysis, and use. OBJECTIVES: To assess the role of an innovative information and communication technology platform for real-time TB reporting during the second National TB Testing Week in Nigeria, 2023, and to showcase the NTBTW data cascade analysis using the TSR as an innovative ICT platform. METHODOLOGY: The state TB program deployed a real-time virtual dashboard for NTBTW reporting, analysis, and feedback. A data room team was set up to receive real-time data through a Google link. The data received were analyzed using the Power BI analytic tool with a statistical alpha level of significance of <0.05. RESULTS: At the end of the week-long activity, and using the real-time dashboard with on-site mentorship of the field workers, the state TB program screened a total of 52,054 people for TB out of 72,112 individuals eligible for screening (72% screening rate). A total of 9,910 presumptive TB clients were identified and evaluated, leading to the diagnosis of 445 TB patients (5% yield from presumptives) and the placement of 435 TB patients on treatment (98% enrolment).
CONCLUSION: The TB Situation Room (TSR) has been a great asset to the Kano State TB Control Program in meeting the growing demand for timely data reporting in TB and other global health responses. The use of real-time surveillance data during the 2023 NTBTW has in no small measure improved TB response and feedback in Kano State. Scaling up this intervention to other disease areas, states, and nations is a positive step towards global TB eradication.
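As a quick check, the cascade percentages reported in the abstract can be reproduced from the raw counts it gives; the sketch below uses only figures stated above:

```python
# Verify the TB screening-cascade indicators reported for the 2023 NTBTW.
eligible = 72_112     # individuals eligible for screening
screened = 52_054     # people actually screened
presumptive = 9_910   # presumptive TB clients evaluated
diagnosed = 445       # TB patients diagnosed
enrolled = 435        # TB patients placed on treatment

screening_rate = screened / eligible    # fraction of eligible people screened
yield_rate = diagnosed / presumptive    # yield of diagnoses from presumptives
enrolment_rate = enrolled / diagnosed   # fraction of diagnosed placed on treatment

print(f"screening rate: {screening_rate:.0%}")        # matches the reported 72%
print(f"yield from presumptives: {yield_rate:.1%}")   # rounds to the reported ~5%
print(f"enrolment: {enrolment_rate:.0%}")             # matches the reported 98%
```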

Keywords: tuberculosis (tb), national tb testing week (ntbtw), tb situation room (tsr), information communication technology (ict)

Procedia PDF Downloads 71
22737 Comparative Analysis of Polish Traditional Bread and Teff Injera: Culinary Heritage and Nutritional Perspectives

Authors: Temesgen Minase Woldegebriel

Abstract:

This study undertakes a comparative analysis of two distinct staples from diverse culinary heritages: Polish traditional bread and Teff Injera. Despite originating from disparate cultural contexts, both these foods hold significant roles in their respective societies, serving as dietary staples rich in cultural symbolism and nutritional value. Our investigation delves into the historical, cultural, and nutritional dimensions of Polish bread and Teff Injera, shedding light on their ingredients, preparation methods, and consumption patterns. Firstly, we explore the rich history and cultural significance embedded within Polish traditional bread, tracing its evolution through centuries of tradition and craftsmanship. From the ubiquitous Polish Rye bread to the intricate regional variations, we unravel the socio-cultural narratives intertwined with each loaf, reflecting Polish identity and culinary heritage. In contrast, our analysis extends to Teff Injera, a staple of Ethiopian and Eritrean cuisine known for its spongy texture and tangy flavor. We delve into the ancient origins of Teff cultivation, highlighting its pivotal role in Ethiopian culture and its symbolic significance in communal dining practices, such as the traditional Ethiopian coffee ceremony. Furthermore, we undertake a comparative examination of the nutritional profiles of Polish bread and Teff Injera, assessing their respective contributions to dietary health and well-being. Through comprehensive nutritional analysis, we elucidate the unique attributes of each staple, considering factors such as gluten content, fiber composition, and micronutrient density. Moreover, our study investigates the contemporary relevance of these traditional staples in the context of shifting dietary preferences and global culinary trends. 
We analyze consumer perceptions and market dynamics surrounding Polish bread and Teff Injera, discerning patterns of consumption and avenues for innovation in a rapidly evolving food landscape. In conclusion, our comparative analysis illuminates the multifaceted dimensions of Polish traditional bread and Teff Injera, transcending mere culinary discourse to encompass broader themes of cultural heritage, nutrition, and gastronomic diversity.

Keywords: bread, culinary, injera, teff

Procedia PDF Downloads 17
22736 Density Measurement of Mixed Refrigerants R32+R1234yf and R125+R290 from 0°C to 100°C and at Pressures up to 10 MPa

Authors: Xiaoci Li, Yonghua Huang, Hui Lin

Abstract:

Optimization of the concentration of components in mixed refrigerants leads to potential improvement of either the thermodynamic cycle performance or the safety performance of heat pumps and refrigerators. R32+R1234yf and R125+R290 are two promising binary mixed refrigerants for heat pumps working in cold areas, and the p-ρ-T data of these mixtures are among the fundamental properties necessary for the design and performance evaluation of such heat pumps. Although the property data of mixtures can be predicted by mixing models based on the pure substances, as incorporated in programs such as the NIST Refprop database, direct property measurement is still helpful to reveal the true state behavior and verify the models. Densities of the mixtures R32+R1234yf and R125+R290 were measured with an Anton Paar U-shaped oscillating-tube digital densimeter (DMA-4500) over temperatures from 0 °C to 100 °C and pressures up to 10 MPa. The accuracy of the measurement reaches 0.00005 g/cm³. The experimental data are compared with the predictions of Refprop in the corresponding range of pressure and temperature.
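A comparison of measured densities against model predictions of the kind described here typically reports relative deviations; a minimal sketch follows, with placeholder state points rather than the paper's data:

```python
# Relative deviation of measured vs. predicted density (illustrative values).

def relative_deviation(measured: float, predicted: float) -> float:
    """Return (measured - predicted) / predicted, e.g. for rho in g/cm^3."""
    return (measured - predicted) / predicted

# (T in °C, p in MPa, measured rho, predicted rho) -- hypothetical points,
# not the experimental data of the study.
points = [
    (25.0, 2.0, 1.0521, 1.0513),
    (50.0, 5.0, 0.9984, 0.9991),
]
for T, p, rho_meas, rho_pred in points:
    dev = relative_deviation(rho_meas, rho_pred)
    print(f"T={T:.0f} °C, p={p:.1f} MPa: deviation = {dev:+.3%}")
```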

Keywords: mixed refrigerant, density measurement, densimeter, thermodynamic property

Procedia PDF Downloads 296
22735 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the under-assessment set. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals, combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.
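The dominance relation used by such a pre-processor can be sketched in a few lines: a unit that is dominated (no larger outputs, no smaller inputs, strictly worse in at least one dimension) is inefficient without solving any DEA model. The data and the grid details are illustrative, not the paper's formulation:

```python
# Dominance-based DEA pre-processing sketch (hyper-rectangle grid omitted).
# A unit is a pair (inputs, outputs), each a tuple of floats.

def dominates(a, b):
    """True if unit a dominates unit b: a uses no more of any input and
    produces no less of any output, strictly better in at least one place."""
    ins_a, outs_a = a
    ins_b, outs_b = b
    no_worse = (all(x <= y for x, y in zip(ins_a, ins_b)) and
                all(x >= y for x, y in zip(outs_a, outs_b)))
    strictly = (any(x < y for x, y in zip(ins_a, ins_b)) or
                any(x > y for x, y in zip(outs_a, outs_b)))
    return no_worse and strictly

def classify(new_unit, units):
    """'inefficient' if some existing unit dominates the new one;
    otherwise the unit still needs a full DEA assessment."""
    if any(dominates(u, new_unit) for u in units):
        return "inefficient"
    return "needs full DEA"

units = [((2.0, 3.0), (10.0,)), ((4.0, 5.0), (8.0,))]   # (inputs, outputs)
print(classify(((5.0, 6.0), (7.0,)), units))
```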

Keywords: data envelopment analysis, interval DEA, efficiency classification, efficiency prediction

Procedia PDF Downloads 164
22734 Mapping of Traffic Noise in Riyadh City-Saudi Arabia

Authors: Khaled A. Alsaif, Mosaad A. Foda

Abstract:

The present work aims at the development of traffic noise maps for Riyadh City using the LimA software. Road traffic data were estimated or measured as accurately as possible in order to obtain consistent noise maps. The predicted noise levels at selected sites are validated by field measurements, obtained with a system consisting of a sound level meter, a GPS receiver, and a database to manage the measured data. The maps show that noise levels remain above 50 dBA and can exceed 70 dBA at the nearside of major roads and highways.

Keywords: noise pollution, road traffic noise, LimA predictor, GPS

Procedia PDF Downloads 384
22733 The Introduction of a Tourniquet Checklist to Identify and Record Tourniquet Related Complications

Authors: Akash Soogumbur

Abstract:

Tourniquets are commonly used in orthopaedic surgery to provide hemostasis during procedures on the upper and lower limbs. However, there is a risk of complications associated with tourniquet use, such as nerve damage, skin necrosis, and compartment syndrome. The British Orthopaedic Association (BOAST) guidelines recommend the use of tourniquets at a pressure of 300 mmHg or less for a maximum of 2 hours. Research Aim: The aim of this study was to evaluate the effectiveness of a tourniquet checklist in improving compliance with the BOAST guidelines. Methodology: This was a retrospective study of all orthopaedic procedures performed at a single institution over a 12-month period. The study population included patients who had a tourniquet applied during surgery. Data were collected from the patients' medical records, including the duration of tourniquet use, the pressure used, and the method of exsanguination. Findings: The results showed that the use of the tourniquet checklist significantly improved compliance with the BOAST guidelines. Prior to the introduction of the checklist, compliance with the guidelines was 83% for the duration of tourniquet use and 73% for pressure used. After the introduction of the checklist, compliance increased to 100% for both duration of tourniquet use and pressure used. Theoretical Importance: The findings of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use. Data Collection: Data were collected from the patients' medical records. The data included the following information: Patient demographics, procedure performed, duration of tourniquet use, pressure used, method of exsanguination. Analysis Procedures: The data were analyzed using descriptive statistics. 
The compliance with the BOAST guidelines was calculated as the percentage of patients who met the guidelines for the duration of tourniquet use and pressure used. Question Addressed: The question addressed by this study was whether the use of a tourniquet checklist could improve compliance with the BOAST guidelines. Conclusion: The results of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use.
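The compliance calculation described in the methodology reduces to counting cases within the BOAST thresholds; a minimal sketch, with hypothetical audit records rather than the study's data:

```python
# Compliance rates against the BOAST thresholds cited above:
# pressure of 300 mmHg or less, duration of 2 hours (120 min) or less.

MAX_PRESSURE_MMHG = 300
MAX_DURATION_MIN = 120

def compliance_rates(records):
    """records: list of (duration_min, pressure_mmHg) per tourniquet case.
    Returns (duration_compliance, pressure_compliance) as fractions."""
    n = len(records)
    dur_ok = sum(1 for d, _ in records if d <= MAX_DURATION_MIN)
    prs_ok = sum(1 for _, p in records if p <= MAX_PRESSURE_MMHG)
    return dur_ok / n, prs_ok / n

audit = [(90, 250), (110, 300), (130, 280), (100, 320)]  # hypothetical cases
dur, prs = compliance_rates(audit)
print(f"duration compliance: {dur:.0%}, pressure compliance: {prs:.0%}")
```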

Keywords: tourniquet, pressure, duration, complications, surgery

Procedia PDF Downloads 71
22732 Data Analysis for Taxonomy Prediction and Annotation of 16S rRNA Gene Sequences from Metagenome Data

Authors: Suchithra V., Shreedhanya, Kavya Menon, Vidya Niranjan

Abstract:

Skin metagenomics has a wide range of applications with direct relevance to the health of the organism, giving insight into the diverse community of microorganisms (the microbiome) harbored on the skin. In recent years, it has become increasingly apparent that the interaction between the skin microbiome and the human body plays a prominent role in immune system development, cancer development, disease pathology, and many other biological processes, and next-generation sequencing has led to a faster and better understanding of environmental organisms and their mutual interactions. This project studies the human skin microbiome of individuals with varied skin conditions. Bacterial 16S rRNA data of the skin microbiome were downloaded with the SRA toolkit provided by NCBI to perform the metagenomics analysis. Twelve samples were selected, with two controls and three categories: sex (male/female), skin type (moist/intermittently moist/sebaceous), and occlusion (occluded/intermittently occluded/exposed). Read quality was improved using Cutadapt and assessed using FastQC. USEARCH, a tool for analyzing NGS data, provides a suitable platform to obtain the taxonomy classification and abundance of bacteria from the metagenome data; the statistical tool used for analyzing the USEARCH results was METAGENassist. The results revealed that the three most abundant organisms were Prevotella, Corynebacterium, and Anaerococcus. Prevotella is known to be an infectious bacterium found in wounds, tooth cavities, etc., while Corynebacterium and Anaerococcus are opportunistic bacteria responsible for skin odor. This result suggests that Prevotella thrives easily in sebaceous skin conditions; for sebaceous skin types it is therefore better to treat wounds with intermittently occluded dressings, such as ointments or creams, and to avoid exposing the wound, as exposure leads to an increase in Prevotella abundance.
Moist skin type individuals can opt for occluded or intermittently occluded treatment, as these have been shown to decrease bacterial abundance during treatment.
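The abundance ranking at the heart of this analysis is a relative-abundance computation over taxonomy assignments; a minimal sketch with made-up read counts (not the study's data):

```python
# Relative abundance of genera from per-read taxonomy assignments.
from collections import Counter

def relative_abundance(assignments):
    """Map each genus to its fraction of all classified reads."""
    counts = Counter(assignments)
    total = sum(counts.values())
    return {genus: n / total for genus, n in counts.items()}

# Toy read assignments mirroring the genera named in the abstract.
reads = (["Prevotella"] * 50 + ["Corynebacterium"] * 30 +
         ["Anaerococcus"] * 15 + ["Staphylococcus"] * 5)
abundance = relative_abundance(reads)
top = sorted(abundance, key=abundance.get, reverse=True)[:3]
print(top)   # the three most abundant genera in this toy sample
```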

Keywords: bacterial 16S rRNA, next generation sequencing, skin metagenomics, skin microbiome, taxonomy

Procedia PDF Downloads 172
22731 Self-Supervised Learning for Hate-Speech Identification

Authors: Shrabani Ghosh

Abstract:

Automatic offensive language detection in social media has become a pressing task in today's NLP. Manual offensive language detection is tedious and laborious work, so automatic methods based on machine learning are the only practical alternative. Previous works have performed sentiment analysis over social media in supervised, semi-supervised, and unsupervised ways. Domain adaptation in a semi-supervised setting has also been explored in NLP, where the source domain and the target domain differ: the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers such as BERT and RoBERTa are further pretrained with the masked language modeling (MLM) objective and then fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free-speech platform described as hosting extremists of varying degrees. In the domain adaptation process, Twitter data is used as the source domain and Gab data as the target domain. The performance of domain adaptation also depends on cross-domain similarity, and different distance measures such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL have been used to estimate domain similarity; in-domain distances are expected to be small and between-domain distances large. Previous findings show that a pretrained masked language model (MLM) fine-tuned on a mixture of posts from the source and target domains gives higher accuracy; even so, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, while its out-of-domain performance on Gab data drops to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce.
A few works have already applied self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs, and self-supervised attention learning shows better performance because it exploits extracted context words during training. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. The framework first classifies the Gab dataset in an attention-based self-supervised manner; in the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier and with optimized outcomes obtained from different optimization techniques.
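One of the domain-similarity measures listed above, cosine distance, can be sketched over mean document embeddings of the two domains; the vectors here are toy values, not Twitter or Gab embeddings:

```python
# Cosine distance between the mean embeddings of two domains.
import math

def cosine_distance(u, v):
    """1 - cosine similarity; 0 for identical directions, 1 for orthogonal."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

def mean_vector(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

source = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]   # toy "source domain" embeddings
target = [[0.1, 0.9, 0.2], [0.2, 0.8, 0.3]]   # toy "target domain" embeddings

d = cosine_distance(mean_vector(source), mean_vector(target))
print(f"between-domain cosine distance: {d:.3f}")
```

In-domain distances computed the same way would come out near zero, matching the expectation stated above that between-domain distances are the large ones.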

Keywords: attention learning, language model, offensive language detection, self-supervised learning

Procedia PDF Downloads 105
22730 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark

Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos

Abstract:

This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between factors and performance and to develop linear predictor models for time and cost. Methods: The solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements, and the metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results: the small influence of cores, the neutrality of memory and disks with respect to total execution time, and the non-significant impact of input data scale on costs, although it notably impacts execution time.
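The screening step of a two-level factorial design reduces to estimating each factor's main effect (mean response at the high level minus mean response at the low level); a minimal sketch with an illustrative 2² design and made-up run times, not the paper's 48-cluster data:

```python
# Main-effect estimation for a two-level factorial screening design.

def factor_effects(design, response):
    """design: list of runs, each a dict of factor -> coded level (-1 or +1).
    response: one measurement per run.
    Returns factor -> main effect (mean at +1 minus mean at -1)."""
    effects = {}
    for factor in design[0]:
        hi = [y for run, y in zip(design, response) if run[factor] == +1]
        lo = [y for run, y in zip(design, response) if run[factor] == -1]
        effects[factor] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects

# Hypothetical 2^2 full factorial in data size and node count (coded levels).
design = [
    {"data_size": -1, "nodes": -1},
    {"data_size": +1, "nodes": -1},
    {"data_size": -1, "nodes": +1},
    {"data_size": +1, "nodes": +1},
]
exec_time = [120.0, 260.0, 80.0, 170.0]   # seconds, illustrative

print(factor_effects(design, exec_time))
```

A large positive effect flags a factor that increases execution time; a negative effect (more nodes, here) flags one that reduces it, which is the screening and ranking the abstract describes.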

Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark

Procedia PDF Downloads 120
22729 The Developing of Teaching Materials Online for Students in Thailand

Authors: Pitimanus Bunlue

Abstract:

The objectives of this study were to identify the unique characteristics of the Salaya Old Market, Phutthamonthon, Nakhon Pathom, and to develop an effective video medium to promote homeland awareness among local people. The characteristic features of this community were summarized from historical data, community observation, and interviews with residents, and the acquired data were used to develop a video describing the community's prominent features. The quality of the video was then assessed by interviewing local people in the old market about content accuracy, video and narration quality, and their sense of homeland awareness after watching. The result was a 6-minute video containing historical data and the outstanding features of the community. Based on the interviews, the content accuracy was good, the picture quality and narration were very good, and most viewers developed a sense of homeland awareness after watching the video.

Keywords: audio-visual, creating homeland awareness, Phutthamonthon Nakhon Pathom, research and development

Procedia PDF Downloads 291
22728 A Decision Support System for the Detection of Illicit Substance Production Sites

Authors: Krystian Chachula, Robert Nowak

Abstract:

Manufacturing home-made explosives and synthetic drugs is an increasing problem in Europe. To combat that, a data fusion system is proposed for the detection and localization of production sites in urban environments. The data consists of measurements of properties of wastewater performed by various sensors installed in a sewage network. A four-stage fusion strategy allows detecting sources of waste products from known chemical reactions. First, suspicious measurements are used to compute the amount and position of discharged compounds. Then, this information is propagated through the sewage network to account for missing sensors. The next step is clustering and the formation of tracks. Eventually, tracks are used to reconstruct discharge events. Sensor measurements are simulated by a subsystem based on real-world data. In this paper, different discharge scenarios are considered to show how the parameters of used algorithms affect the effectiveness of the proposed system. This research is a part of the SYSTEM project (SYnergy of integrated Sensors and Technologies for urban sEcured environMent).

Keywords: continuous monitoring, information fusion and sensors, internet of things, multisensor fusion

Procedia PDF Downloads 115
22727 Implementation of CNV-CH Algorithm Using Map-Reduce Approach

Authors: Aishik Deb, Rituparna Sinha

Abstract:

We have developed an algorithm to detect abnormal segments (structural variations) in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm in which abnormal segments are detected; it aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, so the huge volume of sequence information gives rise to the need for big data and parallel approaches to segmentation. We have therefore designed a map-reduce version of the existing CNV-CH algorithm, with which a large amount of sequence data can be segmented and structural variations in the human genome detected. We have compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score; the advantages of our algorithm are that it is fast and has better accuracy. The algorithm can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders, such as cancer. Such defects may be caused by new mutations or changes to the DNA and generally result in abnormally high or low base coverage and quantification values.
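The map-reduce pattern described here can be sketched on toy coverage data: the map step labels each window by its coverage, and the reduce step merges adjacent windows with the same label into segments. The thresholds are illustrative placeholders, not the CNV-CH convex-hull criterion itself:

```python
# Map-reduce sketch of coverage-based segmentation over toy windows.
from functools import reduce

LOW, HIGH = 20.0, 60.0   # hypothetical coverage thresholds

def label(window):
    """Map step: tag a (position, coverage) window as loss/gain/normal."""
    pos, cov = window
    if cov < LOW:
        return (pos, pos, "loss")
    if cov > HIGH:
        return (pos, pos, "gain")
    return (pos, pos, "normal")

def merge(segments, nxt):
    """Reduce step: extend the last segment if the label matches, else start one."""
    if segments and segments[-1][2] == nxt[2]:
        start, _, kind = segments[-1]
        segments[-1] = (start, nxt[1], kind)
    else:
        segments.append(nxt)
    return segments

windows = [(0, 35.0), (1, 90.0), (2, 85.0), (3, 30.0), (4, 10.0)]
segments = reduce(merge, map(label, windows), [])
print(segments)
```

In a real distributed setting the map step runs per partition of the BAM-derived coverage and the merge handles partition boundaries; this single-process sketch only shows the shape of the computation.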

Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing

Procedia PDF Downloads 136
22726 Inferring Human Mobility in India Using Machine Learning

Authors: Asra Yousuf, Ajaykumar Tannirkulum

Abstract:

Inferring rural-urban migration trends can help design effective policies that promote better urban planning and rural development. In this paper, we describe how machine learning algorithms can be applied to predict people's internal migration decisions. We train our model on data collected from household surveys in Tamil Nadu and measure its performance using data on past migration from the National Sample Survey Organisation of India. The features used for training include socioeconomic characteristics of each individual, such as age, gender, place of residence, outstanding loans, and strength of the household, as well as past migration history. We perform a comparative analysis of a number of machine learning algorithms to determine their prediction accuracy. Our results show that machine learning algorithms provide stronger prediction accuracy than statistical models. Our goal through this research is to propose the use of data science techniques in understanding human decisions and behaviour in developing countries.

Keywords: development, migration, internal migration, machine learning, prediction

Procedia PDF Downloads 271
22725 The Golden Bridge for Better Farmers Life

Authors: Giga Rahmah An-Nafisah, Lailatus Syifa Kamilah

Abstract:

Agriculture in Indonesia has improved in recent years. Since the election of the new president, whose work program prioritizes food self-sufficiency, many efforts have been carefully planned to maximize agricultural production for the future. Seen from another side, however, something is missing: improvement in the livelihood of the farmers themselves. It is useless to fix the whole agricultural processing system to maximize output if the heroes of agriculture do not move towards a better life. The real obstacle to farmers' welfare is the broker, or middleman, system for agricultural produce. However large the harvest, if farmers sell it to middlemen at very low prices, there will be no progress in their welfare. The broker system as practiced by middlemen should have no place in today's agriculture: while the condition of farming is already a cause for concern, middlemen still reap as much profit as possible, no matter how hard farmers struggle to manage their farms, even as farmers face unavoidable competition from imports. This phenomenon is in plain sight of everyone, yet the farmers who fall victim to it can do nothing to change the system, because the middlemen are often the only buyers of their produce; the broker system is, in effect, the only bridge in the economic life of the farmers. The problem, then, is how to secure the welfare of the heroes of our food. The golden bridge that could save them is the government, which, with its powers, can stop the broker system more easily than any other party and should itself become the bridge connecting farmers with consumers.
Under an improved system, the government would buy agricultural produce from farmers at the highest possible price and sell it to consumers at the lowest possible price. What, then, of the fate of the middlemen? The broker system is, indirectly, like corruption: an activity that harms its victims without their noticing, enriching its operators while leaving its victims' lives miserable. The government could instead absorb the middlemen into this new bridge, employing them as distributors of agricultural products under a new policy designed to keep improving the welfare of farmers. Even if this idea did not by itself greatly improve farmers' welfare, it would at the least rally many people to the farmers' cause through the government, since through daily conversation, and even celebrity gossip, such news can quickly reach many people.

Keywords: broker system, farmers live, government, agricultural economics

Procedia PDF Downloads 294
22724 Place and Importance of Goats in the Milk Sector in Algeria

Authors: Tennah Safia, Azzag Naouelle, Derdour Salima, Hafsi Fella, Laouadi Mourad, Laamari Abdalouahab, Ghalmi Farida, Kafidi Nacerredine

Abstract:

Currently, goat farming is widely practiced among the rural population of Algeria. Although the milk yield of goats is low (110 liters per goat per year on average), this milk partly ensures the feeding of small children and provides raw milk, curd, and fermented milk to the whole family. In addition, given an investment cost ten times lower than that of a cow, even this level of production remains of interest. This interest is reinforced by the qualities of goat's milk, highly sought after for a nutritional value superior to that of cow's milk, and by its aptitude for processing, particularly into quality cheeses. The objective of this study is to describe the situation of goat milk production in rural areas of Algeria and to establish a classification of goat breeds according to their production potential. For this, a survey was carried out among goat farmers in the Algerian steppe. Three indigenous breeds were encountered in this study: Arabia, Mozabite, and Mekatia, with Arabia the most dominant. The Mekatia and Mozabite breeds appear to have higher production and milking abilities than the other local breeds and are therefore indicated to play the role of local dairy breeds par excellence. The other breed whose milk performance could be improved is the Arabia breed, whose current performance is low; to increase milk production, uncontrolled crosses with imported breeds (mainly Saanen and Alpine) have been carried out. The third population that can be included in the dairy production category is the group of dairy breeds of imported origin: there are farms in Algeria composed of Alpine and Saanen goats born locally. The milk performance of local goats, of the crossbred population, and of the dairy breeds of imported origin could be improved by selection. For this, it is necessary to set up milk recording to detect the best animals.
This recording could be carried out among interested farmers in each large goat breeding area. In conclusion, sustained efforts must be made to enable the sustainable development of the goat sector in Algeria. It will therefore be necessary to deepen reflection on a national strategy to valorize goat's milk, taking into account the specificities of the environment, the genetic biodiversity, and the eating habits of the Algerian consumer.

Keywords: goat, milk, Algeria, biodiversity

Procedia PDF Downloads 185
22723 Temperament as a Success Determinant in Formative Assessment

Authors: George Fomunyam Kehdinga

Abstract:

Assessment is a vital part of the educational process, and formative assessment is a way of ensuring that higher education achieves the desired effects. Different factors influence how students perform in assessment in general, and formative assessment in particular, and temperament is one such determining factor. This paper, a qualitative case study of four universities in four different countries, examines how students' temperamental make-up either empowers them to perform excellently in formative assessment or hampers their performance. The four universities were chosen from Cameroon, South Africa, the United Kingdom and the United States of America, and three students were chosen from each institution, six of them undergraduates and six postgraduates. Data in this paper were generated through qualitative interviews and document analyses, preceded by a temperament test. From the data generated, it was discovered that cholerics, who are natural leaders and therefore do not struggle to express themselves, often perform excellently in formative assessment, while sanguines, who like cholerics are extroverts, perform relatively well. Phlegmatics and melancholics performed averagely and poorly, respectively, because they are naturally prone to fear, dislike such activities, and prefer keeping to themselves. The paper therefore suggests that temperament is a success determinant in formative assessment. It also proposes that lecturers need an understanding of temperaments to be able to fully administer formative assessment in the lecture room, and that assessment should be balanced in the classroom so that some students are not naturally disadvantaged by their temperamental make-up while others perform excellently.
Lastly, the paper suggests that since formative assessment is a process of generating data, it should be contextualised or given an individualised approach so as to ensure that trustworthy data are generated.

Keywords: temperament, formative assessment, academic success, students

Procedia PDF Downloads 248
22722 Effect of Measured and Calculated Static Torque on Instantaneous Torque Profile of Switched Reluctance Motor

Authors: Ali Asghar Memon

Abstract:

The simulation modeling of a switched reluctance (SR) machine typically relies on three data tables: the flux linkage characteristics, the co-energy characteristics, and the static torque characteristics. It has been noticed that, as far as the literature is concerned, the static torque data used in simulation models are almost always calculated rather than measured. This paper presents a simulation model that includes measured and calculated static torque data separately, to see the effect of each on the instantaneous torque profile of the machine. To the best of the authors' knowledge, this is the first time that static torque derived from co-energy information and static torque measured directly in experiments are used separately in the same model. This research is helpful for accurate modeling of switched reluctance drives.
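
Whichever way the static torque table is obtained (calculated from co-energy or measured on a rig), the simulation step is the same: a 2-D lookup over rotor angle and phase current. A minimal sketch of that lookup, using toy grid values rather than real machine data:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical static torque table T(theta, i): rows = rotor angle (deg),
# columns = phase current (A). In a real model these values come from
# finite-element co-energy calculation or from static measurements.
theta_grid = np.linspace(0.0, 30.0, 7)        # one rotor pole pitch (assumed)
current_grid = np.linspace(0.0, 10.0, 6)
# Toy surface standing in for the tabulated data:
T_table = np.outer(np.sin(np.deg2rad(theta_grid * 6.0)), current_grid**2) * 0.01

torque_lut = RegularGridInterpolator((theta_grid, current_grid), T_table)

def instantaneous_torque(theta_deg, i_amps):
    """Interpolate the instantaneous phase torque from the static table."""
    return float(torque_lut([[theta_deg % 30.0, i_amps]]))

tau = instantaneous_torque(12.5, 6.2)         # torque at 12.5 deg, 6.2 A
```

Feeding the model with the measured table versus the calculated one is then simply a matter of swapping `T_table`, which is what allows the two instantaneous torque profiles to be compared.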

Keywords: static characteristics, current chopping, flux linkage characteristics, switched reluctance motor

Procedia PDF Downloads 292
22721 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of application. In all these fields, the amount of collected data is increasing quickly, and with that increase, computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are software-only implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing of instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, a hardware implementation of the two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are most frequent in the decision table.
The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not a key problem, and the core calculation can be achieved with a combinational logic block, adding relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called the 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For research purposes, the algorithm was also implemented in C and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. Results show an increase in the speed of data processing.
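
The two stages map naturally onto a short software reference implementation of the kind the authors ran on a PC for comparison. The sketch below uses a toy decision table (rows are objects, the last column is the decision attribute) and integer indices in place of named condition attributes:

```python
from collections import Counter
from itertools import combinations

# Toy decision table: three binary condition attributes, one decision.
table = [
    (1, 0, 1, 'yes'),
    (1, 1, 0, 'no'),
    (0, 1, 1, 'yes'),
    (0, 0, 0, 'no'),
]

def discernibility_entries(table):
    """For each pair of objects with different decisions, the set of
    condition attributes on which they differ (the discernibility matrix)."""
    entries = []
    for a, b in combinations(table, 2):
        if a[-1] != b[-1]:
            diff = frozenset(i for i in range(len(a) - 1) if a[i] != b[i])
            if diff:
                entries.append(diff)
    return entries

def two_stage_superreduct(table):
    entries = discernibility_entries(table)
    # Stage 1: the core = attributes appearing as singleton matrix entries
    # (the role of the hardware 'singleton detector').
    core = {next(iter(e)) for e in entries if len(e) == 1}
    reduct = set(core)
    # Stage 2: greedily add the most frequent attribute until every
    # discernibility entry is covered (the role of the adder cascade).
    uncovered = [e for e in entries if not (e & reduct)]
    while uncovered:
        counts = Counter(attr for e in uncovered for attr in e)
        reduct.add(counts.most_common(1)[0][0])
        uncovered = [e for e in uncovered if not (e & reduct)]
    return core, reduct

core, superreduct = two_stage_superreduct(table)
```

For this table the core is empty and attribute 2 alone discerns the decisions, which illustrates disadvantage ii): the first stage did no useful work here.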

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 219
22720 Digitally Mapping Aboriginal Journey Ways

Authors: Paul Longley Arthur

Abstract:

This paper reports on an Australian Research Council-funded project utilising the Australian digital research infrastructure the ‘Time-Layered Cultural Map of Australia’ (TLCMap) (https://www.tlcmap.org/) [1]. This resource has been developed to help researchers create digital maps from cultural, textual, and historical data, layered with datasets registered on the platform. TLCMap is a set of online tools that allows humanities researchers to compile humanities data using spatio-temporal coordinates – to upload, gather, analyse and visualise data. It is the only purpose-designed, Australian-developed research tool for humanities and social science researchers to identify geographical clusters and parallel journeys by sight. This presentation discusses a series of Aboriginal mapping and visualisation experiments using TLCMap to show how Indigenous knowledge can reconfigure contemporary understandings of space, including the urbanised landscape [2, 3]. The research data being generated – investigating the historical movements of Aboriginal people, the distribution of networks, and their relation to land – lends itself to mapping and geo-spatial visualisation and analysis. TLCMap allows researchers to create layers on a 3D map which pinpoint locations with accompanying information, and this has enabled our research team to plot out traditional historical journeys undertaken by Aboriginal people as well as to compile a gazetteer of Aboriginal place names, many of which have largely been undocumented until now [4]. The documented journeys intersect with and overlay many of today’s urban formations, including main roads, municipal boundaries, and state borders. The paper questions how such data can be incorporated into a more culturally and ethically responsive understanding of contemporary urban spaces as well as natural environments [5].

Keywords: spatio-temporal mapping, visualisation, Indigenous knowledge, mobility and migration, research infrastructure

Procedia PDF Downloads 18
22719 Japanese and Europe Legal Frameworks on Data Protection and Cybersecurity: Asymmetries from a Comparative Perspective

Authors: S. Fantin

Abstract:

This study is the result of the legal research on cybersecurity and data protection within the EUNITY (Cybersecurity and Privacy Dialogue between Europe and Japan) project, aimed at fostering the dialogue between the European Union and Japan. Based on the research undertaken therein, the author offers an outline of the main asymmetries in the laws governing such fields in the two regions. The research is a comparative analysis of the two legal frameworks, taking into account specific provisions, ratio legis and policy initiatives. Recent doctrine was taken into account, too, as well as empirical interviews with EU and Japanese stakeholders and project partners. With respect to the protection of personal data, the European Union has recently reformed its legal framework with a package which includes a regulation (the General Data Protection Regulation) and a directive (Directive 680 on personal data processing in the law enforcement domain). In turn, the Japanese law under scrutiny for this study has been the Act on Protection of Personal Information. Based on a comparative analysis, some asymmetries arise. The main ones refer to the definition of personal information and the scope of the two frameworks. Furthermore, the rights of the data subjects are differently articulated in the two regions, while the nature of sanctions takes two opposite approaches. Regarding the cybersecurity framework, the situation looks similarly misaligned. Japan’s main text of reference is the Basic Cybersecurity Act, while the European Union has a more fragmented legal structure (to name a few, the Network and Information Security Directive, the Critical Infrastructure Directive and the Directive on Attacks against Information Systems).
On a related note, unlike the more industry-oriented European approach, the concept of cyber hygiene seems to be neatly embedded in the Japanese legal framework, with a number of provisions that alleviate operators’ liability by turning such a burden into a set of recommendations to be primarily observed by citizens. The reasons to fill such normative gaps are mostly grounded on three bases. Firstly, the cross-border nature of cybercrime requires considering both the magnitude of the issue and its regulatory treatment globally. Secondly, empirical findings from the EUNITY project showed how recent data breaches and cyber-attacks had shared implications between Europe and Japan. Thirdly, the geopolitical context is currently moving the two regions towards significant agreements from a trade standpoint, but also from a data protection perspective (with the imminent signature by both parties of a so-called ‘Adequacy Decision’). The research conducted in this study reveals two asymmetric legal frameworks on cybersecurity and data protection. In view of the future challenges presented by the strengthening of the collaboration between the two regions and the trans-national character of cybercrime, it is urged that solutions be found to fill in such gaps, in order to allow the European Union and Japan to wisely strengthen their partnership.

Keywords: cybersecurity, data protection, European Union, Japan

Procedia PDF Downloads 123
22718 The Causality between Corruption and Economic Growth in MENA Countries: A Dynamic Panel-Data Analysis

Authors: Nour Mohamad Fayad

Abstract:

The impact of corruption on economic growth is intricate and has been extensively researched. Many experts believe that corruption reduces economic development. However, counterarguments have suggested that corruption either promotes growth and development or has no significant impact on economic performance. Clearly, there is no consensus in the economics literature regarding the possible relationship between corruption and economic development. Corruption's complex and clandestine nature, which makes it difficult to define and measure, is one of the obstacles that must be overcome when investigating its effect on an economy. In an attempt to contribute to the ongoing debate, this study examines the impact of corruption on economic growth in the Middle East and North Africa (MENA) region between 2000 and 2021, using a Customized Corruption Index (CCI) and panel data on MENA countries. These countries were selected because they are understudied in the economic literature; despite the World Bank's recent emphasis on corruption in the developing world, the MENA countries have received little attention. The researcher used a Cobb-Douglas functional form to test corruption in MENA with the CCI, which tracks corruption over almost 20 years, and then applied dynamic panel-data methods. The findings indicate a positive correlation between corruption and economic growth, but one that is not consistent across all MENA nations. The study faced several limitations. First, there is a relative lack of recent data from MENA nations; in particular, data on returns on resources, private malfeasance, and other variables are inaccessible for many countries, especially the Gulf states. In addition, the researcher encountered practical restrictions, such as electricity and internet outages, being based in Lebanon, a country whose citizens have endured difficult living conditions since the Lebanese crisis began in 2019.
The study also demonstrates how the CCI, an index tailored to the characteristics of MENA countries, measures corruption in this region; its outcome is then compared with the Corruption Perceptions Index (CPI) and the Control of Corruption indicator (CC) from the World Governance Indicators (WGI).
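
As an illustration of the estimation idea only (not the authors' exact dynamic-panel specification, which is not detailed in the abstract), a within (fixed-effects) estimator for a log-linearised Cobb-Douglas growth equation with a corruption-index regressor can be sketched on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy balanced panel: 15 hypothetical countries x 20 years. All values are
# simulated; the true coefficients (0.05 on corruption, 0.3 on log capital)
# exist only so we can check that the estimator recovers them.
n_c, n_t = 15, 20
country_fe = rng.normal(0.0, 1.0, (n_c, 1))           # unobserved heterogeneity
cci = rng.uniform(0.0, 10.0, (n_c, n_t))              # corruption index (toy)
log_k = rng.normal(10.0, 1.0, (n_c, n_t))             # log capital per worker
growth = 0.05 * cci + 0.3 * log_k + country_fe + rng.normal(0.0, 0.5, (n_c, n_t))

def demean(a):
    """The within transformation: remove each country's time mean."""
    return a - a.mean(axis=1, keepdims=True)

# Demeaning sweeps out the country fixed effect; pooled OLS then recovers
# the slope coefficients.
y = demean(growth).ravel()
X = np.column_stack([demean(cci).ravel(), demean(log_k).ravel()])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

A dynamic specification additionally includes lagged growth on the right-hand side, which makes the within estimator biased and motivates GMM-type estimators; the sketch above shows only the static fixed-effects core of the idea.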

Keywords: corruption, economic growth, corruption measurements, empirical review, impact of corruption

Procedia PDF Downloads 74
22717 Evaluation of Illegal Hunting of Red Deer and Conservation Policy of Department of Environment in Iran

Authors: Tahere Fazilat

Abstract:

The Caspian red deer or maral (Cervus elaphus maral) is the largest type of deer in Iran. In the past, the maral lived in the northern forests of Iran, from the Caspian Sea coast and the Alborz mountain chain to the oak forests of the Zagros margin, from Azerbaijan down to Fars province. However, the population was completely destroyed in the north-west and west of Iran, reportedly about 50 years ago, surviving only out of reach of humans. In the present study, data were collected from 2004 to 2014 in the Hyrcanian forests of Mazandaran province, by means of the environment guards and the judiciary office of the Department of Environment of Mazandaran. In this process, all arrests for illegal hunting of red deer were recorded, and the population census, population estimates, and the correlation of these data were analyzed. We provide a first evaluation of how suitable these methods are by comparing the results with population estimates obtained using cohort analysis, and by analyzing the within-season variation in the number of deer seen. The data indicate the future of red deer in the northern forests of Iran and the results of the conservation policy of the Department of Environment in Iran.

Keywords: illegal hunting, red deer, census, conservation

Procedia PDF Downloads 552
22716 Research on Straightening Process Model Based on Iteration and Self-Learning

Authors: Hong Lu, Xiong Xiao

Abstract:

Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, so they need to be straightened to meet the straightness requirement. In the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Based on this model, an iterative method is used to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise, because it can adapt to changes in material performance parameters. Considering that the straightening method is widely used in the mass production of shaft parts, a knowledge base is used to store the data of the straightening process, and a straightening stroke algorithm based on empirical data is set up. This paper thus establishes a straightening process control model that combines the iterative straightening stroke method with the algorithm based on empirical data. Finally, an experiment is designed to verify the straightening process control model.
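
A minimal sketch of the iterative stroke solution, under a deliberately simplified linear springback model (the paper's model is derived from elastic-plastic bending theory; `springback_ratio` here is an assumed stand-in for the material performance parameters):

```python
# Toy process model: pressing by `stroke` removes a fraction
# (1 - springback_ratio) of the pressed amount from the initial deflection.
def residual_deflection(initial_defl, stroke, springback_ratio=0.4):
    return initial_defl - stroke * (1.0 - springback_ratio)

def solve_stroke(initial_defl, tol=1e-6, springback_ratio=0.4, max_iter=50):
    """Iteratively refine the stroke until the predicted residual
    deflection is within the straightness tolerance."""
    stroke = initial_defl           # first guess: press by the deflection itself
    for _ in range(max_iter):
        r = residual_deflection(initial_defl, stroke, springback_ratio)
        if abs(r) < tol:
            break
        stroke += r / (1.0 - springback_ratio)   # Newton step (linear model)
    return stroke

s = solve_stroke(0.5)               # stroke for a 0.5 mm initial deflection
```

In the self-learning variant, `springback_ratio` (or its nonlinear equivalent) would be updated from the measured load-deflection pairs stored in the knowledge base, so the solver adapts to each batch of material.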

Keywords: straightness, straightening stroke, deflection, shaft parts

Procedia PDF Downloads 328
22715 Effects of Elastic, Plyometric and Strength Training on Selected Anaerobic Factors in Sanandaj Elite Volleyball Players

Authors: Majed Zobairy, Fardin Kalvandi, Kamal Azizbaigi

Abstract:

This research was carried out to evaluate the effects of elastic, plyometric and resistance training on selected anaerobic factors in men volleyball players. For this reason, 30 elite volleyball players of Sanandaj city were randomly divided into 3 groups as follows: elastic training, plyometric training and resistance training. Pre-exercise tests, which included vertical jumping, 50-yard speed running and the scat test, were performed and data were recorded. A specific exercise protocol was followed by each group, and then post-exercise tests were performed again. Data analysis showed significant increases on the exercise tests in each group. One-way ANOVA showed that the increases in speed records in the elastic group were significantly higher than in the other groups (p<0.05). Based on the research data, it seems that elastic training can be a useful method and a new approach to improving functional tests and training regimens.
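
The between-group comparison step can be sketched with a one-way ANOVA on synthetic gains; the abstract does not report the raw measurements, so the group means and sample sizes below are illustrative only:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)

# Hypothetical improvements in 50-yard sprint performance (s) for the
# three training groups (n = 10 each); values are simulated, not the
# study's measurements.
elastic = rng.normal(0.45, 0.10, 10)
plyometric = rng.normal(0.30, 0.10, 10)
strength = rng.normal(0.28, 0.10, 10)

f_stat, p_value = f_oneway(elastic, plyometric, strength)
significant = p_value < 0.05       # the study's decision threshold
```

A significant omnibus F-test would normally be followed by post-hoc pairwise comparisons (e.g., Tukey's HSD) to confirm which specific group, such as the elastic group here, drives the difference.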

Keywords: elastic training, plyometric training, strength training, anaerobic power

Procedia PDF Downloads 528
22714 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units

Authors: Mostafa Kazemi, Zahra N. Farkhani

Abstract:

This paper proposes an integrated Data Envelopment Analysis (DEA) and Variance Inflation Factor (VIF) model for measuring the technical efficiency of decision making units. The model is validated using a set of 69 dairy-product sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to distinguish independent effective factors of the resellers, and in the second stage DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results indicate an average managerial efficiency of 83% across the sales representatives. In addition, technical and scale efficiency were computed as 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good condition for sales representative efficiency.
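
The second-stage efficiency scores can be reproduced in principle with a small input-oriented CCR model solved as a linear program; the five DMUs below are toy data, not the dairy-products dataset:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 DMUs (e.g., sales representatives), 2 inputs, 1 output.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0], [4.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR (constant returns to scale) efficiency of DMU o:
    minimise theta s.t. sum_j lam_j*x_j <= theta*x_o, sum_j lam_j*y_j >= y_o."""
    n, m = X.shape
    n_out = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                        # decision vars: [theta, lam]
    A_ub, b_ub = [], []
    for i in range(m):                                # input constraints
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(n_out):                            # output constraints
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

scores = [ccr_efficiency(o, X, Y) for o in range(len(X))]
```

The variable-returns-to-scale (BCC) variant adds the convexity constraint `sum_j lam_j = 1`; the VIF stage would, before this, regress each candidate input on the others and drop variables whose VIF exceeds a chosen threshold.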

Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives’ dairy products, variance inflation factor (VIF)

Procedia PDF Downloads 568
22713 Circular Polarized and Surface Compatible Microstrip Array Antenna Design for Image and Telemetric Data Transfer in UAV and Armed UAV Systems

Authors: Kübra Taşkıran, Bahattin Türetken

Abstract:

In this paper, a microstrip array antenna with circular polarization at 2.4 GHz has been designed to provide image and telemetric data transmission in Unmanned Aerial Vehicle (UAV) and Armed Unmanned Aerial Vehicle (A-UAV) systems. In addition to the antenna design, a power divider was designed, and the antenna elements were fed in phase. The analysis showed that the antenna operates at a frequency of 2.4016 GHz with a directivity gain of 12.2 dBi. The array antenna was then conformed to the rocket surface used in A-UAV systems and re-analyzed. These analyses showed that, on the surface of the missile, the antenna operates at a frequency of 2.372 GHz with a directivity gain of 10.2 dBi.
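
For orientation, a first-cut 2.4 GHz patch element can be sized with the standard transmission-line design equations. The substrate below (FR-4, eps_r = 4.4, h = 1.6 mm) is an assumption for illustration, since the abstract does not state the material used:

```python
import math

c = 299_792_458.0          # speed of light (m/s)
f = 2.4e9                  # design frequency (Hz)
eps_r, h = 4.4, 1.6e-3     # assumed substrate permittivity and thickness

# Standard transmission-line model for a rectangular patch:
W = c / (2 * f) * math.sqrt(2.0 / (eps_r + 1.0))                 # patch width
eps_eff = (eps_r + 1.0) / 2.0 + (eps_r - 1.0) / 2.0 / math.sqrt(1.0 + 12.0 * h / W)
dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) \
     / ((eps_eff - 0.258) * (W / h + 0.8))                       # fringing extension
L = c / (2.0 * f * math.sqrt(eps_eff)) - 2.0 * dL                # patch length
```

These closed-form dimensions (roughly 38 mm x 29 mm on FR-4) are only a starting point; the resonance shifts reported in the paper (2.4016 GHz flat, 2.372 GHz conformed to the missile surface) are exactly the kind of effect that full-wave simulation must then capture.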

Keywords: microstrip array antenna, circular polarization, 2.4 GHz, image and telemetric data transmission, surface compatible, UAV and armed UAV

Procedia PDF Downloads 103
22712 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos which differ only in pregnancy outcome, i.e., embryos from a single clinic which are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see if the rich spatiotemporal information embedded in this data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology—We did a retrospective analysis of time-lapse data from our IVF clinic, which uses the Embryoscope for 100% of embryo culture to the blastocyst stage, with known clinical outcomes of live birth vs nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single-transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined to make a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major Findings—The performance of the model was evaluated using 100 random subsampling cross-validations (train 80% - test 20%). The prediction accuracy, averaged across the 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy.
Conclusion—The high accuracy in the main analysis and the low accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern associated with pregnancy outcomes, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data that contribute to successful pregnancy outcomes, regardless of already known parameters. Results on a larger sample size, with complementary analysis on prediction of other key outcomes such as euploidy and aneuploidy of embryos, will be presented at the meeting.
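
The pipeline (4th-order tensor → decomposition-based features → classifier → repeated subsampling) can be sketched as follows on synthetic data. The abstract does not specify the decomposition, so mode-1 unfolding followed by a truncated SVD stands in for it here, and the "tensor" is random noise with an injected weak class signal:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the 4th-order tensor: embryos x height x width x time.
n_embryos = 40
tensor = rng.normal(size=(n_embryos, 8, 8, 10))
labels = rng.integers(0, 2, size=n_embryos)          # live birth vs nonpregnant
tensor[labels == 1] += 0.3                           # weak class-dependent signal

# Mode-1 unfolding: one row of pixel-time values per embryo.
unfolded = tensor.reshape(n_embryos, -1)

# Truncated SVD as a simple multilinear feature extractor.
U, S, Vt = np.linalg.svd(unfolded - unfolded.mean(axis=0), full_matrices=False)
features = U[:, :5] * S[:5]                          # first 5 components

# Classifier evaluated by repeated cross-validation, as in the paper's
# random-subsampling scheme.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, features, labels, cv=5).mean()
```

The paper's random-grouping control corresponds to shuffling `labels` before the split: with no real signal, accuracy collapses to chance, which is what makes the 80% vs 50% contrast informative.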

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 92
22711 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article investigates the use of the gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: Data were drawn from the records of patients with liver cirrhosis who were scheduled for liver transplantation during the one-year study period (May 2008-May 2009) and who were followed up for at least seven years at Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied, with parametric models, namely the Exponential and Weibull distributions, considered for survival time. Data analysis was performed using R software, and a significance level of 0.05 was used for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%). Overall, the mean survival over the seven-year follow-up was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients.
Conclusion: To investigate the effective factors for the time of death of patients with liver cirrhosis in the presence of latent variables, gamma frailty models with parametric baseline distributions appear desirable.
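
Although the study was run in R, the core of the model can be sketched in any language: with a gamma frailty of variance theta, the marginal survival function of a Weibull baseline is S(t) = (1 + theta*H(t))^(-1/theta), and the parameters can be estimated by maximising the marginal likelihood. The sketch below uses synthetic right-censored data standing in for the (non-public) cirrhosis cohort and omits the covariates:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulated conditional Weibull event times with a multiplicative gamma frailty.
n, k_true, lam_true, theta_true = 400, 1.3, 0.03, 0.5
z = rng.gamma(1.0 / theta_true, theta_true, n)       # frailty: mean 1, var theta
e = rng.exponential(1.0, n)
t = (e / z) ** (1.0 / k_true) / lam_true             # event times (months)
c = rng.uniform(10.0, 60.0, n)                       # censoring times
obs = np.minimum(t, c)
event = (t <= c).astype(float)

def negloglik(params):
    """Negative marginal log-likelihood of the Weibull/gamma-frailty model,
    with H(t) = (lam*t)**k and h(t) = k * lam**k * t**(k-1)."""
    k, lam, theta = np.exp(params)                   # log-parameterised > 0
    with np.errstate(over="ignore", invalid="ignore"):
        H = (lam * obs) ** k
        log_h = np.log(k) + k * np.log(lam) + (k - 1.0) * np.log(obs)
        ll = (event * (log_h - (1.0 / theta + 1.0) * np.log1p(theta * H))
              - (1.0 - event) * (1.0 / theta) * np.log1p(theta * H))
        nll = -ll.sum()
    return nll if np.isfinite(nll) else 1e10

fit = minimize(negloglik, x0=np.zeros(3), method="Nelder-Mead")
k_hat, lam_hat, theta_hat = np.exp(fit.x)
```

Covariates such as age or serum bilirubin enter by scaling the hazard, e.g. lam replaced by lam * exp(x'beta); setting k = 1 recovers the Exponential baseline the study also fitted.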

Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution

Procedia PDF Downloads 261