Search results for: Data Envelopment Analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 41012

39362 A Comprehensive Comparative Study on Seasonal Variation of Parameters Involved in Site Characterization and Site Response Analysis by Using Microtremor Data

Authors: Yehya Rasool, Mohit Agrawal

Abstract:

Site characterization and site response analysis are crucial steps for reliable seismic microzonation of an area, so the basic parameters involved in these fundamental steps must be chosen properly in order to efficiently characterize the vulnerable sites of the study region. In this study, efforts are made to delineate the variations in the physical parameters of the soil between the summer and monsoon seasons of 2021 by using Horizontal-to-Vertical Spectral Ratios (HVSRs) recorded at five sites of the Indian Institute of Technology (Indian School of Mines), Dhanbad, Jharkhand, India. Data recording at each site was carried out so as to minimize the amount of anthropogenic noise captured. The analysis covers six seismic parameters, namely predominant frequency, H/V ratio, phase velocity of Rayleigh waves, shear wave velocity (Vs), compressional wave velocity (Vp), and Poisson’s ratio, for both seasons of the year. The results show that these parameters vary drastically for the upper layers of soil, which in turn may affect the amplification ratios and probability of exceedance obtained from seismic hazard studies. The HVSR peak is higher in the monsoon, with a shift in predominant frequency compared to the summer season of 2021. A drastic reduction in shear wave velocity (down to ~10 m depth) of approximately 7%-15% is also observed during the monsoon period, along with a slight decrease in compressional wave velocity. Poisson’s ratios are generally higher during the monsoon than in the summer period. Our study may be very beneficial to various agricultural and geotechnical engineering projects.
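
As background, the HVSR technique the abstract relies on divides the horizontal-component amplitude spectrum by the vertical one and reads the predominant frequency off the peak. The sketch below illustrates the idea on synthetic data; the window length, sampling rate, and signal values are illustrative assumptions, not the study's recordings:

```python
import cmath, math

def amplitude_spectrum(signal):
    """Naive DFT amplitude spectrum (fine for short demo windows)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def hvsr(ns, ew, v):
    """H/V ratio per frequency bin: geometric mean of horizontals over vertical."""
    a_ns, a_ew, a_v = map(amplitude_spectrum, (ns, ew, v))
    return [math.sqrt(h1 * h2) / max(av, 1e-12)
            for h1, h2, av in zip(a_ns, a_ew, a_v)]

# Synthetic demo: horizontals resonate at DFT bin 8, vertical at bin 20
n, fs = 256, 100.0
t = [i / fs for i in range(n)]
ns = [math.sin(2 * math.pi * (8 * fs / n) * x) for x in t]
ew = list(ns)
v = [0.3 * math.sin(2 * math.pi * (20 * fs / n) * x) for x in t]
ratio = hvsr(ns, ew, v)
peak_bin = max(range(1, len(ratio)), key=lambda k: ratio[k])
predominant_freq = peak_bin * fs / n  # Hz
```

A real workflow would window the record, smooth the spectra, and average many windows; the peak-picking step is the same.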

Keywords: HVSR, shear wave velocity profile, Poisson ratio, microtremor data

Procedia PDF Downloads 69
39361 CFD Simulation of Urban Environments for Evaluating the Wind Energy Potential of a Building or a New Urban Plan

Authors: David Serero, Loic Couton, Jean-Denis Parisse, Robert Leroy

Abstract:

This paper presents a method for analyzing airflow at the periphery of several typologies of architectural volumes. To understand the influence of the complex urban environment on airflows in the city, we compared three sites at different architectural scales. The research sets out a method to identify the optimal location for installing wind turbines on the edges of a building and to improve the energy extracted through precise localization of an accelerating wing called an “aerofoil”. The objective is to define principles for the installation of wind turbines and for the natural ventilation design of buildings. Instead of theoretical wind analysis, we combined numerical aeraulic simulations using STAR-CCM+ software with wind data collected over long periods of time (greater than one year). While computational fluid dynamics (CFD) simulations of airflow around buildings are common practice, we calibrated a virtual wind tunnel against wind data from in situ anemometers (to establish a localized cartography of urban winds). We can then develop a complete volumetric model of the behavior of the wind over a roof area, or over an entire urban block. With this method, we can characterize: the different types of wind in urban areas, including the minimum and maximum wind spectrum; the type of harvesting device to select; its fixing to the roof of a building; the altimetry of the device in relation to roof levels; and the potential nuisances around it. The study is carried out by recovering a geolocated data flow and connecting this information with the technical specifications of wind turbines, their energy performance, and their cut-in speed. With this method, we can define the characteristics of wind turbines that maximize their performance on urban sites and in a turbulent airflow regime. We also study the installation of a wind accelerator associated with buildings: the integrated aerofoils control the speed of the air, orient it onto the wind turbine, accelerate it, and, thanks to their profile, hide the device on the roof of the building.
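
The siting analysis above ultimately feeds the standard wind power relation P = ½ρACpv³, which is why small local gains in wind speed matter so much. The sketch below is background arithmetic, not the paper's model; the air density, rotor area, and power coefficient are illustrative assumptions:

```python
def turbine_power_w(v, rho=1.225, area_m2=3.0, cp=0.35):
    """Extractable power (W): P = 0.5 * rho * A * Cp * v^3."""
    return 0.5 * rho * area_m2 * cp * v ** 3

# Doubling the wind speed yields 8x the power: the cubic term is why
# roof-edge acceleration devices such as aerofoils pay off.
gain = turbine_power_w(6.0) / turbine_power_w(3.0)
```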

Keywords: wind energy harvesting, wind turbine selection, urban wind potential analysis, CFD simulation for architectural design

Procedia PDF Downloads 129
39360 The Changes in Motivations and the Use of Translation Strategies in Crowdsourced Translation: A Case Study on Global Voices’ Chinese Translation Project

Authors: Ya-Mei Chen

Abstract:

Online crowdsourced translation, an innovative translation practice brought about by Web 2.0 technologies and the democratization of information, has become increasingly popular in the Internet era. Carried out by grassroots internet users, crowdsourced translation has fundamentally different features from its offline, traditional counterpart, such as voluntary participation and parallel collaboration. To better understand such a participatory and collaborative nature, this paper will use the online Chinese translation project of Global Voices as a case study to investigate the following issues: (1) the changes in volunteer translators’ and reviewers’ motivations for participation, (2) translators’ and reviewers’ use of translation strategies, and (3) the correlations of translators’ and reviewers’ motivations and strategies with the organizational mission, the translation style guide, the translator-reviewer interaction, the mediation of the translation platform, and the various types of capital within the translation field. To systematically explore these three issues, this paper will collect both quantitative and qualitative data and then draw upon Engeström’s activity theory and Bourdieu’s field theory as a theoretical framework to analyze the data in question. An online anonymous questionnaire will be conducted to obtain the quantitative data. The questionnaire will contain questions related to volunteer translators’ and reviewers’ backgrounds, participation motivations, translation strategies, and mutual relations, as well as the operation of the translation platform. The qualitative data will come from (1) a comparative study between some English news texts published on Global Voices and their Chinese translations, (2) an analysis of the online discussion forum associated with Global Voices’ Chinese translation project, and (3) information about the project’s translation mission and guidelines. 
It is hoped that this research, through a detailed sociological analysis of a cause-driven crowdsourced translation project, can enable translation researchers and practitioners to adequately meet the translation challenges appearing in the digital age.

Keywords: crowdsourced translation, Global Voices, motivation, translation strategies

Procedia PDF Downloads 357
39359 In-Service High School Teachers’ Experiences of a Blended Teaching Approach to Mathematics

Authors: Lukholo Raxangana

Abstract:

Teaching in the Fourth Industrial Revolution (4IR) era offers in-service mathematics teachers opportunities to use blended approaches to engage learners while teaching mathematics. This study explores in-service high school teachers' experiences with a blended teaching approach to mathematics. This qualitative case study involved eight in-service teachers from four selected schools in the Sedibeng West District of Gauteng Province. The study used the community of inquiry model as its analytical framework for data analysis. Data were collected through semi-structured interviews and focus-group discussions exploring in-service teachers' experiences of the influence of blended teaching (BT) on learning mathematics. The study's findings concern the impact of load-shedding, the benefits of BT, in-service teachers' perceptions of BT, and its hindrances. Based on these findings, the study recommends that further research focus on developing data-free BT tools to assist during load-shedding, regardless of location.

Keywords: blended teaching, teachers, in-service, mathematics

Procedia PDF Downloads 44
39358 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is essential to reveal all software issues; unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
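
The iFDAQ's debugger itself is C++/Qt and not reproduced here; as a minimal sketch of the signal-handling idea the abstract describes (intercept a fatal signal, write a report for later analysis instead of dying silently), a Python version might look like this. The report path and the choice of signals are assumptions for illustration:

```python
import datetime
import signal
import sys
import traceback

REPORT_PATH = "daq_debug_report.txt"  # hypothetical report location

def write_report(signum, frame):
    """On a fatal signal, append a timestamped stack report, then exit."""
    with open(REPORT_PATH, "a") as f:
        f.write(f"--- {datetime.datetime.now().isoformat()} signal {signum} ---\n")
        traceback.print_stack(frame, file=f)
    sys.exit(1)

# Install the handler for the signals we want to intercept.
for sig in (signal.SIGTERM, signal.SIGABRT):
    signal.signal(sig, write_report)
```

A production tool would also capture process state (queues, buffer fill levels) in the report, which is where the "all necessary information" of the abstract comes in.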

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 267
39357 Analysis of Computer Science Papers Conducted by Board of Intermediate and Secondary Education at Secondary Level

Authors: Ameema Mahroof, Muhammad Saeed

Abstract:

The purpose of this study was to analyze the computer science papers conducted by the Board of Intermediate and Secondary Education with reference to Bloom’s taxonomy. The study has two parts. First, the papers conducted by the Board of Intermediate and Secondary Education are analyzed against the basic rules of item construction, especially Bloom’s (1956) taxonomy. Second, item analysis is carried out to improve the psychometric properties of a test. The sample included the computer science question papers of higher secondary classes (XI-XII) for the years 2011 and 2012. For item analysis, data were collected from 60 students through convenience sampling. Findings of the study revealed that the papers by the Board of Intermediate and Secondary Education focused mostly on the knowledge and understanding levels, with very little focus on application, analysis, and synthesis. Furthermore, item analysis of the question papers revealed unbalanced item difficulty: a few items were very difficult, while most were too easy (measuring knowledge and understanding abilities). Likewise, most items did not truly discriminate between high and low achievers; four items were even negatively discriminating. The researchers also analyzed the items with the ConQuest software. These results show that the papers conducted by the Board of Intermediate and Secondary Education were not well constructed. It is recommended that paper setters be trained in developing question papers that measure the various cognitive abilities of students, so that a good computer science paper assesses all of them.
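
The item statistics the abstract reports, difficulty and discrimination, have standard textbook definitions: difficulty is the proportion answering correctly, and a common discrimination index compares the top and bottom 27% of examinees by total score. A sketch on made-up response data (not the study's 60-student sample):

```python
def item_difficulty(responses):
    """Proportion of examinees answering correctly (1 = correct, 0 = wrong)."""
    return sum(responses) / len(responses)

def item_discrimination(item, totals, frac=0.27):
    """Upper-lower index: p(upper group) - p(lower group), groups = top/bottom 27%."""
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    k = max(1, round(frac * len(totals)))
    lower, upper = order[:k], order[-k:]
    p = lambda grp: sum(item[i] for i in grp) / len(grp)
    return p(upper) - p(lower)

# Toy data: one item's scores for 10 students, plus their total test scores
item = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
totals = [95, 90, 88, 60, 75, 40, 35, 80, 30, 25]
difficulty = item_difficulty(item)        # 0.5: neither too easy nor too hard
discrimination = item_discrimination(item, totals)
```

A negative discrimination value, as the abstract found for four items, would mean low scorers answered the item correctly more often than high scorers.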

Keywords: Bloom’s taxonomy, question paper, item analysis, cognitive domain, computer science

Procedia PDF Downloads 132
39356 A Comparative Analysis of the Factors Determining Improvement and Effectiveness of Mediation in Family Matters Regarding Child Protection in Australia and Poland

Authors: Beata Anna Bronowicka

Abstract:

Purpose: The purpose of this paper is to improve the effectiveness of mediation in family matters regarding child protection in Australia and Poland. Design/methodology/approach: The methodological approach is phenomenology. Two phenomenological methods of data collection were used in this research: (1) doctrinal research and (2) interviews. The doctrinal research forms the basis for obtaining information on mediation and on the date this alternative dispute resolution method was introduced into the Australian and Polish legal systems. No less important was the analysis of legislation and legal doctrine in the field of mediation in family matters, especially child protection. In the second method, data were collected through semi-structured interviews. The collected data were translated from Polish into English and analysed using a software program. Findings: The rights of children in the context of mediation in Australia and Poland differ from the recommendations of the UN Committee on the Rights of the Child, which require that children be included in all matters that concern them. There is room for improvement in the mediation process by strengthening children's rights in mediation between parents in matters related to children. Children should have the right to express their opinion, similarly to their position in court proceedings. Another challenge in mediation is better understanding the roles of the professionals involved, such as lawyers and mediators. Originality/value: The research is anticipated to be of particular benefit to parents, society as a whole, and professionals working in mediation. These results may also be helpful during further legislative initiatives in this area.

Keywords: mediation, family law, children's rights, Australian and Polish family law

Procedia PDF Downloads 60
39355 Single-Cell Visualization with Minimum Volume Embedding

Authors: Zhenqiu Liu

Abstract:

Visualizing the heterogeneity within cell populations in single-cell RNA-seq data is crucial for studying the functional diversity of cells. However, because of high levels of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low dimension. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, solving the underlying semi-definite program (SDP) is computationally inefficient when the sample size is large, so MVE is not directly applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm based on an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations more accurately in single-cell data sets than other existing dimension reduction methods.
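
The paper's MVE solver is an SDP and is not reproduced here; as a generic illustration of the accelerated proximal gradient mechanics it invokes (gradient step, proximal step, momentum extrapolation), here is the same scheme applied to a tiny lasso problem. The problem data and step-size bound are assumptions for the demo:

```python
import math

def soft_threshold(x, lam):
    """Proximal operator of lam * |x|."""
    return math.copysign(max(abs(x) - lam, 0.0), x)

def fista_lasso(A, b, lam, steps=500):
    """Accelerated proximal gradient (FISTA) for min 0.5||Ax-b||^2 + lam*||x||_1."""
    m, n = len(A), len(A[0])
    # Crude Lipschitz bound for the gradient: squared Frobenius norm of A
    L = sum(A[i][j] ** 2 for i in range(m) for j in range(n)) or 1.0
    x = [0.0] * n; y = list(x); t = 1.0
    for _ in range(steps):
        r = [sum(A[i][j] * y[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]      # A^T r
        x_new = [soft_threshold(y[j] - g[j] / L, lam / L) for j in range(n)]
        t_new = (1 + math.sqrt(1 + 4 * t * t)) / 2                          # momentum
        y = [x_new[j] + (t - 1) / t_new * (x_new[j] - x[j]) for j in range(n)]
        x, t = x_new, t_new
    return x

A = [[1.0, 0.0], [0.0, 1.0]]
b = [2.0, 0.05]
x = fista_lasso(A, b, lam=0.1)  # closed form here: soft_threshold(b_j, 0.1)
```

The appeal for large sample sizes is the same as in the paper: each iteration costs only matrix-vector products and a cheap proximal step, instead of a full SDP solve.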

Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method

Procedia PDF Downloads 214
39354 Ergonomic Study of Hand-Arm Vibration Exposure in a Gear Manufacturing Plant in India

Authors: Santosh Kumar, M. Muralidhar

Abstract:

The term ‘ergonomics’ is derived from two Greek words: ‘ergon’, meaning work, and ‘nomoi’, meaning natural laws. Ergonomics is the study of how working conditions, machines, and equipment can be arranged so that people can work with them more efficiently. In this research communication, an attempt has been made to study the effect of hand-arm vibration exposure on the workers of a gear manufacturing plant by comparing potential Carpal Tunnel Syndrome (CTS) symptoms and the effect of different vibration exposure levels on the occurrence of CTS in an actual industrial environment. The Chi-square test and correlation analysis were used for statistical analysis. The Chi-square test showed that the occurrence of potential CTS symptoms depends significantly on the level of vibration exposure. Data analysis indicates that 40.51% of workers with potential CTS symptoms are exposed to vibration. Correlation analysis reveals that potential CTS symptoms are significantly correlated with exposure levels of vibration from handheld tools and with repetitive wrist movements.
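
The Chi-square test of independence used here reduces to comparing observed cell counts against the counts expected under independence. A self-contained sketch for a 2x2 table; the counts are hypothetical, not the study's data:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_tot = [a + b, c + d]
    col_tot = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_tot[i] * col_tot[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = exposed / not exposed, cols = symptoms / no symptoms
table = [[32, 18],
         [11, 39]]
stat = chi_square_2x2(table)
significant = stat > 3.841  # chi-square critical value, df = 1, alpha = 0.05
```

A statistic above the critical value, as here, is what "significantly dependent on the level of vibrational exposure" means operationally.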

Keywords: CTS symptoms, hand-arm vibration, ergonomics, physical tests

Procedia PDF Downloads 357
39353 Cloud Data Security Using Map/Reduce Implementation of Secret Sharing Schemes

Authors: Sara Ibn El Ahrache, Tajje-eddine Rachidi, Hassan Badir, Abderrahmane Sbihi

Abstract:

Recently, there has been increasing confidence in the favorable use of big data drawn from the huge amount of information deposited in cloud computing systems. Data kept on such systems can be retrieved through the network at the user’s convenience. However, the data that users send include private information, and therefore information leakage from these data is now a major social problem. The use of secret sharing schemes for cloud computing has lately been shown to be relevant: users distribute their data across several servers. Notably, in a (k,n) threshold scheme, data security is assured if and only if, throughout the whole life of the secret, the opponent cannot compromise k or more of the n servers. A number of secret sharing algorithms have been suggested to deal with these security issues. In this paper, we present a MapReduce implementation of Shamir’s secret sharing scheme to increase its performance and to achieve optimal security for cloud data. Different tests were run, demonstrating the contributions of the proposed approach, which are quite considerable in terms of both security and performance.
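
For reference, Shamir's (k,n) scheme embeds the secret as the constant term of a random degree-(k-1) polynomial over a prime field, hands out n evaluations as shares, and reconstructs via Lagrange interpolation at zero. A plain single-machine sketch (the paper's contribution is distributing exactly this over MapReduce; the prime below is an illustrative choice):

```python
import random

P = 2 ** 127 - 1  # a Mersenne prime field; real deployments size it to the data

def split(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    eval_poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, eval_poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret

shares = split(123456789, k=3, n=5)
```

In the MapReduce setting, share generation parallelizes naturally over data blocks (map), while reconstruction gathers any k shares per block (reduce).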

Keywords: cloud computing, data security, MapReduce, Shamir's secret sharing

Procedia PDF Downloads 283
39352 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. 
It acknowledges that big data processing requires distributed and parallel computing capabilities that span cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
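
The workload-placement and cost-modeling idea mentioned above can be reduced to a toy decision rule: estimate each provider's compute-plus-egress cost for a workload and pick the minimum. All prices below are made-up illustrative numbers, not real cloud pricing, and a real model would add latency and compliance constraints:

```python
# Hypothetical per-provider price sheet (illustrative values only)
PROVIDERS = {
    "cloud_a": {"compute_per_hr": 0.90, "egress_per_gb": 0.09},
    "cloud_b": {"compute_per_hr": 1.10, "egress_per_gb": 0.05},
    "cloud_c": {"compute_per_hr": 0.70, "egress_per_gb": 0.12},
}

def place_workload(hours, egress_gb):
    """Pick the provider minimizing predicted compute + egress cost."""
    cost = lambda p: p["compute_per_hr"] * hours + p["egress_per_gb"] * egress_gb
    best = min(PROVIDERS, key=lambda name: cost(PROVIDERS[name]))
    return best, round(cost(PROVIDERS[best]), 2)

compute_heavy = place_workload(hours=100, egress_gb=10)   # cheap compute wins
egress_heavy = place_workload(hours=1, egress_gb=5000)    # cheap egress wins
```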

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 48
39351 The Low-Cost Design and 3D Printing of Structural Knee Orthotics for Athletic Knee Injury Patients

Authors: Alexander Hendricks, Sean Nevin, Clayton Wikoff, Melissa Dougherty, Jacob Orlita, Rafiqul Noorani

Abstract:

Knee orthotics play an important role in aiding the recovery of those with knee injuries, especially athletes. However, structural knee orthotics are often very expensive, ranging between $300 and $800. The primary question of this project was: can 3D-printed orthotics represent a viable and cost-effective alternative to present structural knee orthotics? The primary objective was to design a knee orthotic for athletes with knee injuries at a low cost, under $100, and to evaluate its effectiveness. The initial design of the orthotic was done in SolidWorks, a computer-aided design (CAD) software available at Loyola Marymount University. After this design was completed, finite element analysis (FEA) was used to understand how the normal stresses placed upon the knee affected the orthotic. The knee orthotic was then adjusted and redesigned to meet a specified factor of safety of 3.25, based on the data gathered during FEA and on literature sources. Once the FEA was completed and the orthotic redesigned accordingly, the next step was to 3D print the first design of the knee brace. Subsequently, physical therapy movement trials were used to evaluate physical performance. Using the data from these movement trials, the CAD design of the brace was refined to meet the design requirements. The final goal of this research is to explore the possibility of replacing high-cost, outsourced knee orthotics with a readily available low-cost alternative.
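
The factor-of-safety target of 3.25 quoted above is the usual ratio of material strength to the peak working stress reported by FEA. A one-line check; the yield strength and peak stress below are illustrative assumptions (roughly PLA-like numbers), not values from the study:

```python
def factor_of_safety(yield_strength_mpa, max_stress_mpa):
    """FoS = material yield strength / peak working stress from FEA."""
    return yield_strength_mpa / max_stress_mpa

# Illustrative only: assumed yield strength and a hypothetical FEA peak stress
fos = factor_of_safety(yield_strength_mpa=50.0, max_stress_mpa=14.0)
meets_target = fos >= 3.25  # design target quoted in the abstract
```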

Keywords: 3D printing, knee orthotics, finite element analysis, design for additive manufacturing

Procedia PDF Downloads 162
39350 Numerical Modeling for Water Engineering and Obstacle Theory

Authors: Mounir Adal, Baalal Azeddine, Afifi Moulay Larbi

Abstract:

Numerical analysis is a branch of mathematics devoted to the development of iterative matrix calculation techniques. Our objective is to optimize these operations so as to calculate and solve systems of equations of order n while saving time and energy on the computers used to analyze big data through matrix equations. This scientific discipline produces results with an approximation error characterized by convergence rates. The results obtained from numerical analysis techniques implemented in software such as MATLAB or Simulink offer a preliminary diagnosis of the state of the target environment or space. From this, we can provide the technical procedures needed for engineering or scientific studies, exploitable by water engineers.
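
As a concrete instance of the iterative matrix techniques described, here is the classic Jacobi iteration for Ax = b: each sweep updates every unknown from the previous sweep's values, and the increments shrink at the convergence rate mentioned above. The small system is an illustrative example, not from the paper:

```python
def jacobi(A, b, tol=1e-10, max_iter=1000):
    """Jacobi iteration for Ax = b (converges for diagonally dominant A)."""
    n = len(A)
    x = [0.0] * n
    for _ in range(max_iter):
        x_new = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
                 for i in range(n)]
        if max(abs(x_new[i] - x[i]) for i in range(n)) < tol:
            return x_new
        x = x_new
    return x

A = [[4.0, 1.0],
     [1.0, 3.0]]
b = [9.0, 7.0]
x = jacobi(A, b)  # exact solution: (20/11, 19/11)
```

Each sweep touches only the matrix entries once, which is the time-and-energy argument for iterative methods over direct elimination on large systems.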

Keywords: numerical analysis methods, obstacles solving, engineering, simulation, numerical modeling, iteration, computer, MATLAB, water, underground, velocity

Procedia PDF Downloads 442
39349 Multi-Temporal Analysis of Vegetation Change within Highly Contaminated Watersheds by Superfund Sites in Wisconsin

Authors: Punwath Prum

Abstract:

Superfund sites are publicly recognized as a severe environmental problem for surrounding communities and biodiversity because of the hazardous chemical waste left by industrial activities. A Superfund site not only contaminates soil and water but is also a leading potential point source of pollution affecting the ecosystem of its watershed. The risks Superfund sites pose to watersheds can be effectively measured using publicly available data and geospatial analysis in free and open-source applications. This study analyzed vegetation change within highly contaminated watersheds in Wisconsin. The highest-risk watersheds were identified as those containing the largest number of Superfund sites. The study identified two at-risk watersheds in Lafayette and analyzed the temporal changes of vegetation within these areas based on the Normalized Difference Vegetation Index (NDVI). Raster statistics were used to compare the change in NDVI values over the period. The analysis showed that NDVI values within the Superfund sites’ boundaries were significantly lower than in the nearby surroundings, providing an analogue for the environmental hazard posed by chemical contamination at Superfund sites.
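
The NDVI used above is a per-pixel ratio of near-infrared to red reflectance, (NIR - Red) / (NIR + Red): dense healthy vegetation scores high, stressed or bare ground scores low. A minimal sketch with hypothetical reflectance values (real work would run this over raster bands):

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red) per pixel; guard against zero division."""
    return [(n - r) / (n + r) if (n + r) else 0.0 for n, r in zip(nir, red)]

# Hypothetical reflectances for pixels inside vs. outside a contaminated site
inside = ndvi(nir=[0.30, 0.28], red=[0.20, 0.19])
outside = ndvi(nir=[0.50, 0.55], red=[0.08, 0.07])

mean = lambda v: sum(v) / len(v)
lower_inside = mean(inside) < mean(outside)  # the pattern the study reports
```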

Keywords: soil contamination, spatial analysis, watershed

Procedia PDF Downloads 123
39348 Implementation of the Association Rule Method in Determining the Layout of Qita Supermarket as a Strategy in the Competitive Retail Industry in Indonesia

Authors: Dwipa Rizki Utama, Hanief Ibrahim

Abstract:

The retail industry in Indonesia is developing very fast, and various strategies have been undertaken to boost customer satisfaction and purchase productivity in order to increase profit; one of them is the layout strategy. The purpose of this study is to determine the layout of Qita supermarket, part of the retail industry in Indonesia, in order to improve customer satisfaction and to maximize product sales as a whole, so that even infrequently purchased products will be bought. This research uses a literature study method together with association rule mining, a data mining method applied in market basket analysis. After pre-processing, 100 of 160 transactions were used, covering the 26 departments corresponding to the previous layout. From these data, the association rule method reveals which items customers purchase at the same time, so that the supermarket layout can be determined from customer behavior. Using the RapidMiner software with a minimum support of 25% and a minimum confidence of 30%, the analysis showed that department 14 is purchased at the same time as department 10, department 21 with department 13, department 15 with department 12, department 14 with department 12, and department 10 with department 14. From these results, a better supermarket layout than the previous one can be arranged.
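
The support and confidence thresholds quoted above have simple definitions: support is the fraction of baskets containing an itemset, and confidence of a rule X→Y is support(X∪Y)/support(X). A minimal sketch on toy baskets of department IDs (not the study's transactions):

```python
def support(transactions, items):
    """Fraction of baskets containing every item in `items`."""
    items = set(items)
    return sum(items <= basket for basket in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of rule antecedent -> consequent."""
    joint = set(antecedent) | set(consequent)
    return support(transactions, joint) / support(transactions, antecedent)

# Toy market baskets over department IDs
baskets = [{10, 14}, {10, 14, 21}, {13, 21}, {12, 15}, {10, 14, 12}]
s = support(baskets, {10, 14})         # 3 of 5 baskets
c = confidence(baskets, {10}, {14})    # dept 10 always appears with dept 14
rule_kept = s >= 0.25 and c >= 0.30    # thresholds used in the abstract
```

Rules passing both thresholds are the "purchased at the same time" pairs the study used to co-locate departments.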

Keywords: industry retail, strategy, association rule, supermarket

Procedia PDF Downloads 171
39347 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the design, from scratch, of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and corresponding health data can be fed automatically, via sensor APIs, or manually, as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system allows trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for academic purposes (learning and experimenting) or for practical purposes by building on it.
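
The "daily snapshot of user health metrics against target goals" step reduces to comparing each stored metric with its goal document. A PyMongo-free sketch of that formatting logic; the field names and values mimic the JSON documents the abstract describes but are assumptions:

```python
def daily_snapshot(metrics, goals):
    """Format each tracked metric against its target goal.

    `metrics` and `goals` stand in for the JSON documents the back-end
    would fetch from Mongo via PyMongo.
    """
    lines = []
    for name, value in sorted(metrics.items()):
        goal = goals.get(name)
        status = "met" if goal is not None and value >= goal else "below"
        lines.append(f"{name}: {value} (goal {goal}, {status})")
    return lines

snap = daily_snapshot({"steps": 8500, "sleep_hours": 6.0},
                      {"steps": 8000, "sleep_hours": 7.5})
```

In the real system the two dicts would come from `collection.find_one(...)` calls, with sensor APIs or manual entry writing into the same documents.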

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 370
39346 Challenges and Pitfalls of Nutrition Labeling Policy in Iran: A Policy Analysis

Authors: Sareh Edalati, Nasrin Omidvar, Arezoo Haghighian Roudsari, Delaram Ghodsi, Azizollaah Zargaran

Abstract:

Background and aim: Improving consumer’s food choices and providing a healthy food environment by governments is one of the essential approaches to prevent non-communicable diseases and to fulfill the sustainable development goals (SDGs). The present study aimed to provide an analysis of the nutrition labeling policy as one of the main components of the healthy food environment to provide learning lessons for the country and other low and middle-income countries. Methods: Data were collected by reviewing documents and conducting semi-structured interviews with stakeholders. Respondents were selected through purposive and snowball sampling and continued until data saturation. MAXQDA software was used to manage data analysis. A deductive content analysis was used by applying the Kingdon multiple streams and the policy triangulation framework. Results: Iran is the first country in the Middle East and North Africa region, which has implemented nutrition traffic light labeling. The implementation process has gone through two phases: voluntary and mandatory. In the voluntary labeling, volunteer food manufacturers who chose to have the labels would receive an honorary logo and this helped to reduce the food-sector resistance gradually. After this phase, the traffic light labeling became mandatory. Despite these efforts, there has been poor involvement of media for public awareness and sensitization. Also, the inconsistency of nutrition traffic light colors which are based on food standard guidelines, lack of consistency between nutrition traffic light colors, the healthy/unhealthy nature of some food products such as olive oil and diet cola and the absence of a comprehensive evaluation plan were among the pitfalls and policy challenges identified. 
Conclusions: Strengthening governance by improving collaboration between health and non-health sectors in implementation, greater transparency and truthfulness of nutrition traffic light labeling based on real ingredients, and the application of international and local scientific evidence in any further revision of the program are recommended. Also, developing public awareness campaigns and revising school curricula to improve students’ skills in applying nutrition labels should be highly emphasized.

Keywords: nutrition labeling, policy analysis, food environment, Iran

Procedia PDF Downloads 170
39345 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction

Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto

Abstract:

Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality data to be abstracted, including data elements such as nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool created is used by data abstractors to provide data audits and assess the accuracy and inter-rater reliability of abstraction performed by the abstractors for a health system. This audit tool solution has been developed across 13 registries, including the ACC/NCDR registries, PCI, STS, and Get with the Guidelines. Methodology: The data audit tool was used to audit internal registry abstraction for all data elements, including stress test performed, type of stress test, date of stress test, results of stress test, risk/extent of ischemia, diagnostic catheterization detail, and PCI data elements for the ACC/NCDR PCI registries. It is being used internally across 20 hospital systems, providing abstraction and audit services for them. Results: The data audit tool yielded data accuracy and inter-rater reliability (IRR) scores greater than 95% for 50 PCI registry cases in 2021. Conclusion: The tool is being used internally for surgical societies and across hospital systems. The audit tool enables the abstractor to be assessed by an external abstractor and includes all of the data dictionary fields for each registry.

Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data

Procedia PDF Downloads 88
39344 Child Labour Issue: Practice of Enforcement of the Rights of the Child in Nigeria

Authors: Gift Salawa, Perkins Erhijakpor, Henry Ukwu

Abstract:

This study will explore child labour in Nigeria, a practice capable of affecting the physical and general well-being of children who perform hazardous work. It adopts a qualitative research methodology. Data will be collected through oral interviews and documentary content analysis to examine the application of the Convention on the Rights of the Child (CRC), International Labour Organization (ILO) conventions and the Geneva Declaration to child labour practices in Nigeria. This will include the relevance of present domestic laws on child labour as implemented in Nigeria, together with factors that contribute to the practice of child labour in the country. The oral interview data will be analysed by breaking the interview data into significant statements and themes, comparing and determining the commonalities prevalent in the participants’ views regarding the child labour menace in Nigeria. Presumably, findings from this study will show that a poor educational policy, widespread poverty that is most prevalent among families in the rural areas of the country, and a lack of employment for adults have led to the ineffectiveness of the local child labour laws in Nigeria. This has, in turn, culminated in a near non-implementation of the international instruments on child labour, the CRC, the ILO conventions and the Geneva Declaration, to which the Nigerian government is a signatory. Based on these findings, the study calls on the government of Nigeria to extend its free educational policy from elementary and secondary to tertiary education. The government also has to ensure that offenders against children’s rights face severe punishment.

Keywords: child labour, educational policy, human right, protection right

Procedia PDF Downloads 285
39343 Using a Morlet Wavelet Filter to Denoise a Geoelectric Map of Moroccan Phosphate Deposit ‘Disturbances’

Authors: Saad Bakkali

Abstract:

Morocco is a major producer of phosphate, with an annual output of 19 million tons and reserves in excess of 35 billion cubic meters, representing more than 75% of world reserves. Resistivity surveys have been successfully used in the Oulad Abdoun phosphate basin, where a Schlumberger resistivity survey over an area of 50 hectares was carried out. A new field procedure based on the analytic signal response of resistivity data was tested to deal with the presence of phosphate deposit disturbances. A resistivity map was expected to allow the electrical resistivity signal to be imaged in 2D. The 2D wavelet transform is a standard tool in the interpretation of geophysical potential field data, and it is particularly suitable for denoising, filtering and analyzing singularities in geophysical data. Wavelet transform tools are applied here to the analysis of Moroccan phosphate deposit ‘disturbances’. The wavelet approach applied to modeling surface phosphate ‘disturbances’ was found to be consistently useful.
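As a one-dimensional illustration of the wavelet filtering idea (a simplification of the 2D workflow described above), a single-scale real Morlet convolution suppresses the near-constant background while passing oscillatory anomalies; the scale, sampling step, and signal values below are hypothetical:

```python
import math

def morlet(t, w0=5.0):
    """Real-valued Morlet wavelet (admissibility correction omitted for w0 >= 5)."""
    return math.exp(-0.5 * t * t) * math.cos(w0 * t)

def morlet_filter(signal, scale, dt=1.0):
    """Single-scale Morlet filtering by direct convolution: retains the
    component of `signal` near the wavelet's pass band at `scale` and
    suppresses a locally constant background (the kernel is ~zero-mean)."""
    half = int(4 * scale / dt)  # kernel support ~ +/- 4 standard deviations
    kernel = [morlet(k * dt / scale) * dt / scale for k in range(-half, half + 1)]
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += w * signal[j]
        out.append(acc)
    return out

# A flat (constant-resistivity) profile is suppressed to nearly zero
# away from the edges, where the truncated kernel no longer cancels:
flat = morlet_filter([100.0] * 101, scale=4.0)
```

In the full workflow, the same idea is applied in 2D to the resistivity map, with the scale tuned to the spatial wavelength of the ‘disturbance’ anomalies.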

Keywords: resistivity, Schlumberger, phosphate, wavelet, Morocco

Procedia PDF Downloads 401
39342 Equilibrium, Kinetics, and Thermodynamic Studies on Heavy Metal Biosorption by Trichoderma Species

Authors: Sobia Mushtaq, Firdaus E. Bareen, Asma Tayyeb

Abstract:

This study was conducted to investigate the metal biosorption potential of indigenous Trichoderma species (T. harzianum KS05T01, T. longibrachiatum KS09T03, Trichoderma sp. KS17T09, T. viridi KS17T011, T. atrobruneo KS21T014, and T. citrinoviride) isolated from contaminated soil of the Kasur Tannery Waste Management Agency. The effect of different biosorption parameters, namely initial metal ion concentration, pH, contact time, and incubation temperature, on the biosorption potential of these species was investigated. The metal removal efficiency (E%) and metal uptake capacity (mg/g) increased with the initial metal concentration in the media. The Trichoderma species can tolerate and survive under heavy metal stress up to 800 mg/L. Of the two isotherm models applied to the biosorption data, the Langmuir and the Freundlich, the maximum correlation coefficient (R²) of 1 was found for the Langmuir model, which was thus the better-fitted model for Trichoderma biosorption. Metal biosorption increased with the temperature and pH of the media. The maximum biosorption was observed between 25-30 °C and at pH 6.0-7.5, while the biosorption rate increased from day 3 to day 6 of incubation and then slowed down. The biosorption data were better fitted by pseudo-first-order kinetics during the initial days of biosorption. Thermodynamic parameters, namely the standard Gibbs free energy change (ΔG°), standard enthalpy change (ΔH°), and standard entropy change (ΔS°), were calculated. The results confirmed that heavy metal biosorption by Trichoderma species is an endothermic and spontaneous reaction. FTIR spectral analysis and SEM-EDX analysis of the treated and control mycelium revealed changes in the active functional sites and morphological variations of the outer surface.
The data analysis showed that the high metal tolerance exhibited by Trichoderma species indicates its potential as an efficacious and successful mediator for bioremediation of heavy-metal-polluted environments.
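The Langmuir fit reported above is commonly done in linearized form, Ce/qe = Ce/qmax + 1/(KL·qmax), so that qmax and KL fall out of an ordinary least-squares line. The sketch below uses hypothetical equilibrium data (the study's actual measurements are not reproduced here):

```python
def fit_langmuir(ce, qe):
    """Least-squares fit of the linearized Langmuir isotherm:
    Ce/qe = Ce/qmax + 1/(KL*qmax), so slope = 1/qmax and
    intercept = 1/(KL*qmax)."""
    y = [c / q for c, q in zip(ce, qe)]
    n = len(ce)
    mx, my = sum(ce) / n, sum(y) / n
    slope = (sum((x - mx) * (v - my) for x, v in zip(ce, y))
             / sum((x - mx) ** 2 for x in ce))
    intercept = my - slope * mx
    qmax = 1.0 / slope
    kl = slope / intercept
    return qmax, kl

# Synthetic equilibrium data generated from qmax = 50 mg/g, KL = 0.05 L/mg:
ce = [10.0, 50.0, 100.0, 200.0, 400.0, 800.0]        # mg/L
qe = [50.0 * 0.05 * c / (1 + 0.05 * c) for c in ce]  # mg/g
qmax, kl = fit_langmuir(ce, qe)                      # recovers 50, 0.05
```

With noisy laboratory data the R² of this regression is what the abstract compares against the Freundlich fit.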

Keywords: heavy metal, fungal biomass, biosorption, kinetics

Procedia PDF Downloads 100
39341 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance

Authors: Abdullah Al Farwan, Ya Zhang

Abstract:

In today’s educational arena, it is critical to understand educational data and be able to evaluate important aspects, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. Teachers who are able to predict their students' class performance can use this information to improve their teaching. EDM has evolved into valuable knowledge that can be used for a wide range of objectives; for example, a strategic plan can be used to generate high-quality education. Based on previous data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest, with wrapper-based feature selection, were used on two datasets relating to Portuguese language and mathematics classes. The results showed the effectiveness of data mining methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms mostly lies in the range of 80-94%. Among all the selected classification algorithms, the lowest accuracy was achieved by the Multi-layer Perceptron algorithm, at about 70.45%, and the highest by the Random Forest algorithm, at about 94.10%. This proposed work can assist educational administrators in identifying poor-performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent educational dropout.
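Wrapper-based feature selection, as used above, scores candidate feature subsets by training and evaluating the classifier itself rather than by a proxy statistic. The sketch below uses greedy forward selection with a simple nearest-centroid classifier as a stand-in (the paper's five algorithms and its datasets are not reproduced; the toy data are hypothetical):

```python
def nearest_centroid_acc(train, heldout, feats):
    """Accuracy of a nearest-centroid classifier restricted to `feats`."""
    groups = {}
    for x, y in train:
        groups.setdefault(y, []).append(x)
    centroids = {y: [sum(r[f] for r in rows) / len(rows) for f in feats]
                 for y, rows in groups.items()}
    hits = 0
    for x, y in heldout:
        pred = min(centroids,
                   key=lambda c: sum((x[f] - v) ** 2
                                     for f, v in zip(feats, centroids[c])))
        hits += (pred == y)
    return hits / len(heldout)

def wrapper_forward_selection(train, heldout, n_features):
    """Greedy wrapper: repeatedly add the feature that most improves
    held-out accuracy; stop when no remaining feature helps."""
    selected, best_acc = [], 0.0
    remaining = list(range(n_features))
    while remaining:
        scored = [(nearest_centroid_acc(train, heldout, selected + [f]), f)
                  for f in remaining]
        acc, f = max(scored)
        if acc <= best_acc:
            break
        best_acc = acc
        selected.append(f)
        remaining.remove(f)
    return selected, best_acc

# Feature 0 separates the classes; feature 1 is noise, so only 0 is kept.
train = [([0.0, 5.0], 'fail'), ([1.0, 4.0], 'fail'),
         ([10.0, 5.0], 'pass'), ([11.0, 4.0], 'pass')]
heldout = [([0.5, 6.0], 'fail'), ([10.5, 6.0], 'pass'),
           ([1.2, 3.0], 'fail'), ([9.8, 5.5], 'pass')]
feats, acc = wrapper_forward_selection(train, heldout, 2)
```

Swapping in a stronger learner (e.g. a random forest) changes only the scoring function, not the wrapper loop.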

Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students’ academic performance

Procedia PDF Downloads 147
39340 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of a 180-Degree Interpolation Method

Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito

Abstract:

In general, dynamic SPECT data acquisition needs a few minutes for one rotation. Thus, the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. In order to effectively shorten the interval between data points, we adopted a 180-degree interpolation method, which is already used for the reconstruction of X-ray CT data. In this study, we applied this 180-degree interpolation method to SPECT and investigated its effectiveness. To briefly describe the 180-degree interpolation method: the 180-degree data in the second half of one rotation are combined with the 180-degree data in the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the two rotations. In both a phantom and a patient study, the data points from the interpolated images were in good agreement with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
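The recombination step described above can be sketched as follows, with each rotation represented as a list of projection views ordered by angle (the four-view rotations are a hypothetical simplification of a real acquisition):

```python
def interpolate_projections(rotations):
    """Insert a synthetic 360-degree data set halfway between successive
    rotations: the late (second-half) views of rotation k combined with
    the early (first-half) views of rotation k+1, ordered by angle."""
    out = [rotations[0]]
    for k in range(len(rotations) - 1):
        half = len(rotations[k]) // 2
        # Views for 0..179 deg come from the next rotation (acquired early),
        # views for 180..359 deg from the current rotation (acquired late).
        out.append(rotations[k + 1][:half] + rotations[k][half:])
        out.append(rotations[k + 1])
    return out

# Two rotations of four views each yield one interpolated set in between:
frames = interpolate_projections([[1, 2, 3, 4], [5, 6, 7, 8]])
```

Each interpolated set is then reconstructed like an ordinary 360-degree acquisition, doubling the number of points on the TAC.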

Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA

Procedia PDF Downloads 483
39339 Efforts to Revitalize Piipaash Language: An Explorative Study to Develop Culturally Appropriate and Contextually Relevant Teaching Materials for Preschoolers

Authors: Shahzadi Laibah Burq, Gina Scarpete Walters

Abstract:

Piipaash, a member of the Yuman family, one of the large families of North American languages, is reported as one of the seriously endangered languages of the Salt River Pima-Maricopa Indian Community of Arizona. In a collaborative venture between Arizona State University (ASU) and the Salt River Pima-Maricopa Indian Community (SRPMIC), efforts have been made to revitalize and preserve the Piipaash language and its cultural heritage. The present study is one of several language documentation and revitalization initiatives that the Humanities Lab at ASU has taken. This study was approved to receive a “Beyond the Lab” grant after the researchers successfully created a teaching guide for an early childhood Piipaash storybook during their time working in the Humanities Lab. The current research is an extension of the previous project and focuses on creating customized teaching materials and tools for the teachers and parents of the students of the Early Enrichment Program at SRPMIC. To determine and maximize the usefulness of the teaching materials with regard to their reliability, validity, and practicality in the given context, this research conducts an Environmental Analysis and a Need Analysis: the Environmental Analysis evaluates the situation of the Early Enrichment Program, and the Need Analysis investigates the specific, situated requirements of the teachers in assisting students to build target-language skills. The study employs a qualitative methods approach, using multiple data collection strategies concurrently to gather information from the participants. The research tools include semi-structured interviews with the program administrators and teachers, classroom observations, and teacher shadowing. The researchers use triangulation of the data to maintain validity in the process of data interpretation.
The preliminary results of the study show a need for culturally appropriate materials that can further students' learning of the target language as well as the culture, i.e., clay pots and basket-making materials. It was found that the course and teachers focus on developing the listening and speaking skills of the students. Moreover, to assist the young learners beyond the classroom, the teachers could make use of send-home teaching materials to reinforce learning (i.e., coloring books including illustrations of culturally relevant animals, food, and places). Audio language resources were also identified as helpful additional materials for the parents to assist their children's learning.

Keywords: indigenous education, materials development, need analysis, Piipaash language revitalization

Procedia PDF Downloads 75
39338 Using Inverted 4-D Seismic and Well Data to Characterise Reservoirs from Central Swamp Oil Field, Niger Delta

Authors: Emmanuel O. Ezim, Idowu A. Olayinka, Michael Oladunjoye, Izuchukwu I. Obiadi

Abstract:

Monitoring of reservoir properties prior to well placement and production is a requirement for optimisation and efficient oil and gas production. This is usually done using well log analyses and 3-D seismic, which are often prone to errors. However, 4-D (time-lapse) seismic, incorporating numerous 3-D seismic surveys of the same field with the same acquisition parameters, which portrays the transient changes in the reservoir due to production effects over time, can be utilised because it offers better resolution. There is, however, a dearth of information on the applicability of this approach in the Niger Delta. This study was therefore designed to apply 4-D seismic, well-log and geologic data to the monitoring of reservoirs in the EK field of the Niger Delta. It aimed at locating bypassed accumulations and ensuring effective reservoir management. The field (EK) covers an area of about 1200 km² of early Miocene (18 Ma) age. Data covering two 4-D vintages acquired over a fifteen-year interval were obtained from oil companies operating in the field. The data were analysed to determine the seismic structures, horizons, well-to-seismic tie (WST), and wavelets. Well logs and production history data from fifteen selected wells were also collected from the oil companies. Formation evaluation, petrophysical analysis and inversion, alongside geological data, were undertaken using Petrel, Shell-nDi, Techlog and Jason software. Well-to-seismic ties, formation evaluation and saturation monitoring using petrophysical and geological data and software were used to find bypassed hydrocarbon prospects. The seismic vintages were interpreted, and the amounts of change in the reservoir were defined by the differences between the acoustic impedance (AI) inversions of the base and the monitor seismic. AI rock properties were estimated from all the seismic amplitudes using controlled sparse-spike inversion, and the estimated rock properties were used to produce AI maps.
The structural analysis showed the dominance of NW-SE trending rollover collapsed-crest anticlines in EK, with hydrocarbons trapped northwards. There were good ties in wells EK 27 and EK 39. The analysed wavelets revealed consistent amplitude and phase for the WST; hence, a good match between the inverted impedance and the well data. Evidence of large pay thickness, ranging from 2875 ms (11420 ft TVDSS) to about 2965 ms, was found around the EK 39 well, with good yield properties. The comparison between the base and the current monitor AI, together with the generated AI maps, revealed zones of untapped hydrocarbons and assisted in determining fluid movement. The inverted sections through EK 27 and EK 39 (within 3101-3695 m) indicated depletion in the reservoirs. The extent of the present non-uniform gas-oil contact and oil-water contact movements was from 3554 to 3575 m. The 4-D seismic approach led to better reservoir characterization, well development and the location of deeper and bypassed hydrocarbon reservoirs.
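The time-lapse comparison reduces, at its core, to differencing the base and monitor AI maps and flagging cells whose change exceeds a noise threshold as candidate depletion or bypassed zones. The sketch below uses hypothetical impedance values and an arbitrary threshold, not values from the EK field:

```python
def timelapse_anomalies(base_ai, monitor_ai, threshold):
    """Return (row, col, dAI) for map cells where the acoustic-impedance
    change between base and monitor surveys exceeds `threshold`.
    Positive dAI can indicate water replacing oil; negative dAI, gas
    coming out of solution (interpretation is field-dependent)."""
    anomalies = []
    for i, (row_b, row_m) in enumerate(zip(base_ai, monitor_ai)):
        for j, (b, m) in enumerate(zip(row_b, row_m)):
            d = m - b
            if abs(d) > threshold:
                anomalies.append((i, j, d))
    return anomalies

# A 2x2 toy AI map (units arbitrary); only one cell changes significantly:
base = [[9000.0, 9100.0], [9050.0, 9200.0]]
monitor = [[9005.0, 9600.0], [9055.0, 9190.0]]
zones = timelapse_anomalies(base, monitor, threshold=100.0)
```

In practice the threshold is set from the repeatability noise of the two vintages, and the flagged cells are cross-checked against the well data before being called depletion or bypass.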

Keywords: reservoir monitoring, 4-D seismic, well placements, petrophysical analysis, Niger delta basin

Procedia PDF Downloads 102
39337 Bacteriological Analysis of Logan's Branch Rowan County, Kentucky Utilizing Membrane Filtration Method

Authors: Elizabeth G. Hereford, Geoffrey W. Gearner

Abstract:

Logan’s Branch, within the Triplett Creek Watershed of Rowan County, Kentucky, is a waterway located near important agricultural and residential areas. Part of Logan’s Branch flows over an exposed black shale formation with elevated radioactivity and heavy metals. Three sites were chosen in relation to the formation and sampled five times over a thirty-day period during the recreational season. A fourth site in North Fork in Rowan County, Kentucky was also sampled periodically as it too has contact with the shale formation. These sites were then sampled monthly. All samples are analyzed for concentrations of Escherichia coli, heterotrophic bacteria, and total coliform bacteria utilizing the membrane filtration method and various culture media. Current data suggests that the radioactivity of the shale formation influences the bacteriological growth present in the waterway; however, further data will be collected and compared with that of my colleagues to confirm this trend.

Keywords: bacteriological analysis, Escherichia coli, heterotrophic bacteria, radioactive black shale formation, water quality

Procedia PDF Downloads 169
39336 A Data-Driven Agent-Based Model for the Italian Economy

Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio

Abstract:

We develop a data-driven agent-based model (ABM) for the Italian economy and calibrate the model's initial conditions and parameters. As a preliminary step, we replicate the Monte-Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of disequilibrium patterns arising in the search and matching process for the final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial-condition approach. We perform a robustness analysis, perturbing the system for different parameter setups. We explore the empirical properties of the model using a rolling-window forecast exercise from 2010 to 2022 to observe the model's forecasting ability in the wake of the COVID-19 pandemic. We analyse the properties of the model with different numbers of agents, that is, with different scales of the model compared to the real economy. The model generally displays transient dynamics that properly fit macroeconomic data in terms of forecasting ability. We stress the model with a large set of shocks, namely interest-rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports. In this way, we can explore the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors, and consequently the underlying input-output sectoral interdependence, to stress the economy and observe the long-run projections. In this way, the model can generate endogenous crises due to the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for cyclical endogenous crises to be reproduced in this artificial economy.
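The rolling-window forecast exercise mentioned above can be sketched as follows, with a simple AR(1) regression standing in for the full ABM (the window length and series below are hypothetical): the model is re-estimated on each window and used to produce a one-step-ahead forecast.

```python
def rolling_window_forecast(series, window):
    """One-step-ahead forecasts: fit AR(1) y_t = a + b*y_{t-1} by least
    squares on each rolling window, then predict the next observation."""
    preds = []
    for start in range(len(series) - window):
        w = series[start:start + window]
        x, y = w[:-1], w[1:]          # lagged pairs within the window
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
             if sxx else 0.0)
        a = my - b * mx
        preds.append(a + b * w[-1])   # forecast for series[start + window]
    return preds

# A noiseless AR(1) series y_{t+1} = 0.5*y_t + 1 is forecast exactly:
series = [1.0]
for _ in range(10):
    series.append(0.5 * series[-1] + 1.0)
preds = rolling_window_forecast(series, window=4)
```

In the actual exercise, each "fit" is a full calibration of the ABM on the window's macro data, and the forecast error across windows measures the model's out-of-sample performance.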

Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data

Procedia PDF Downloads 53
39335 Creative Mapping of Land Use and Human Activities: From the Inventories of Factories to the History of the City and Citizens

Authors: R. Tamborrino, F. Rinaudo

Abstract:

Digital technologies offer possibilities to effectively convert historical archives into instruments of knowledge able to provide a guide for the interpretation of historical phenomena. Digital conversion and management of those documents make it possible to add other sources in a unique and coherent model that permits the intersection of different data, opening new interpretations and understandings. Urban history uses, among other sources, the inventories that register human activities in a specific space (e.g. cadastres, censuses, etc.). The geographic localisation of that information inside cartographic supports allows for the comprehension and visualisation of specific relationships between different historical realities, registering both the urban space and the people living there. These links, which merge data and documentation of different natures through a new organisation of the information, can suggest new interpretations of related events. For all these kinds of analysis, the use of GIS platforms today represents the most appropriate answer. The design of the related databases is the key to realising the ad-hoc instrument that facilitates the analysis and the intersection of data of different origins. Moreover, GIS has become the digital platform where it is possible to add other kinds of data visualisation. This research deals with the industrial development of Turin at the beginning of the 20th century. A census of factories realised just prior to WWI provides the opportunity to test the potentialities of GIS platforms for the analysis of urban landscape modifications during the first industrial development of the town. The inventory includes data about location, activities, and people. The GIS is shaped in a creative way, linking different sources and digital systems, with the aim of creating a new type of platform conceived as an interface integrating different kinds of data visualisation.
The data processing makes it possible to link this information to the urban space and also to visualise the growth of the city at that time. The sources related to the urban landscape development in that period are of a different nature. The emerging necessity to build, enlarge, modify and join different buildings to boost the industrial activities, in line with their fast development, is recorded in the official permissions delivered by the municipality and now stored in the Historical Archive of the Municipality of Turin. Those documents, consisting of reports and drawings, contain numerous data on the buildings themselves, including the block where the plot is located, the district, and the people involved, such as the owner, the investor, and the engineer or architect designing the industrial building. All these collected data offer the possibility, first, of rebuilding the process of change of the urban landscape by using GIS and 3D modelling technologies, thanks to access to the drawings (2D plans, sections and elevations) that show the previous and the planned situation. Furthermore, they give access to information for different queries of the linked dataset that could be useful for different research targets, such as economic, biographical, architectural, or demographic studies. By superimposing a layer of the present city, the past meets the present as industrial heritage, and people meet urban history.

Keywords: digital urban history, census, digitalisation, GIS, modelling, digital humanities

Procedia PDF Downloads 175
39334 Contribution of Culture on Divorce Prevention in Indonesia on "New Normal" Era: Study at Batak, Malay and Minangkabau Tribes

Authors: Ikhwanuddin Harahap

Abstract:

This paper investigates the contribution of culture to divorce prevention in Indonesia in the "new normal" era, especially among the Batak, Malay and Minangkabau tribes. This research is qualitative with an anthropological approach. Data were collected by interview and observation techniques. The validity of the data was checked by triangulation, and the data were analyzed by content analysis. The results of the research showed that culture has a strategic role in preventing divorce. The Batak, Malay and Minangkabau, as major ethnic groups in Indonesian culture, have a set of norms and dogmas conveyed at the wedding party, namely that marriage must be eternal and is dissolved only by death. In addition, cultural figures actively become arbiters in resolving family conflicts, such as the Harajaon among the Batak, the Datuk among the Malay and the Mamak among the Minangkabau. Cultural dogmas and cultural figures thus play a very important role in preventing divorce.

Keywords: culture, divorce, prevention, contribution, new normal era

Procedia PDF Downloads 153
39333 End-User Behavior: Analysis of Its Role and Impacts on Energy Savings Achievements

Authors: Margarida Plana

Abstract:

End-user behavior has become one of the main issues to be addressed in energy efficiency projects. Especially in the residential sector, end-users have a direct impact on the achievement of energy-saving targets. This paper is focused on presenting and quantifying the impact of end-user behavior on the basis of real project data. The analysis studies the role of a building's occupants, how their behavior can change the success of energy efficiency projects, and how to limit their impact. The results obtained show two main conclusions. The first is the easier to address: we need to control and limit the end-users' interaction with the equipment operation to be able to reach the fixed targets. The second: as plugged-in equipment is increasing exponentially in the residential sector, big dissemination efforts are needed in order to explain to citizens, through dissemination campaigns, the impact of their day-by-day actions.

Keywords: end-users impacts, energy efficiency, energy savings, impact limitations

Procedia PDF Downloads 337