Search results for: LCA tools and data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27283

26563 Developing an Integrated Clinical Risk Management Model

Authors: Mohammad H. Yarmohammadian, Fatemeh Rezaei

Abstract:

Introduction: Improving patient safety is one of the main priorities in healthcare systems, so clinical risk management in organizations has become increasingly significant. Although several tools have been developed for clinical risk management, each has its own limitations. Aims: This study aims to develop a comprehensive tool in which the advantages of each risk assessment and management tool compensate for the limitations of the others. Methods: The procedure comprised two main stages: development of an initial model during meetings with professors and a literature review, followed by implementation and verification of the final model. Subjects and Methods: This is a quantitative-qualitative study. For the qualitative dimension, the focus group method with an inductive approach was used. To evaluate the results of the qualitative study, quantitative assessment of the two parts of the fourth phase and of the seven phases of the research was conducted. Purposive and stratified sampling of the various teams responsible for the selected process was conducted in the operating room. The final model was verified in eight phases through application of activity breakdown structure, failure mode and effects analysis (FMEA), healthcare risk priority number (RPN), root cause analysis (RCA), fault tree (FT), and Eindhoven Classification Model (ECM) tools. The model was applied to patients admitted to a day-clinic ward of a public hospital for surgery between October 2012 and June. Statistical Analysis Used: Qualitative data were analyzed through content analysis; quantitative analysis was done through checklists and edited RPN tables. Results: After verification of the final model in eight steps, the patient admission process for surgery was developed by focus discussion group (FDG) members in five main phases. Then, with the adopted FMEA methodology, 85 failure modes, along with their causes, effects, and preventive capabilities, were set out in tables.
The tables developed to calculate the RPN index contain three criteria for severity, two for probability, and two for preventability. Three failure modes were above the determined significant-risk threshold (RPN > 250). Over a 3-month period, patient misidentification incidents were the most frequently reported events. The RPN criteria of the misidentification events were compared, and it was found that the varying RPN values for the three reported misidentification events could be determined against the scores predicted in the previous phase. Root causes identified through the fault tree were categorized with the ECM. The wrong-side surgery event was selected by the focus discussion group to propose an improvement action. The most important cause was the lack of planning for the number and priority of surgical procedures. After prioritization of the suggested interventions, a computerized registration system in the health information system (HIS) was adopted to prepare the action plan in the final phase. Conclusion: The complexity of the healthcare industry requires risk managers to have a multifaceted vision. Applying only retrospective or only prospective tools for risk management therefore does not work, and each organization must provide conditions for the potential application of both kinds of methods. The results of this study show that the integrated clinical risk management model can be used in hospitals as an efficient tool to improve clinical governance.
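The RPN scoring described above can be sketched in a few lines. This is a minimal illustration only: the 1-10 scales, the averaging of criteria within each group, and the example failure modes are assumptions, not the study's actual scoring tables; only the multiplicative RPN structure and the RPN > 250 threshold come from the abstract.

```python
# Illustrative healthcare RPN scoring. Criterion lists and 1-10 scales are
# hypothetical; the product-of-factors form and threshold follow the abstract.

def rpn(severity_scores, probability_scores, preventability_scores):
    """Average each criterion group, then multiply the three group scores."""
    def mean(xs):
        return sum(xs) / len(xs)
    return (mean(severity_scores) * mean(probability_scores)
            * mean(preventability_scores))

def significant(failure_modes, threshold=250):
    """Return the failure modes whose RPN exceeds the significant-risk limit."""
    return [name for name, (s, p, v) in failure_modes.items()
            if rpn(s, p, v) > threshold]

# Hypothetical failure modes: (severity, probability, preventability) criteria.
modes = {
    "patient misidentification": ([9, 8, 9], [6, 7], [8, 7]),
    "documentation delay":       ([3, 2, 3], [5, 4], [4, 3]),
}
print(significant(modes))  # only the high-RPN mode is flagged
```

With these made-up scores, only the misidentification mode crosses the 250 threshold, mirroring how the study singled out a handful of modes for root cause analysis.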

Keywords: failure mode and effects analysis, risk management, root cause analysis, model

Procedia PDF Downloads 247
26562 Geographic Information System Cloud for Sustainable Digital Water Management: A Case Study

Authors: Mohamed H. Khalil

Abstract:

Water is one of the most crucial elements influencing human lives and development. Notably, over the last few years, GIS has played a significant role in optimizing water management systems, especially after the exponential development of this sector. In this context, the Egyptian government initiated an advanced ‘GIS-Web Based System’. This system is designed to assist and optimize the complementarity and integration of data among the Call Center, Operation and Maintenance, and Laboratory departments. The core of this system is a unified ‘Data Model’ for all the spatial and tabular data of the corresponding departments. The system provides advanced functionalities such as interactive data collection, dynamic monitoring, multi-user editing, enhanced data retrieval, an integrated workflow, different access levels, and correlative record/track of information. Notably, this cost-effective system contributes significantly not only to the completeness of the base map (93%) and of the water network (87%) in a highly detailed GIS format and to the performance of customer service, but also to reducing day-to-day operating costs (by roughly 5-10%). In addition, the system facilitates data exchange between the different departments (Call Center, Operation and Maintenance, and Laboratory), which allows a better understanding and analysis of complex situations. Furthermore, the system has had a tangible effect on: (i) dynamic environmental monitoring of water quality indicators (ammonia, turbidity, TDS, sulfate, iron, pH, etc.), (ii) improved effectiveness of the different water departments, (iii) efficient advanced analysis, (iv) advanced web-reporting tools (daily, weekly, monthly, quarterly, and annual), (v) tangible planning synthesizing spatial and tabular data; and finally, (vi) a scalable decision support system.
It is worth highlighting that the proposed future plan (second phase) of this system extends its scalability to include integration with the Billing and SCADA departments. This scalability will add advanced functionalities to the existing ones to allow further sustainable contributions.

Keywords: GIS Web-Based, base-map, water network, decision support system

Procedia PDF Downloads 89
26561 Estimation of Service Quality and Its Impact on Market Share Using Business Analytics

Authors: Haritha Saranga

Abstract:

Service quality has recently become an important driver of competition in manufacturing industries, as many products are sold in conjunction with service offerings. With the increase in computational power and data capture capabilities, it has become possible to analyze and estimate various aspects of service quality at a granular level and determine their impact on business performance. In the current study, dealer-level, model-wise warranty data from one of the top two-wheeler manufacturers in India are used to estimate the service quality of individual dealers and its impact on warranty-related costs and sales performance. We collected primary data on warranty costs, number of complaints, monthly sales, types of quality upgrades, etc. from the two-wheeler automaker. In addition, we gathered secondary data on various regions in India, such as petrol and diesel prices and geographic and climatic conditions of the regions where the dealers are located, to control for customer usage patterns. We analyze these primary and secondary data with the help of a variety of analytics tools, such as Auto-Regressive Integrated Moving Average (ARIMA), Seasonal ARIMA, and ARIMAX. Study results, after controlling for a variety of factors such as size, age, region of the dealership, and customer usage pattern, show that service quality influences product sales in a significant manner. A more nuanced analysis reveals the dynamics between product quality and service quality and how their interaction affects sales performance in the Indian two-wheeler industry. We also provide various managerial insights using descriptive analytics and build a model that can provide sales projections using a variety of forecasting techniques.
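The autoregressive idea behind the ARIMA family of models used here can be illustrated with the simplest case, an AR(1) model fitted by ordinary least squares. This is a pared-down sketch, not the study's model: the sales series is synthetic, and a real analysis would use a library such as statsmodels with differencing, seasonal terms, and exogenous regressors.

```python
# Minimal AR(1) sketch of the autoregressive core of ARIMA-type forecasting.
# The monthly sales series is invented for illustration.

def fit_ar1(series):
    """Estimate y[t] = a + b * y[t-1] by ordinary least squares."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def forecast(series, a, b, steps):
    """Iterate the fitted recurrence forward to project future values."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = a + b * last
        out.append(last)
    return out

sales = [100, 104, 107, 111, 113, 116, 120, 121, 125, 128]
a, b = fit_ar1(sales)
print(forecast(sales, a, b, 3))
```

On this trending toy series the fitted slope is close to 1 and the projections continue the upward drift; ARIMA generalizes this by adding differencing and moving-average terms.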

Keywords: service quality, product quality, automobile industry, business analytics, auto-regressive integrated moving average

Procedia PDF Downloads 117
26560 A Critical Look at the Clustered Regularly Interspaced Short Palindromic Repeats Method Based on Different Mechanisms

Authors: R. Sulakshana, R. Lakshmi

Abstract:

Clustered Regularly Interspaced Short Palindromic Repeats and CRISPR-associated proteins (CRISPR/Cas) constitute an adaptive immune system found in bacteria and archaea. It has been modified to serve as a potent gene-editing tool and has found widespread use in genome research because of its accessibility and low cost. Several bioinformatics methods have been created to aid in the design of specific single guide RNAs (sgRNAs), whose high activity is crucial to CRISPR/Cas performance. Various Cas proteins, including Cas1, Cas2, Cas9, and Cas12, have been used to create genome engineering tools because of their programmable sequence specificity. Class 1 and 2 CRISPR/Cas systems, as well as the mechanisms of all known Cas proteins (including Cas9 and Cas12), are discussed in this review paper. In addition, the various CRISPR methodologies and the tools discovered so far are discussed. Finally, the challenges and issues in the CRISPR system, along with future work, are presented.
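One elementary step that the sgRNA-design tools mentioned above all perform can be sketched directly: scanning a DNA strand for NGG PAM sites (the motif SpCas9 requires) and extracting the 20-nucleotide protospacer upstream of each. This is a toy illustration with a made-up sequence; real sgRNA tools additionally score on-target activity and off-target risk on both strands.

```python
# Toy sgRNA candidate scan for SpCas9: find NGG PAM sites on one strand and
# report the 20-nt protospacer upstream of each. Sequence is invented.

def find_sgrna_candidates(dna, spacer_len=20):
    """Return (protospacer, pam, position) for every NGG PAM with room upstream."""
    hits = []
    for i in range(spacer_len, len(dna) - 2):
        pam = dna[i:i + 3]
        if pam[1:] == "GG":          # NGG: any base followed by GG
            hits.append((dna[i - spacer_len:i], pam, i))
    return hits

seq = "ATGCGTACGTTAGCCGATCCGATTGCAGGTACGATCGGTT"
for spacer, pam, pos in find_sgrna_candidates(seq):
    print(pos, pam, spacer)
```

Each hit is a candidate target; downstream filtering (GC content, secondary structure, genome-wide uniqueness) is what distinguishes the bioinformatics tools the review surveys.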

Keywords: gene editing tool, Cas proteins, CRISPR, guide RNA, programmable sequence

Procedia PDF Downloads 103
26559 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Angel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology began with characterization and diagnosis of the demand-planning process as part of production management; new ways of predicting demand through probability techniques, and of calculating the associated error using numerical methods, were then investigated. All of this was based on the behavior of the data. The analysis was carried out under the specific business circumstances of a company in the communications sector, located in the city of Bogota, Colombia. In conclusion, this application made it possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase its client satisfaction rate, reduce stock that had not been in rotation for a long time, code its inventory, and plan reorder points for the replenishment of stock.
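The two ingredients named above, an empirical probability function and a propagated error, can be combined in a short sketch. The demand history below is invented, and the propagation formula shown is the standard first-order rule for a sum of independent terms, not necessarily the exact formulation the authors used.

```python
# Sketch: forecast from the empirical distribution of historical demand, with
# an error bound from propagation of errors. Demand figures are hypothetical.
import math
import statistics

demand = [12, 15, 11, 14, 13, 15, 12, 16, 14, 13]  # monthly units (invented)

def empirical_quantile(data, p):
    """Inverse of the empirical CDF: smallest value whose CDF reaches p."""
    s = sorted(data)
    k = max(0, math.ceil(p * len(s)) - 1)
    return s[k]

# Point forecast: empirical median; service-level stock: 0.95 quantile.
forecast = empirical_quantile(demand, 0.5)
stock = empirical_quantile(demand, 0.95)

# Propagation of error for an h-month total D = d1 + ... + dh of independent
# months: sigma_D = sqrt(h) * sigma_month.
sigma = statistics.stdev(demand)
horizon = 3
total_error = math.sqrt(horizon) * sigma
print(forecast, stock, round(total_error, 2))
```

The 0.95 quantile plays the role of the "adequate stock" the abstract mentions, while the propagated sigma bounds the uncertainty of a multi-month demand total.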

Keywords: demand forecasting, empirical distribution, propagation of error, Bogota

Procedia PDF Downloads 627
26558 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection

Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang

Abstract:

To decrease the grating scale thermal expansion error, a novel method based on multiple temperature detection is proposed. Several temperature sensors are installed on the grating scale, and their temperatures are recorded. The temperature of every point on the grating scale is calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating over the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 µm/10 m, and the accuracy of the machine tool is significantly improved.
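The interpolate-then-integrate scheme described above can be sketched numerically. The sensor positions, temperatures, reference temperature, and expansion coefficient below are illustrative assumptions (a steel-like value), not the paper's measured data; only the structure, piecewise-linear temperature between sensors and the integral of alpha * dT over position, follows the abstract.

```python
# Sketch of grating scale expansion compensation: interpolate temperature
# between sensors, then integrate alpha * (T(s) - T_ref) ds up to the reading
# point. All numeric values are illustrative assumptions.

ALPHA = 11.5e-6          # /degC, assumed steel-like expansion coefficient
T_REF = 20.0             # degC, assumed reference temperature of the scale

sensors = [(0.0, 22.0), (2.0, 23.5), (4.0, 24.0)]  # (position m, temp degC)

def temp_at(x):
    """Piecewise-linear temperature between adjacent sensors."""
    for (x0, t0), (x1, t1) in zip(sensors, sensors[1:]):
        if x0 <= x <= x1:
            return t0 + (t1 - t0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside sensor range")

def expansion_error(x, steps=1000):
    """Numerically integrate alpha * (T(s) - T_ref) ds from 0 to x."""
    h = x / steps
    total = 0.0
    for i in range(steps):
        s = (i + 0.5) * h        # midpoint rule
        total += ALPHA * (temp_at(s) - T_REF) * h
    return total

print(f"{expansion_error(4.0) * 1e6:.1f} um")  # expansion at the 4 m mark
```

Subtracting `expansion_error(x)` from each raw scale reading is the compensation step; in the paper this reduced the expansion error by 90%.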

Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method

Procedia PDF Downloads 363
26557 Data Stream Association Rule Mining with Cloud Computing

Authors: B. Suraj Aravind, M. H. M. Krishna Prasad

Abstract:

There are emerging applications of data streams that require association rule mining, such as network traffic monitoring, web click-stream analysis, sensor data, data from satellites, etc. Data streams typically arrive continuously, at high speed, in huge amounts, and with changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates resource limitations by using the concept of cloud computing. This inclusion may lead to additional, as yet unknown problems, which need further research.
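The core stream-mining step can be sketched as follows: keep only a sliding window of recent transactions and repeatedly report the itemsets frequent within it. This is a toy, in-process illustration with invented transactions; the paper's point is precisely that a real deployment would distribute this work across cloud resources and use compact summaries rather than exact counts.

```python
# Toy sliding-window frequent-pair mining over a transaction stream.
# Window size, support threshold, and transactions are illustrative.
from collections import deque, Counter
from itertools import combinations

class StreamMiner:
    def __init__(self, window=4, min_support=2):
        self.window = deque(maxlen=window)   # oldest transaction evicted first
        self.min_support = min_support

    def add(self, transaction):
        self.window.append(frozenset(transaction))

    def frequent_pairs(self):
        """Count 2-itemsets in the current window; keep those above support."""
        counts = Counter()
        for t in self.window:
            for pair in combinations(sorted(t), 2):
                counts[pair] += 1
        return {p: c for p, c in counts.items() if c >= self.min_support}

m = StreamMiner()
for t in [{"a", "b"}, {"a", "b", "c"}, {"b", "c"}, {"a", "c"}]:
    m.add(t)
print(m.frequent_pairs())
```

Adding one more transaction evicts the oldest, so the frequent set adapts to the changing distribution the abstract describes.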

Keywords: data stream, association rule mining, cloud computing, frequent itemsets

Procedia PDF Downloads 498
26556 Open-Source YOLO CV For Detection of Dust on Solar PV Surface

Authors: Jeewan Rai, Kinzang, Yeshi Jigme Choden

Abstract:

Accumulation of dust on solar panels reduces their overall efficiency and the amount of energy they produce. While various techniques exist for detecting dust in order to schedule cleaning, many of these methods use MATLAB image processing tools and other licensed software, which can be financially burdensome. This study investigates the efficiency of a free, open-source computer vision library using the YOLO algorithm. The proposed approach was tested on images of solar panels with varying dust levels through an experimental setup. The findings illustrate the effectiveness of the YOLO-based image classification method and of the overall dust detection approach, with an accuracy of 90% in distinguishing between clean and dusty panels. This open-source solution provides a cost-effective and accessible alternative to commercial image processing tools, optimizing solar panel maintenance and enhancing energy production.
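A trained YOLO model needs weights and real images, so the sketch below shows only the surrounding clean/dusty decision logic on synthetic grayscale pixel lists, using a low-contrast (variance) cue as a crude stand-in for the detector's output. The threshold and pixel data are invented; this is not the study's classifier, just an illustration of the binary decision it produces.

```python
# Stand-in for the clean/dusty decision: dusty panels tend to look washed out,
# so pixel variance below a threshold is labeled "dusty". Values are invented.
import statistics

def classify_panel(pixels, variance_threshold=200.0):
    """Label a panel 'dusty' if its pixel variance is low (uniform grey film)."""
    return "dusty" if statistics.pvariance(pixels) < variance_threshold else "clean"

clean_panel = [20, 230, 25, 225, 30, 220, 15, 235]     # strong cell/grid contrast
dusty_panel = [120, 130, 125, 128, 122, 127, 124, 126]  # uniform grey film

print(classify_panel(clean_panel), classify_panel(dusty_panel))
```

In the actual pipeline, the YOLO network replaces this heuristic and outputs the class (and bounding boxes) directly from the image.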

Keywords: YOLO, openCV, dust detection, solar panels, computer vision, image processing

Procedia PDF Downloads 26
26555 Physical, Psychological, and Sexual Implications of Living with Rheumatoid Arthritis among Women of Reproductive Age

Authors: Anwaar Anwar Tayel

Abstract:

Background: Rheumatoid arthritis (RA) affects all aspects of patients' lives, leads to various degrees of disability, and ultimately has a profound impact on the social, economic, psychological, and sexual aspects of the patient's life. Aim of the study: To identify the physical, psychological, and sexual implications of rheumatoid arthritis among women of reproductive age, and to investigate the correlations between physical functional disability, psychological problems, and sexual dysfunction. Settings: The study was conducted at the Rheumatology Clinic of the Main University Hospital of Alexandria. Subjects: A purposive sample of women patients with rheumatoid arthritis was chosen for this study (n=250). Tools: Four tools were used to collect data. Tool I: a socio-demographic questionnaire. Tool II: the Stanford Health Assessment Questionnaire Disability Index (HAQ-DI). Tool III: the Depression Anxiety Stress Scale (DASS). Tool IV: the Sexual Dysfunction Questionnaire (SDQ). Results: The majority of the studied women suffered from severe physical disability and extreme levels of depression and anxiety, and about half of them had an extreme level of stress. The majority of the studied women also had a severe level of sexual dysfunction. Statistically significant correlations between the women's physical disability index, psychological problems, and sexual dysfunction were detected. Conclusion: The findings from this study confirm that women patients with RA suffer multiple negative implications for physical, psychological, and sexual function. Recommendations: Provide ongoing support to patients from the time of diagnosis throughout their care and management, to help them manage their pain and disabilities, improve their sexual function, promote their mental health, and optimize psychosocial functioning.

Keywords: physical, psychological, sexual, implications, rheumatoid arthritis

Procedia PDF Downloads 128
26554 Assessment of Environmental Quality of an Urban Setting

Authors: Namrata Khatri

Abstract:

The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity to assess the environmental impact of urbanisation. Satellite data were utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a decline in vegetation cover and an increase in built-up areas. High-resolution satellite data were also utilised to map the city's open areas and gardens. GIS-based analysis was used to assess public green space accessibility and to identify regions with inadequate waste management practices. The findings revealed that garbage collection and disposal techniques in specific areas of the city needed to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) effect. The study's conclusions have far-reaching ramifications for urban planners and policymakers in Belgaum and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effect of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite pictures were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data.
Overall, this study provides a complete assessment of Belgaum's environmental quality and emphasizes the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management.

Keywords: environmental quality, UEQ, remote sensing, GIS

Procedia PDF Downloads 79
26553 The Adoption of Leagility in Healthcare Services

Authors: Ana L. Martins, Luis Orfão

Abstract:

Healthcare systems have been the subject of various research efforts aiming at process improvement under a lean approach. Another perspective, agility, has also been used, though on a smaller scale, to analyse the ability of different hospital services to adapt to demand uncertainties. Both perspectives have a common denominator: improving the effectiveness and efficiency of services in a healthcare context. Mixing the two approaches allows, on the one hand, streamlining of processes and, on the other, the flexibility required to deal with demand uncertainty in terms of both volume and variety. The present research aims to analyse the impact of combining both perspectives on the effectiveness and efficiency of a hospital service. The adopted methodology is based on a case study of the ambulatory surgery process of Hospital de Lamego. Data were collected from direct observations, formal interviews, and informal conversations. The analysed process was selected according to three criteria: relevance of the process to the hospital, presence of human resources, and presence of waste. The customer of the process was identified, as well as their perception of value. The process was mapped using a flow chart, from a process modeling perspective, as well as through Value Stream Mapping (VSM) and Process Activity Mapping. A Spaghetti Diagram was also used to assess flow intensity. The use of the lean tools enabled the identification of three main types of waste: movement, resource inefficiencies, and process inefficiencies. From the use of the lean tools, improvement suggestions were produced.
The results point out that leagility cannot be applied to the process, but the application of lean and agility in specific areas of the process would bring benefits in both efficiency and effectiveness, and contribute to value creation if improvements are introduced in hospital’s human resources and facilities management.

Keywords: case study, healthcare systems, leagility, lean management

Procedia PDF Downloads 198
26552 Developing Digital Skills in Museum Professionals through Digital Education: International Good Practices and Effective Learning Experiences

Authors: Antonella Poce, Deborah Seid Howes, Maria Rosaria Re, Mara Valente

Abstract:

Creative industries education contexts, museum education in particular, generally place a low emphasis on the use of new digital technologies and on the development of digital abilities and transversal skills. The spread of the Covid-19 pandemic has underlined the importance of these abilities and skills in cultural heritage education: by gaining digital skills, museum professionals can improve their career opportunities through access to new distribution markets via internet access and e-commerce, new entrepreneurial tools, or new forms of digital expression in their work. The use of web, mobile, social, and analytical tools is becoming more and more essential in the heritage field, and in museums in particular, to face the challenges posed by the current worldwide health emergency. Recent studies highlight the need for stronger partnerships between the cultural and creative sectors, social partners, and education and training providers in order to equip these sectors with the combination of skills needed for creative entrepreneurship in a rapidly changing environment. Given the above conditions, the paper presents different examples of digital learning experiences carried out in Italian and US contexts with the aim of promoting digital skills in museum professionals. In particular, a quali-quantitative research study was conducted on two international postgraduate courses, “Advanced Studies in Museum Education” (2 years) and “Museum Education” (1 year), in order to identify the educational effectiveness of the online learning strategies used (e.g., OBL, digital storytelling, peer evaluation) for the development of digital skills and the acquisition of specific content. More than 50 museum professionals participating in these educational pathways took part in the learning activity, providing evaluation data useful for research purposes.

Keywords: digital skills, museum professionals, technology, education

Procedia PDF Downloads 173
26551 Consumer Load Profile Determination with Entropy-Based K-Means Algorithm

Authors: Ioannis P. Panapakidis, Marios N. Moschakis

Abstract:

With the continuous increase of smart meter installations across the globe, the need for processing of load data is evident. Clustering-based load profiling is built upon unsupervised machine learning tools for the purpose of formulating typical load curves or load profiles. The most commonly used algorithm in the load profiling literature is K-means. While the algorithm has been successfully tested in a variety of applications, its drawback is its strong dependence on the initialization phase. This paper proposes a novel modified form of K-means that addresses this problem. Simulation results indicate the superiority of the proposed algorithm compared to standard K-means.
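The abstract does not spell out the entropy-based initialization, so the sketch below illustrates the problem it targets instead: K-means seeded deterministically by farthest-point selection, one common remedy for initialization sensitivity, applied to toy 3-point daily load curves. The seeding scheme and the load data are assumptions for illustration, not the authors' method.

```python
# K-means on toy load profiles with deterministic farthest-point seeding,
# illustrating (not reproducing) initialization-robust clustering.

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def farthest_point_seeds(points, k):
    """Seed 1 = first point; each next seed maximizes distance to chosen seeds."""
    seeds = [points[0]]
    while len(seeds) < k:
        seeds.append(max(points, key=lambda p: min(dist2(p, s) for s in seeds)))
    return seeds

def kmeans(points, k, iters=10):
    centers = farthest_point_seeds(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers

# Toy 3-hour load profiles (kW): an evening-peak group and a midday-peak group.
loads = [(5, 1, 1), (6, 1, 2), (1, 7, 6), (2, 8, 5)]
print(sorted(kmeans(loads, 2)))
```

The resulting centers are the "typical load curves" of load profiling; a smarter initialization, entropy-based in the paper, makes those centers reproducible rather than dependent on a random seed.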

Keywords: clustering, load profiling, load modeling, machine learning, energy efficiency and quality

Procedia PDF Downloads 161
26550 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges

Authors: Jonathan Nash, Allen Hartt, Catherine Plante

Abstract:

In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and 4-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions including: course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student which can help instructors identify areas where transfer students’ experiences might differ from their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs. 
Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply.

Keywords: accounting curricula, community college, descriptive study, introductory accounting

Procedia PDF Downloads 99
26549 Polarity Classification of Social Media Comments in Turkish

Authors: Migena Ceyhan, Zeynep Orhan, Dimitrios Karras

Abstract:

People in modern societies continuously share their experiences, emotions, and thoughts in different areas of life. The information reaches almost everyone in real time and can have an important impact in shaping people’s way of living. This phenomenon is well recognized and advantageously used by market representatives trying to earn the most from this medium. Given the abundance of information, people and organizations are looking for efficient tools that filter the countless data into important information ready to analyze. This paper is a modest contribution to this field, describing the process of automatically classifying social media comments in the Turkish language as positive or negative. Once the data are gathered and preprocessed, feature sets of selected single words or groups of words are built according to the characteristics of the language used in the texts. These features are later used to train and test a system with different machine learning algorithms (Naïve Bayes, Sequential Minimal Optimization, J48, and Bayesian Linear Regression). The resulting high accuracies can be important feedback for decision-makers to improve business strategies accordingly.
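The first of the compared algorithms, Naïve Bayes over bag-of-words features, is compact enough to sketch end to end. The tiny English toy corpus, the add-one smoothing, and the tokenization are illustrative assumptions; the paper's corpus is Turkish and its feature sets are richer.

```python
# Minimal multinomial Naive Bayes polarity classifier with add-one smoothing.
# Toy corpus and tokens are invented for illustration.
import math
from collections import Counter

def train(docs):
    """docs: list of (token-list, label). Returns the counts the model needs."""
    labels = Counter(lbl for _, lbl in docs)
    by_label = {lbl: Counter() for lbl in labels}
    vocab = set()
    for toks, lbl in docs:
        by_label[lbl].update(toks)
        vocab.update(toks)
    return labels, by_label, len(vocab), len(docs)

def predict(model, toks):
    labels, by_label, vocab, n = model
    def log_posterior(lbl):
        total = sum(by_label[lbl].values())
        s = math.log(labels[lbl] / n)            # log prior
        for t in toks:                           # add-one (Laplace) smoothing
            s += math.log((by_label[lbl][t] + 1) / (total + vocab))
        return s
    return max(labels, key=log_posterior)

corpus = [
    (["great", "service", "loved", "it"], "pos"),
    (["awful", "slow", "service"], "neg"),
    (["loved", "the", "food"], "pos"),
    (["slow", "and", "awful"], "neg"),
]
model = train(corpus)
print(predict(model, ["loved", "service"]), predict(model, ["awful", "slow"]))
```

The same train/predict interface would accept the Turkish word and word-group features the paper builds; only the feature extraction changes.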

Keywords: feature selection, machine learning, natural language processing, sentiment analysis, social media reviews

Procedia PDF Downloads 144
26548 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques

Authors: Tosin Ige

Abstract:

Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of their data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques, while at the same time presenting a new technique as a solution to the shortcomings of an existing privacy-preserving data mining method. This paper also bridges the wide gap between data mining and the web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.

Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique

Procedia PDF Downloads 166
26547 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop Framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that expand the availability of network access. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.
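The processing model at the heart of the Hadoop Framework mentioned above, MapReduce, can be sketched in-process: a map step emits (key, value) pairs, a shuffle groups them by key, and a reduce step aggregates each group. The word-count example and records below are invented; a real BD deployment distributes these same three phases across a cluster.

```python
# In-process sketch of Hadoop's MapReduce model: map -> shuffle -> reduce.
# Records are invented; a cluster would run the phases in parallel.
from collections import defaultdict

def map_phase(records):
    """Emit (word, 1) for every word in every record."""
    for record in records:
        for word in record.split():
            yield word, 1

def shuffle(pairs):
    """Group emitted values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values; here, sum the counts."""
    return {key: sum(values) for key, values in groups.items()}

records = ["open data", "big data analytics", "open government data"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["data"], counts["open"])
```

Swapping the map and reduce functions yields other public-sector analytics (per-region totals, log aggregation) on the same skeleton.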

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 306
26546 Impact of Extended Enterprise Resource Planning in the Context of Cloud Computing on Industries and Organizations

Authors: Gholamreza Momenzadeh, Forough Nematolahi

Abstract:

An Extended Enterprise Resource Planning (ERPII) system usually requires massive amounts of storage space, powerful servers, and large upfront and ongoing investments to purchase and manage the software and related hardware, which are not affordable for many organizations. In recent decades, organizations have preferred to adapt their business structures to new technologies in order to remain competitive in the world economy. Cloud computing, one of the tools of information technology (IT), is a modern system that reveals the next-generation application architecture. Cloud computing also has advantages that reduce costs in many ways, such as lower upfront costs for all computing infrastructure and lower costs of maintenance and support. On the other hand, traditional ERPII cannot cope with huge amounts of data and relations between organizations. In this study, based on a literature review, ERPII is investigated in the context of cloud computing, where organizations operate more efficiently. Under these conditions, ERPII can respond to organizations' needs for handling large amounts of data and inter-organizational relations.

Keywords: extended enterprise resource planning, cloud computing, business process, enterprise information integration

Procedia PDF Downloads 218
26545 A Study of Native Speaker Teachers’ Competency and Achievement of Thai Students

Authors: Pimpisa Rattanadilok Na Phuket

Abstract:

This research study aims to examine: 1) the teaching competency of the native English-speaking teacher (NEST), 2) the English language learning achievement of Thai students, and 3) students' perceptions of their NEST. The population considered in this research was a group of 39 undergraduate students in the academic year 2013. The tools consisted of a questionnaire employed to measure the level of competency of the NEST, a pre-test and post-test used to examine the students' achievement in English pronunciation, and an interview used to discover how participants perceived their NEST. The data were statistically analyzed using percentages, means, standard deviations, and a one-sample t-test; the interview data were analyzed qualitatively. The study found that the level of teaching competency of native English-speaking teachers was mostly low, that the students' English pronunciation achievement increased significantly at the 0.05 level, and that the students' perceptions of their NEST were mixed: the students perceived their NEST as an expert in English, but felt that the NEST did not recognize students' linguistic difficulties and cultural differences.
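A paired-gains analysis of the kind reported above (a one-sample t-test on post-test minus pre-test scores) can be sketched as follows; the score gains are hypothetical, not the study's data:

```python
import math
from statistics import mean, stdev

def one_sample_t(diffs, mu0=0.0):
    # t statistic for H0: mean(diffs) == mu0, applied here to pre/post gains.
    n = len(diffs)
    t = (mean(diffs) - mu0) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1  # statistic and degrees of freedom

# Hypothetical pronunciation score gains (post - pre) for 8 of the 39 students.
gains = [5, 3, 4, 6, 2, 5, 4, 3]
t_stat, df = one_sample_t(gains)
```

With df degrees of freedom, t_stat would then be compared against a t-distribution critical value (about 2.365 for a two-sided test at the 0.05 level with df = 7) to decide significance.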

Keywords: competency, native English-speaking teacher (NEST), English teaching, learning achievement

Procedia PDF Downloads 373
26544 Neural Networks Based Prediction of Long Term Rainfall: Nine Pilot Study Zones over the Mediterranean Basin

Authors: Racha El Kadiri, Mohamed Sultan, Henrique Momm, Zachary Blair, Rachel Schultz, Tamer Al-Bayoumi

Abstract:

The Mediterranean Basin is a region of very diverse nationalities and climate zones, with a strong dependence on agricultural activities. Predicting long-term rainfall (with a lead of 1 to 12 months) and future droughts could contribute to the sustainable management of water resources and economic activities. In this study, an integrated approach was adopted to construct predictive tools with lead times of 0 to 12 months to forecast rainfall amounts over nine subzones of the Mediterranean Basin. The following steps were conducted: (1) acquire, assess, and intercorrelate temporal remote sensing-based rainfall products (e.g., the CPC Merged Analysis of Precipitation [CMAP]) throughout the investigation period (1979 to 2016); (2) acquire and assess monthly values for all of the climatic indices influencing the regional and global climatic patterns (e.g., the North Atlantic Oscillation [NAO], Southern Oscillation Index [SOI], and Tropical North Atlantic Index [TNA]); (3) delineate homogeneous climatic regions and select nine pilot study zones; (4) apply data mining methods (e.g., neural networks, principal component analyses) to extract relationships between the observed rainfall and the controlling factors (i.e., climatic indices with multiple lead-time periods); and (5) use the constructed predictive tools to forecast monthly rainfall and dry and wet periods. Preliminary results indicate that rainfall and dry/wet periods were successfully predicted with lead times of 0 to 12 months using the adopted methodology, and that the approach is more accurate in the southern Mediterranean region.
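Step (4) above can be illustrated with a deliberately minimal stand-in for a neural network: a single linear neuron trained by stochastic gradient descent to map lagged climate-index values to monthly rainfall. The index values and rainfall figures below are fabricated for illustration only:

```python
def train_linear(X, y, lr=0.05, epochs=5000):
    # A single linear neuron trained by stochastic gradient descent: the
    # simplest stand-in for a network mapping lagged climate indices to rainfall.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = sum(wj * xj for wj, xj in zip(w, xi)) + b
            err = pred - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical predictors: [NAO, SOI] index values at a 3-month lead.
X = [[0.5, -0.2], [1.0, 0.1], [-0.3, 0.4], [0.0, 0.0]]
y = [58.0, 71.0, 48.0, 50.0]  # monthly rainfall, mm (illustrative)
w, b = train_linear(X, y)
pred = sum(wj * xj for wj, xj in zip(w, [0.5, -0.2])) + b
```

A real multi-layer network adds nonlinear hidden units, but the training loop (forward pass, error, gradient update) has the same shape.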

Keywords: rainfall, neural networks, climatic indices, Mediterranean

Procedia PDF Downloads 310
26543 Hydraulic Analysis of Irrigation Approach Channel Using HEC-RAS Model

Authors: Muluegziabher Semagne Mekonnen

Abstract:

This study was intended to establish the irrigation water requirements and to evaluate canal hydraulics under steady-state conditions in order to improve scheme performance of the Meki-Ziway irrigation project. The CROPWAT 8.0 model was used to estimate the irrigation water requirements of the five major crops irrigated in the study area. The results showed that for the existing and potential irrigation development areas of 2000 ha and 2599 ha, crop water requirements were 3,339,200 and 4,339,090.4 m³, respectively. Hydraulic simulation models are fundamental tools for understanding the hydraulic flow characteristics of irrigation systems. In this study, a hydraulic analysis of the irrigation canals of the Meki-Ziway scheme was conducted using the HEC-RAS model. The HEC-RAS model was tested in terms of error estimation and used to determine the potential canal capacity.
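The unit conversion behind volumetric crop water requirements (hectares and millimetres to cubic metres) can be sketched as follows; the 167 mm seasonal depth and the efficiency figure are illustrative assumptions, not values from the study:

```python
def gross_irrigation_volume(area_ha, net_depth_mm, efficiency=1.0):
    # Volume (m^3) = area (ha) * 10,000 m^2/ha * depth (mm) / 1,000 mm/m,
    # i.e. area_ha * depth_mm * 10, divided by the scheme efficiency.
    return area_ha * net_depth_mm * 10.0 / efficiency

# Hypothetical figures: a 2000 ha command area receiving a 167 mm seasonal depth.
volume = gross_irrigation_volume(2000, 167)
```

At 100% efficiency this gives 3,340,000 m³ for 2000 ha, the same order as the requirement reported above; a lower scheme efficiency inflates the gross volume accordingly.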

Keywords: HEC-RAS, irrigation, hydraulic, canal reach, capacity

Procedia PDF Downloads 56
26542 Using Reservoir Models for Monitoring Geothermal Surface Features

Authors: John P. O’Sullivan, Thomas M. P. Ratouis, Michael J. O’Sullivan

Abstract:

As the use of geothermal energy grows internationally, more effort is required to monitor and protect areas with rare and important geothermal surface features. A number of approaches are presented for developing and calibrating numerical geothermal reservoir models capable of accurately representing geothermal surface features. The approaches are discussed in the context of case studies of the Rotorua geothermal system and the Orakei-korako geothermal system, both of which contain important surface features. The results show that the models match the available field data accurately and hence can serve as valuable tools for predicting the future response of the systems to changes in use.

Keywords: geothermal reservoir models, surface features, monitoring, TOUGH2

Procedia PDF Downloads 408
26541 Semantic Data Schema Recognition

Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia

Abstract:

This paper aims at assisting users in their data quality approach. The goal is to better extract, mix, interpret, and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning each column to a category and possibly a sub-category; secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
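The first step, assigning a column to a category, can be sketched with a simple rule-based profiler; the categories, patterns, and threshold below are illustrative assumptions rather than the paper's actual method:

```python
import re

PATTERNS = {
    # Illustrative categories; a real profiler would also use dictionaries
    # and metadata, and would support sub-categories.
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "integer": re.compile(r"^-?\d+$"),
}

def categorize_column(values, threshold=0.8):
    # Assign the column to the first category matching at least `threshold`
    # of its non-empty values; otherwise fall back to "text".
    values = [v for v in values if v]
    for name, pattern in PATTERNS.items():
        if sum(bool(pattern.match(v)) for v in values) / len(values) >= threshold:
            return name
    return "text"

category = categorize_column(["ana@example.com", "bob@example.org", ""])
```

Once every column has a category, relations between columns (e.g. a "date" column preceding another "date" column) can be tested as a second pass.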

Keywords: schema recognition, semantic data profiling, meta-categorisation, inter-column semantic dependencies

Procedia PDF Downloads 414
26540 Achieving Success in NPD Projects

Authors: Ankush Agrawal, Nadia Bhuiyan

Abstract:

The new product development (NPD) literature emphasizes the importance of introducing new products to the market for continuing business success. New products drive employment, economic growth, technological progress, and high standards of living; the study of NPD and the processes through which new products emerge is therefore important. While many studies have examined critical success factors for NPD, they tend to be fragmented and to focus on only one or a few phases of the NPD process. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing metrics for each stage of the NPD process. An extensive literature review was undertaken covering decades of studies on NPD success and how it can be achieved. These studies were scanned for factors common to firms whose new products succeeded on the market. The paper summarizes NPD success factors, suggests metrics that should be used to measure these factors, and proposes tools and techniques for making use of these metrics. This was done for each stage of the NPD process and brought together in a framework that the authors propose should be followed for complex NPD projects.

Keywords: new product development, performance, critical success factors, framework

Procedia PDF Downloads 396
26539 Features of Normative and Pathological Realizations of Sibilant Sounds for Computer-Aided Pronunciation Evaluation in Children

Authors: Zuzanna Miodonska, Michal Krecichwost, Pawel Badura

Abstract:

Sigmatism (lisping) is a speech disorder in which sibilant consonants are mispronounced. The diagnosis of this phenomenon is usually based on auditory assessment. However, progress in speech analysis techniques creates the possibility of developing computer-aided sigmatism diagnosis tools. The aim of the study is to statistically verify whether specific acoustic features of sibilant sounds may be related to pronunciation correctness. Such knowledge can be of great importance when implementing classifiers and designing novel tools for automatic evaluation of sibilant pronunciation. The study covers analysis of various speech signal measures, including features proposed in the literature for the description of normative sibilant realization. Amplitudes and frequencies of three fricative formants (FF) are extracted based on local spectral maxima of the friction noise. Skewness, kurtosis, four normalized spectral moments (SM), and 13 mel-frequency cepstral coefficients (MFCC) with their 1st and 2nd derivatives (13 Delta and 13 Delta-Delta MFCC) are included in the analysis as well. The resulting feature vector contains 51 measures. The experiments are performed on a speech corpus containing words with the selected sibilant sounds (/ʃ, ʒ/) pronounced by 60 preschool children with proper pronunciation or with natural pathologies. In total, 224 /ʃ/ segments and 191 /ʒ/ segments are employed in the study. The Mann-Whitney U test is employed to compare sigmatic and normative pronunciation. Statistically significant differences between the two groups are obtained for most of the proposed features at p < 0.05. All spectral moments and fricative formants appear to be distinctive between pathological and proper pronunciation. These metrics describe the friction noise characteristic of sibilants, which makes them particularly promising for use in sibilant evaluation tools. Correspondences found between phoneme feature values and an expert evaluation of pronunciation correctness encourage involving speech analysis tools in the diagnosis and therapy of sigmatism. The proposed feature extraction methods could be used in computer-assisted sigmatism diagnosis or therapy systems.
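A minimal sketch of the Mann-Whitney U statistic of the kind used above, computed by direct pair counting (ties credited 0.5, with no further tie correction); the feature values are hypothetical, not the corpus data:

```python
def mann_whitney_u(xs, ys):
    # U statistic via pair counting: U = #{(x, y): x > y} + 0.5 * #ties.
    # Used here to compare one acoustic feature between two pronunciation groups.
    return sum((x > y) + 0.5 * (x == y) for x in xs for y in ys)

# Hypothetical spectral-moment values for normative vs. sigmatic /s/ realizations.
normative = [6.1, 5.8, 6.4, 6.0]
sigmatic = [4.9, 5.2, 5.0]
u_stat = mann_whitney_u(normative, sigmatic)
```

When the two samples do not overlap at all, U reaches its maximum (here 4 × 3 = 12); the p-value is then obtained from the exact U distribution or, for larger samples, a normal approximation.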

Keywords: computer-aided pronunciation evaluation, sigmatism diagnosis, speech signal analysis, statistical verification

Procedia PDF Downloads 298
26538 Multi-Objective Variable Neighborhood Search Algorithm to Solving Scheduling Problem with Transportation Times

Authors: Majid Khalili

Abstract:

This paper deals with a bi-objective hybrid no-wait flowshop scheduling problem minimizing the makespan and total weighted tardiness, in which transportation times between stages are considered. Obtaining an optimal solution for this type of complex, large-sized problem in reasonable computational time using traditional approaches and optimization tools is extremely difficult. This paper presents a new multi-objective variable neighborhood search algorithm (MOVNS). The algorithm is carefully evaluated on a set of experimental instances against available algorithms by means of advanced multi-objective performance measures and statistical tools. The results show that a variant of the proposed MOVNS performs soundly compared with the other algorithms.
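The single-objective skeleton of variable neighborhood search can be sketched on a toy single-machine tardiness instance; the paper's MOVNS extends this idea with a Pareto archive over both objectives. The processing times and due dates below are illustrative, not from the paper's instances:

```python
import random

def total_tardiness(seq, p, d):
    # Sum of job tardiness for a single-machine sequence: a simple stand-in
    # for the makespan / weighted-tardiness objectives of the hybrid flowshop.
    t, total = 0, 0
    for j in seq:
        t += p[j]
        total += max(0, t - d[j])
    return total

def shake(seq, k):
    # Neighborhood k: apply k random pairwise swaps to the sequence.
    s = seq[:]
    for _ in range(k):
        i, j = random.sample(range(len(s)), 2)
        s[i], s[j] = s[j], s[i]
    return s

def vns(p, d, k_max=3, iters=200, seed=1):
    # Basic VNS: shake in neighborhood k, keep improvements and reset k,
    # otherwise widen the neighborhood.
    random.seed(seed)
    best = list(range(len(p)))
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = shake(best, k)
            if total_tardiness(cand, p, d) < total_tardiness(best, p, d):
                best, k = cand, 1
            else:
                k += 1
    return best, total_tardiness(best, p, d)

p = [3, 2, 4, 1]  # processing times (hypothetical)
d = [4, 3, 9, 2]  # due dates (hypothetical)
seq, cost = vns(p, d)
```

The systematic change of neighborhood size is what distinguishes VNS from plain local search: a solution that is locally optimal for single swaps can still be escaped by the larger k-swap neighborhoods.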

Keywords: no-wait hybrid flowshop scheduling, multi-objective variable neighborhood algorithm, makespan, total weighted tardiness

Procedia PDF Downloads 415
26537 Access Control System for Big Data Application

Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud

Abstract:

Access control systems (ACS) are among the most important components of data security. Inaccuracies in regulatory frameworks make tailored policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the dissemination of Big Data, since it is crucial to secure the data provided to data consumers (DC). We describe the benefits of using dedicated access control for BD units, taking into consideration the performance requirements of both the BD and the AC system, and then present a generic Big Data access control system designed to improve the dissemination of Big Data.
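A role-based access control check of the general kind discussed can be sketched as follows; the roles, actions, and policy are hypothetical, not the proposed system's actual model:

```python
# Hypothetical role-to-permission policy for datasets in a BD store.
POLICY = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
    "admin": {"read", "write", "grant"},
}

def is_allowed(role, action):
    # A request is permitted only if the role's permission set contains the action;
    # unknown roles get an empty set, i.e. default deny.
    return action in POLICY.get(role, set())

def filter_requests(requests):
    # Keep only the (role, action, dataset) requests the policy permits.
    return [r for r in requests if is_allowed(r[0], r[1])]

granted = filter_requests([
    ("analyst", "read", "sales"),
    ("analyst", "write", "sales"),
    ("admin", "grant", "sales"),
])
```

The default-deny rule for unknown roles is the key design choice: in a BD setting where data consumers are numerous and heterogeneous, every request must match an explicit grant rather than fail open.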

Keywords: access control, security, Big Data, domain

Procedia PDF Downloads 132
26536 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment

Authors: Michael Gidey Gebru

Abstract:

Most Data Envelopment Analysis models operate in a static environment with input and output parameters given by deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work extends crisp Data Envelopment Analysis to Data Envelopment Analysis in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The Data Envelopment Analysis model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed Data Envelopment Analysis model is illustrated with an application to real data from 50 educational institutions.
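For the special case of one input and one output, CCR-type DEA efficiency reduces to a ratio comparison, which allows a compact sketch of evaluating triangular fuzzy data without a linear-programming solver; the institutions, figures, and vertex-wise evaluation below are illustrative simplifications of the paper's multi-objective fuzzy model:

```python
def dea_efficiency(inputs, outputs):
    # CCR efficiency in the one-input/one-output case: each unit's
    # output/input ratio scaled by the best ratio (no LP needed).
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical triangular fuzzy data (low, mode, high) for 3 institutions.
fuzzy_in = [(90, 100, 110), (140, 150, 165), (80, 85, 95)]
fuzzy_out = [(50, 60, 65), (70, 80, 90), (55, 60, 70)]

# Evaluate the crisp model at each vertex of the triangular numbers;
# a full fuzzy DEA would instead solve the model over alpha-cut intervals.
eff_by_level = {
    level: dea_efficiency([x[k] for x in fuzzy_in], [y[k] for y in fuzzy_out])
    for k, level in enumerate(("low", "mode", "high"))
}
```

With multiple inputs and outputs, each vertex evaluation becomes a linear program per unit, and the spread of efficiencies across levels conveys how the data's fuzziness propagates into the efficiency scores.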

Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output

Procedia PDF Downloads 54
26535 Autonomy not Automation: Using Metacognitive Skills in ESL/EFL Classes

Authors: Marina Paula Carreira Rolim

Abstract:

For ELLs to take responsibility for their own learning, it is important that they develop skills to approach their studies strategically. The less they rely on the instructor as the content provider, the more they become active learners with a higher sense of self-regulation and confidence in the learning process. This e-poster proposes a new teacher-student relationship that encourages learners to reflect, think critically, and act upon their realities. It also suggests the implementation of different autonomy-supportive teaching tools, such as portfolios, written journals, problem-solving activities, and strategy-based discussions in class. These tools enable ELLs to develop awareness of learning strategies, learning styles, study plans, and available learning resources as means to foster their creative power of learning outside the classroom. In the role of a learning advisor, the teacher is no longer the content provider but a facilitator who introduces skills such as ‘elaborating’, ‘planning’, ‘monitoring’, and ‘evaluating’. The teacher acts as an educator and promotes the use of lifelong metacognitive skills to develop learner autonomy in the ESL/EFL context.

Keywords: autonomy, metacognitive skills, self-regulation, learning strategies, reflection

Procedia PDF Downloads 363
26534 The Acceptable Roles of Artificial Intelligence in the Judicial Reasoning Process

Authors: Sonia Anand Knowlton

Abstract:

There are some cases where we as a society feel deeply uncomfortable with the use of Artificial Intelligence (AI) tools in the judicial decision-making process, and justifiably so. A perfect example is COMPAS, an algorithmic model that predicts recidivism rates of offenders to assist in the determination of their bail conditions. COMPAS turned out to be extremely racist: it massively overpredicted recidivism rates of Black offenders and underpredicted recidivism rates of white offenders. At the same time, there are certain uses of AI in the judicial decision-making process that many would feel more comfortable with and even support. Take, for example, a “super-breathalyzer,” an (albeit imaginary) tool that uses AI to deliver highly detailed information about the subject of the breathalyzer test to the legal decision-makers analyzing their drunk-driving case. This article evaluates the point at which a judge’s use of AI tools begins to undermine the public’s trust in the administration of justice. It argues that the answer to this question depends on whether the AI tool is in a role in which it must perform a moral evaluation of a human being.

Keywords: artificial intelligence, judicial reasoning, morality, technology, algorithm

Procedia PDF Downloads 75