Search results for: continuous data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26680

26200 Major Factors That Enhance Economic Growth in South Africa: A Re-Examination Using a Vector Error Correction Mechanism

Authors: Temitope L. A. Leshoro

Abstract:

This study explored several variables that enhance economic growth in South Africa, based on different growth theories and using the vector error correction model (VECM) technique. The impact and contribution of each of these variables to GDP in South Africa were investigated. The study was motivated by the weak economic growth the country has been experiencing lately, as well as the continuous increase in the unemployment rate and the deteriorating health care system. Annual data spanning the period 1974 to 2013 were employed. The results showed that the major determinants of GDP are trade openness, government spending, and the health indicator; these variables are not only economically significant but also statistically significant in explaining changes in GDP in South Africa. Policy recommendations for enhancing economic growth are suggested based on the findings of this study.
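
As a rough illustration of the estimation step named above, the sketch below fits a VECM to annual series with Python's statsmodels; the file name, column names, lag order and deterministic terms are hypothetical placeholders, not the paper's specification.

```python
# A minimal sketch (not the paper's exact specification) of fitting a VECM
# to annual macro series with statsmodels; file and column names are hypothetical.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

df = pd.read_csv("za_macro_1974_2013.csv", index_col="year")  # hypothetical file
series = df[["ln_gdp", "trade_openness", "gov_spending", "health_indicator"]]

# Choose the cointegration rank with the Johansen trace test.
rank_test = select_coint_rank(series, det_order=0, k_ar_diff=1, method="trace")
model = VECM(series, k_ar_diff=1, coint_rank=rank_test.rank, deterministic="co")
result = model.fit()
print(result.summary())  # long-run (beta) and short-run (gamma) estimates
```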

Keywords: economic growth, GDP, investment, health indicator, VECM

Procedia PDF Downloads 276
26199 The Impact of the Method of Extraction on 'Chemchali' Olive Oil Composition in Terms of Oxidation Index and Chemical Quality

Authors: Om Kalthoum Sallem, Saidakilani, Kamiliya Ounaissa, Abdelmajid Abid

Abstract:

Introduction and purposes: Olive oil is the main oil used in the Mediterranean diet. Virgin olive oil is valued for its organoleptic and nutritional characteristics and is resistant to oxidation due to its high monounsaturated fatty acid (MUFA) content, its low polyunsaturated fatty acid (PUFA) content, and the presence of natural antioxidants such as phenols, tocopherols and carotenoids. The fatty acid composition, especially the MUFA content, and the natural antioxidants provide health advantages. The aim of the present study was to examine the impact of the method of extraction on the chemical profile of the 'Chemchali' olive oil variety, which is cultivated in the city of Gafsa, and to compare it with the Chetoui and Chemchali varieties. Methods: Our study is a qualitative prospective study that deals with the 'Chemchali' olive oil variety. Analyses were conducted over three months (from December to February) in different oil mills in the city of Gafsa. We compared 'Chemchali' olive oil obtained by the continuous method with that obtained by the super press method. We then analysed quality index parameters, including free fatty acid content (FFA), acidity, and UV spectrophotometric characteristics, as well as other physico-chemical data (oxidative stability, ß-carotene, and chlorophyll pigment composition). Results: Olive oil resulting from the super press method, compared with the continuous method, is less acidic (0.6120 vs. 0.9760), less oxidizable (K232: 2.478 vs. 2.592; K270: 0.216 vs. 0.228), richer in oleic acid (61.61% vs. 66.99%), less rich in linoleic acid (13.38% vs. 13.98%), and richer in total chlorophyll pigments (6.22 ppm vs. 3.18 ppm) and ß-carotene (3.128 mg/kg vs. 1.73 mg/kg). 'Chemchali' olive oil showed a more balanced total fatty acid content compared with the 'Chemleli' and 'Chetoui' varieties. Gafsa's 'Chemlali' variety has significantly less saturated and polyunsaturated fatty acids, whereas it has a higher content of the monounsaturated fatty acid C18:2 compared with the two other varieties. Conclusion: The use of the super press method had beneficial effects on the general chemical characteristics of 'Chemchali' olive oil, maintaining the highest quality according to the Ecocert legal standards. In light of the results obtained in this study, a more detailed study is required to establish whether the differences in the chemical properties of the oils are mainly due to agronomic and climate variables or to the processing employed in the oil mills.

Keywords: olive oil, extraction method, fatty acids, chemchali olive oil

Procedia PDF Downloads 383
26198 Influence of Strong Optical Feedback on Frequency Chirp and Lineshape Broadening in High-Speed Semiconductor Laser

Authors: Moustafa Ahmed, Fumio Koyama

Abstract:

Directly-modulated semiconductor lasers, including edge-emitting and vertical-cavity surface-emitting lasers, have recently received considerable interest for use as data transmitters in cost-effective high-speed data centers, metro, and access networks. Optical feedback has proved to be an efficient technique to boost the modulation bandwidth and enhance the speed of the semiconductor laser. However, both the laser linewidth and the frequency chirping in directly-modulated lasers are sensitive to both intensity modulation and optical feedback. These effects, along with fiber dispersion, affect the transmission bit rate and distance in single-mode fiber links. In this work, we continue our recent research on directly-modulated semiconductor lasers with modulation bandwidth in the millimeter-wave band by introducing simultaneous modeling and simulation of both the frequency chirping and the lineshape broadening. The lasers operate under strong optical feedback. The model takes into account the multiple reflections of the laser radiation in the external cavity. The analyses are given in terms of the chirp-to-modulated-power ratio, and the results are shown for the possible dynamic states of continuous wave, period-1 oscillation, and chaos.

Keywords: chirp, linewidth, optical feedback, semiconductor laser

Procedia PDF Downloads 481
26197 The Relationship Between Artificial Intelligence, Data Science, and Privacy

Authors: M. Naidoo

Abstract:

Artificial intelligence often requires large amounts of good quality data. Within important fields, such as healthcare, the training of AI systems predominantly relies on health and personal data; however, the usage of this data is complicated by various layers of law and ethics that seek to protect individuals' privacy rights. This research seeks to establish the challenges AI and data science pose to (i) informational rights, (ii) privacy rights, and (iii) data protection. To solve some of the issues presented, various methods are suggested, such as embedding values in technological development, proper balancing of rights and interests, and others.

Keywords: artificial intelligence, data science, law, policy

Procedia PDF Downloads 106
26196 Simulation Data Summarization Based on Spatial Histograms

Authors: Jing Zhao, Yoshiharu Ishikawa, Chuan Xiao, Kento Sugiura

Abstract:

In order to analyze large-scale scientific data, research on data exploration and visualization has gained popularity. In this paper, we focus on the exploration and visualization of scientific simulation data, and define a spatial V-Optimal histogram for data summarization. We propose histogram construction algorithms based on a general binary hierarchical partitioning as well as a more specific one, the l-grid partitioning. For effective data summarization and efficient data visualization in scientific data analysis, we propose an optimal algorithm as well as a heuristic algorithm for histogram construction. To verify the effectiveness and efficiency of the proposed methods, we conduct experiments on the massive evacuation simulation data.
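
The spatial V-Optimal histograms above build on the classic V-optimal criterion; the sketch below shows that criterion in its simplest one-dimensional form (a dynamic program minimising within-bucket squared error), not the hierarchical or l-grid partitioning algorithms proposed in the paper.

```python
# Illustrative one-dimensional V-optimal histogram via dynamic programming;
# shows only the "minimum within-bucket variance" objective, not the paper's
# spatial (hierarchical / l-grid) partitioning.
import numpy as np

def v_optimal_1d(values, num_buckets):
    values = np.asarray(values, dtype=float)
    n = len(values)
    prefix = np.concatenate(([0.0], np.cumsum(values)))
    prefix_sq = np.concatenate(([0.0], np.cumsum(values ** 2)))

    def sse(i, j):  # sum of squared deviations of values[i:j] from their mean
        s, s2, m = prefix[j] - prefix[i], prefix_sq[j] - prefix_sq[i], j - i
        return s2 - s * s / m

    cost = np.full((num_buckets + 1, n + 1), np.inf)
    cut = np.zeros((num_buckets + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for b in range(1, num_buckets + 1):
        for j in range(b, n + 1):
            for i in range(b - 1, j):
                c = cost[b - 1, i] + sse(i, j)
                if c < cost[b, j]:
                    cost[b, j], cut[b, j] = c, i

    bounds, j = [], n  # backtrack to recover bucket boundaries
    for b in range(num_buckets, 0, -1):
        bounds.append((cut[b, j], j))
        j = cut[b, j]
    return list(reversed(bounds)), cost[num_buckets, n]

buckets, total_error = v_optimal_1d(np.random.rand(50), 5)
```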

Keywords: simulation data, data summarization, spatial histograms, exploration, visualization

Procedia PDF Downloads 176
26195 Using Multi-Arm Bandits to Optimize Game Play Metrics and Effective Game Design

Authors: Kenny Raharjo, Ramon Lawrence

Abstract:

Game designers have the challenging task of building games that engage players to spend their time and money on the game. There are an infinite number of game variations and design choices, and it is hard to systematically determine game design choices that will have positive experiences for players. In this work, we demonstrate how multi-arm bandits can be used to automatically explore game design variations to achieve improved player metrics. The advantage of multi-arm bandits is that they allow for continuous experimentation and variation, intrinsically converge to the best solution, and require no special infrastructure to use beyond allowing minor game variations to be deployed to users for evaluation. A user study confirms that applying multi-arm bandits was successful in determining the preferred game variation with highest play time metrics and can be a useful technique in a game designer's toolkit.
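
As a minimal illustration of how a bandit can allocate players to game variations, the sketch below applies the standard UCB1 rule to a simulated play-time reward; the variation names and reward model are invented for the example and are not the authors' study design.

```python
# A minimal UCB1 bandit sketch; "play_time" stands in for whatever player metric
# a designer wants to optimize, and the reward distribution is simulated.
import math, random

class UCB1:
    def __init__(self, arms):
        self.arms = list(arms)
        self.counts = {a: 0 for a in self.arms}
        self.totals = {a: 0.0 for a in self.arms}

    def select(self):
        # Play every arm once, then pick the arm with the best upper confidence bound.
        for a in self.arms:
            if self.counts[a] == 0:
                return a
        t = sum(self.counts.values())
        return max(self.arms, key=lambda a: self.totals[a] / self.counts[a]
                   + math.sqrt(2.0 * math.log(t) / self.counts[a]))

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.totals[arm] += reward

bandit = UCB1(["variation_a", "variation_b", "variation_c"])
for _ in range(1000):  # each iteration = one player session
    arm = bandit.select()
    play_time = random.gauss({"variation_a": 10, "variation_b": 12, "variation_c": 9}[arm], 2)
    bandit.update(arm, play_time)
```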

Keywords: game design, multi-arm bandit, design exploration and data mining, player metric optimization and analytics

Procedia PDF Downloads 509
26194 Survey of Free-Range Inhabitants of Federal University of Agriculture Abeokuta Zoological Park

Authors: Matthew Olanrewaju Ibiyomi

Abstract:

The study examined the abundance of free-range natural inhabitants of the Federal University of Agriculture, Abeokuta (FUNAAB) Zoo Park. Baseline data on the free-ranging inhabitants of the Park are essential for monitoring trends and instituting conservation plans in the face of unsustainable natural resource exploitation and habitat destruction. Four transects were selected across the study area. Each transect was traversed for a period of four months, and observations were carried out twice a day. The four existing tracks explored during the study were the aviary, reptile, carnivore and primate tracks. Data were analyzed using descriptive statistics. The findings from this study revealed that 8 species of natural inhabitants were identified: the vervet monkey (Chlorocebus pygerythrus), Maxwell's duiker (Philantomba maxwellii), mongoose (Herpestidae spp.), bushbuck (Tragelaphus scriptus), cobra (Naja naja), ground squirrel (Marmotini spp.), Senegal coucal (Centropus senegalensis), and black kite (Milvus migrans). The results further showed that a total of 115 animals were encountered on the primate transect, 77 on the carnivore transect, 46 on the aviary transect and 34 on the ungulate transect, representing 43.3%, 28.3%, 15.8% and 12.5% respectively. Human activities and the level of disturbance were observed to have affected the abundance and distribution of animals at the FUNAAB Zoo Park. Continuous field inventory is recommended to ascertain the dynamics of the animals observed as free-range inhabitants in this study.

Keywords: abundance, ecosystem, extinction, free-range

Procedia PDF Downloads 91
26193 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for Old Castle in Germany

Authors: Bara' Al-Mistarehi

Abstract:

Documentation and assessment of the conservation state of an archaeological structure is a significant procedure in any management plan. However, it has always been a challenge to do this with a low-cost and safe methodology, and it is also a time-demanding procedure. Therefore, a low-cost, efficient methodology for documenting the state of a structure is needed. Within the scope of this research, this paper applies digital photogrammetry and laser scanning to one of the most significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features. However, the castle suffers from serious deterioration threats because of the environmental conditions and the absence of continuous monitoring, maintenance and repair plans. Digital photogrammetry is a generally accepted technique for the collection of 3D representations of the environment. For this reason, this image-based technique has been extensively used to produce high-quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run-time of reflected light pulses. These systems feature high data acquisition rates, good accuracy and high spatial data density. Despite the potential of each single approach, in this research work maximum benefit is expected from a combination of data from both digital cameras and terrestrial laser scanners. Within the paper, the usage, application and advantages of the technique are investigated in terms of building a highly realistic 3D textured model of some parts of the Old Castle. The model will be used as a tool for diagnosing the conservation state of the castle and as a means of monitoring future changes.

Keywords: digital photogrammetry, terrestrial laser scanners, 3D textured model, archaeological structure

Procedia PDF Downloads 178
26192 Colour and Curcuminoids Removal from Turmeric Wastewater Using Activated Carbon Adsorption

Authors: Nattawat Thongpraphai, Anusorn Boonpoke

Abstract:

This study aimed to determine the removal of colour and curcuminoids from turmeric wastewater using granular activated carbon (GAC) adsorption. The adsorption isotherm and kinetic behavior of colour and curcuminoids were investigated using batch and fixed-bed column tests. The results indicated that the removal efficiencies of colour and curcuminoids were 80.13% and 78.64%, respectively, at 8 hr of equilibrium time. The adsorption isotherms of colour and curcuminoids were well fitted by the Freundlich adsorption model. The maximum adsorption capacities of colour and curcuminoids were 130 Pt-Co/g and 17 mg/g, respectively. The continuous experiment data showed that the exhaustion concentration of colour and curcuminoids occurred at 39 hr of operation time. The adsorption characteristics of colour and curcuminoids from turmeric wastewater by GAC can be described by the Thomas model. The maximum adsorption capacities obtained from the kinetic approach were 39954 Pt-Co/g and 0.0516 mg/kg for colour and curcuminoids, respectively. Moreover, the decrease of colour and curcuminoids concentrations during the service time showed a similar trend.
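
For reference, the textbook forms of the two models named above are reproduced below; these are the standard expressions only, and the paper's fitted constants are not repeated here.

```latex
% Standard forms of the two models named in the abstract: the Freundlich isotherm
% for the batch data and the Thomas model for the fixed-bed (continuous) data.
\begin{align}
  q_e &= K_F \, C_e^{1/n}
    && \text{Freundlich: } q_e \text{ equilibrium uptake, } C_e \text{ equilibrium concentration} \\
  \frac{C_t}{C_0} &= \frac{1}{1 + \exp\!\left(\dfrac{k_{Th}\, q_0\, m}{Q} - k_{Th}\, C_0\, t\right)}
    && \text{Thomas: } k_{Th} \text{ rate constant, } q_0 \text{ capacity, } m \text{ GAC mass, } Q \text{ flow rate}
\end{align}
```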

Keywords: adsorption, turmeric, colour, curcuminoids, activated carbon

Procedia PDF Downloads 424
26191 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study aims at introducing a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection has classified daily air quality into 6 levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on their respective Air Quality Index (AQI) values. Using information theory, information gain (IG) is calculated and feature selection is performed for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is applied to the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
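
A minimal sketch of the hybrid idea described above, using scikit-learn: rank features by mutual information (an information-gain-style score), keep the top-ranked ones, and cross-validate an SVM on the six AQI classes. The file and feature names are hypothetical, not the authors' dataset.

```python
# Hypothetical data layout; the pipeline selects features by mutual information
# and cross-validates an RBF-kernel SVM on the six air-quality levels.
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

df = pd.read_csv("aqi_four_cities.csv")           # hypothetical file
X = df[["pm25", "pm10", "so2", "no2", "co", "o3", "city", "month"]]
X = pd.get_dummies(X, columns=["city", "month"])  # encode the categorical features
y = df["aqi_level"]                               # six classes, Excellent .. Serious

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=8),        # information-gain-style selection
    StandardScaler(),
    SVC(kernel="rbf", C=10, gamma="scale"),
)
print(cross_val_score(clf, X, y, cv=5).mean())    # mean cross-validated accuracy
```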

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 235
26190 Pricing, Production and Inventory Policies in Manufacturing under Stochastic Demand and Continuous Prices

Authors: Masoud Rabbani, Majede Smizadeh, Hamed Farrokhi-Asl

Abstract:

We study the joint determination of prices and production over a multi-period horizon under general non-stationary stochastic demand with continuous prices. In some periods, production capacity must be increased to satisfy demand. This paper presents a model to aid multi-period production capacity planning by quantifying the trade-off between product quality and production cost. Product quality is estimated as the statistical variation from the target performance obtained from the output tolerances of the production machines that manufacture the components; different tolerances are considered for the different machines used to increase capacity. Production cost is estimated as the total cost of owning and operating a production facility during the planning horizon, so capacity planning carries a cost that impacts price. Pricing products is often difficult because customers have a reservation price, which affects both price and demand. We determine prices and production for the periods after capacity is enhanced and take the reservation price into account when setting prices. First, we use an algorithm based on a fuzzy set of the optimal objective-function values to determine the capacity plan, by maximizing the interval from the upper bound of the minimized objectives and defining weights for the objectives. Then we determine the inventory and pricing policies. A lemma allows the problem to be solved in MATLAB and an exact answer to be found.

Keywords: price policy, inventory policy, capacity planning, product quality, epsilon-constraint

Procedia PDF Downloads 569
26189 Algorithms Used in Spatial Data Mining GIS

Authors: Vahid Bairami Rad

Abstract:

Extracting knowledge from spatial data such as GIS data is important for reducing the data and extracting useful information. Therefore, the development of new techniques and tools that support humans in transforming data into useful knowledge has been the focus of the relatively new and interdisciplinary research area of knowledge discovery in databases. We introduce a set of database primitives, or basic operations, for spatial data mining which are sufficient to express most of the spatial data mining algorithms from the literature. This approach has several advantages: similar to the relational standard language SQL, the use of standard primitives will speed up the development of new data mining algorithms and also make them more portable. We introduce a database-oriented framework for spatial data mining based on the concepts of neighborhood graphs and paths. A small set of basic operations on these graphs and paths is defined as database primitives for spatial data mining. Furthermore, techniques to efficiently support the database primitives in a commercial DBMS are presented.
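
As an illustration of the neighborhood-graph idea (not the paper's actual primitives), the sketch below builds a graph from a simple distance relation and shows two basic operations, a neighbors lookup and a one-step path extension.

```python
# Illustrative sketch: spatial objects become nodes, and an edge exists whenever a
# neighborhood relation (here, distance below a threshold) holds between two objects.
import math
from collections import defaultdict

def build_neighborhood_graph(objects, relation):
    """objects: dict id -> (x, y); relation: predicate on two coordinate tuples."""
    graph = defaultdict(set)
    ids = list(objects)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if relation(objects[a], objects[b]):
                graph[a].add(b)
                graph[b].add(a)
    return graph

def neighbors(graph, node):       # the basic "neighbors" primitive
    return graph[node]

def extend_paths(graph, paths):   # extend each path by one neighboring node
    return [p + [n] for p in paths for n in graph[p[-1]] if n not in p]

within_200m = lambda p, q: math.dist(p, q) <= 200.0
g = build_neighborhood_graph({1: (0, 0), 2: (150, 10), 3: (500, 0)}, within_200m)
nbrs = neighbors(g, 1)            # {2}
paths = extend_paths(g, [[1]])    # all 2-node paths starting at object 1
```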

Keywords: spatial database, knowledge discovery in databases, data mining, spatial relationship, predictive data mining

Procedia PDF Downloads 460
26188 Development of Structural Deterioration Models for Flexible Pavement Using Traffic Speed Deflectometer Data

Authors: Sittampalam Manoharan, Gary Chai, Sanaul Chowdhury, Andrew Golding

Abstract:

The primary objective of this paper is to present a simplified approach to developing a structural deterioration model for flexible pavements using traffic speed deflectometer data. Maintaining assets to meet functional performance is not economical or sustainable in the long term and would end up requiring much greater investment from road agencies and extra costs for road users. Performance models have to include structural and functional predictive capabilities in order to assess needs and the time frame of those needs. As such, structural modelling plays a vital role in the prediction of pavement performance. Structural condition is important for predicting the remaining life and overall health of a road network and is also a major influence on the valuation of a road pavement. Therefore, the structural deterioration model is a critical input into a pavement management system for accurately predicting pavement rehabilitation needs. The Traffic Speed Deflectometer (TSD) is a vehicle-mounted Doppler laser system that is capable of continuously measuring the structural bearing capacity of a pavement whilst moving at traffic speeds. The device's high accuracy, high speed, and continuous deflection profiles are useful for network-level applications such as predicting road rehabilitation needs and remaining structural service life. The methodology adopted in this model utilizes time-series TSD maximum deflection (D0) data in conjunction with rutting, rutting progression, pavement age, subgrade strength and equivalent standard axle (ESA) data. Regression analyses were then undertaken to establish a correlation equation for structural deterioration as a function of rutting, pavement age, seal age and equivalent standard axles (ESA). This study developed a simple structural deterioration model which will enable available TSD structural data to be incorporated into a pavement management system for developing network-level pavement investment strategies. Therefore, the available funding can be used effectively to minimize the whole-of-life cost of the road asset and also improve pavement performance. This study will contribute to narrowing the knowledge gap in the use of structural data in network-level investment analysis and provide a simple methodology for using structural data effectively in the investment decision-making process for road agencies managing aging road assets.
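
A minimal sketch of the regression step described above; the input file, column names and linear functional form are assumptions for illustration, not the calibrated deterioration equation reported by the authors.

```python
# Hypothetical network-level extract: regress TSD maximum deflection D0 on the
# explanatory variables named in the abstract.
import pandas as pd
import statsmodels.api as sm

tsd = pd.read_csv("tsd_network_timeseries.csv")   # hypothetical file
X = tsd[["rutting", "pavement_age", "seal_age", "cumulative_esa"]]
y = tsd["d0_max_deflection"]                      # TSD maximum deflection D0

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())                            # coefficients of the deterioration equation
tsd["predicted_d0"] = model.predict(sm.add_constant(X))
```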

Keywords: adjusted structural number (SNP), maximum deflection (D0), equivalent standard axle (ESA), traffic speed deflectometer (TSD)

Procedia PDF Downloads 151
26187 Data Stream Association Rule Mining with Cloud Computing

Authors: B. Suraj Aravind, M. H. M. Krishna Prasad

Abstract:

There exist emerging applications of data streams that require association rule mining, such as network traffic monitoring, web click stream analysis, sensor data, data from satellites, etc. Data streams typically arrive continuously at high speed, with huge volumes and changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates resource limitations by using the concept of cloud computing. Its inclusion may lead to additional, as yet unknown problems which need further research.

Keywords: data stream, association rule mining, cloud computing, frequent itemsets

Procedia PDF Downloads 501
26186 Mobile Schooling for the Most Vulnerable Children on the Street: An Innovation

Authors: Md. Shakhawat Ullah Chowdhury

Abstract:

The mobile school is an innovative methodology in non-formal education that increases access to appropriate basic education for children during conflict through theatre for education. The continuous exposure to harsh environments and the nature of the lifestyles of children in conflict make them vulnerable. However, the mobile school initiative takes into consideration the mobile lifestyle of children in conflict. Schools are provided in the pocket areas where street children live, with portable chalkboards and tins of books and materials, and they move as the communities move. Teaching is multi-grade to ensure that all children in the community benefit. The established mobile schools focus on basic literacy and numeracy skills in line with the traditions of the communities. The school teachers are selected by the community and trained by a theatre activist. These teachers continue to live and move with the community and provide continuous education for children in conflict. The model proposes holistic team work to deliver education-focused services to the street children's pocket areas, with a mobile team consisting of three members: an educator (theatre worker), a psychological counsellor and a paramedic. The mobile team is responsible for educating street children and also performs dramas produced specifically on the basis of the national curriculum and awareness issues for street children. Children enjoy the plays and learn about life skills and basic literacy and numeracy, which may be a pillar of humanitarian aid during conflict.

Keywords: vulnerable, children in conflict, mobile schooling, child-friendly

Procedia PDF Downloads 433
26185 Test-Retest Agreement, Random Measurement Error and Practice Effect of the Continuous Performance Test-Identical Pairs for Patients with Schizophrenia

Authors: Kuan-Wei Chen, Chien-Wei Chen, Tai-Ling Chang, Nan-Cheng Chen, Ching-Lin Hsieh, Gong-Hong Lin

Abstract:

Background and Purposes: Deficits in sustained attention are common in patients with schizophrenia. Such impairment can limit patients' ability to effectively execute daily activities and affect the efficacy of rehabilitation. The aims of this study were to examine the test-retest agreement, random measurement error, and practice effect of the Continuous Performance Test-Identical Pairs (CPT-IP), a commonly used sustained attention test, in patients with schizophrenia. The results can provide empirical evidence for clinicians and researchers applying a sustained attention test with sound psychometric properties in schizophrenia patients. Methods: We recruited patients with chronic schizophrenia to be assessed twice, with a 1-week interval, using the CPT-IP. The intra-class correlation coefficient (ICC) was used to examine test-retest agreement. The percentage of minimal detectable change (MDC%) was used to examine random measurement error. Moreover, the standardized response mean (SRM) was used to examine the practice effect. Results: A total of 56 patients participated in this study. Our results showed that the ICC was 0.82, the MDC% was 47.4%, and the SRM was 0.36 for the CPT-IP. Conclusion: Our results indicate that the CPT-IP has acceptable test-retest agreement, substantial random measurement error, and a small practice effect in patients with schizophrenia. Therefore, to avoid overestimating patients' changes in sustained attention, we suggest that clinicians interpret the change scores of the CPT-IP conservatively in their routine repeated assessments.
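
For readers unfamiliar with the indices, the usual definitions behind MDC% and SRM are given below; these are the standard formulas, and the abstract does not spell out the authors' exact computational choices.

```latex
% Standard definitions of the indices named in the abstract.
\begin{align}
  \mathrm{SEM}      &= SD_{\text{baseline}}\sqrt{1-\mathrm{ICC}}, \\
  \mathrm{MDC}_{95} &= 1.96 \times \sqrt{2} \times \mathrm{SEM}, \qquad
  \mathrm{MDC}\%     = \frac{\mathrm{MDC}_{95}}{\overline{X}_{\text{baseline}}} \times 100\%, \\
  \mathrm{SRM}      &= \frac{\overline{d}}{SD_{d}},
      \quad\text{where } d \text{ is the test--retest change score.}
\end{align}
```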

Keywords: schizophrenia, sustained attention, CPT-IP, reliability

Procedia PDF Downloads 304
26184 Assessment of Level of Sedation and Associated Factors Among Intubated Critically Ill Children in the Pediatric Intensive Care Unit of Jimma University Medical Center: A Fourteen-Month Prospective Observational Study, 2023

Authors: Habtamu Wolde Engudai

Abstract:

Background: Sedation can be provided to facilitate a procedure or to stabilize patients admitted to the pediatric intensive care unit (PICU). Sedation is often necessary to maintain optimal care for critically ill children requiring mechanical ventilation. However, if sedation is too deep or too light, it has its own adverse effects, and hence it is important to monitor the level of sedation and maintain an optimal level. Objectives: The objective is to assess the level of sedation and associated factors among intubated critically ill children admitted to the PICU of JUMC, Jimma. Methods: A prospective observational study was conducted in the PICU of JUMC in September 2021 in 105 patients admitted to the PICU, aged less than 14 years and with GCS > 8. Data were collected by residents and nurses working in the PICU. Data entry was done with EpiData Manager (version 4.6.0.2). Statistical analysis and the creation of charts were performed using SPSS version 26. Data are presented as means, percentages and standard deviations. The assumptions of logistic regression were checked. To find potential predictors, bivariable logistic regression was used for each predictor and the outcome variable. A p value of <0.05 was considered statistically significant. Finally, findings are presented using figures, AORs, percentages, and a summary table. Result: In this study, 105 critically ill children who were started on continuous or intermittent sedative drugs were included. The sedation level was assessed using a comfort scale three times per day. Based on these observations, suboptimal sedation was found in 44.8% of assessments at baseline, 36.2% at eight hours, and 24.8% at sixteen hours. There is a significant association between suboptimal sedation and the duration of stay on mechanical ventilation and the rate of unplanned extubation (P < 0.05); the Hosmer-Lemeshow test indicated adequate goodness of fit (p > 0.44).

Keywords: level of sedation, critically ill children, pediatric intensive care unit, Jimma University

Procedia PDF Downloads 60
26183 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques

Authors: Tosin Ige

Abstract:

Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of his or her data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques, while at the same time highlighting and providing a new technique as a solution to an existing technique of privacy-preserving data mining. This paper also bridges the wide gap between data mining and the Web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.

Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique

Procedia PDF Downloads 172
26182 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make network access widely available. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 310
26181 Reliability-Centered Maintenance Application for the Development of Maintenance Strategy for a Cement Plant

Authors: Nabil Hameed Al-Farsi

Abstract:

This study's main goal is to develop a model and a maintenance strategy for a cement factory, the Arabian Cement Company, Rabigh Plant. The proposed work depends on the reliability-centered maintenance (RCM) approach to develop a strategy and maintenance schedule that ensures increasing the reliability of the production system components, thus ensuring continuous productivity. Cost-effective maintenance of the plant's dependability performance is the key goal of reliability-based maintenance. The cement plant process consists of 7 important steps; accordingly, developing a maintenance plan based on the RCM method is made up of 10 steps, starting from selecting units and data through to running and updating the model. The processing unit chosen for this case analysis is the calcination unit; for the model's validation, Travancore Titanium Products Ltd (TTP) was used, with the maintenance data history acquired from that company's maintenance department. After applying the proposed model, the results of the maintenance simulation justified reconsidering the plant's existing scheduled maintenance policy. The results indicate the need for preventive maintenance of all Class A criticality equipment instead of the planned maintenance, and breakdown maintenance for all other equipment depending on its criticality and an FMEA report. Consequently, the additional cost of preventive maintenance would be offset by the cost savings from breakdown maintenance for the remaining equipment.

Keywords: engineering, reliability, strategy, maintenance, failure modes, effects and criticality analysis (FMEA)

Procedia PDF Downloads 171
26180 Enhancing Healthcare Delivery in Low-Income Markets: An Exploration of Wireless Sensor Network Applications

Authors: Innocent Uzougbo Onwuegbuzie

Abstract:

Healthcare delivery in low-income markets is fraught with numerous challenges, including limited access to essential medical resources, inadequate healthcare infrastructure, and a significant shortage of trained healthcare professionals. These constraints lead to suboptimal health outcomes and a higher incidence of preventable diseases. This paper explores the application of Wireless Sensor Networks (WSNs) as a transformative solution to enhance healthcare delivery in these underserved regions. WSNs, comprising spatially distributed sensor nodes that collect and transmit health-related data, present opportunities to address critical healthcare needs. Leveraging WSN technology facilitates real-time health monitoring and remote diagnostics, enabling continuous patient observation and early detection of medical issues, especially in areas with limited healthcare facilities and professionals. The implementation of WSNs can enhance the overall efficiency of healthcare systems by enabling timely interventions, reducing the strain on healthcare facilities, and optimizing resource allocation. This paper highlights the potential benefits of WSNs in low-income markets, such as cost-effectiveness, increased accessibility, and data-driven decision-making. However, deploying WSNs involves significant challenges, including technical barriers like limited internet connectivity and power supply, alongside concerns about data privacy and security. Moreover, robust infrastructure and adequate training for local healthcare providers are essential for successful implementation. It further examines future directions for WSNs, emphasizing innovation, scalable solutions, and public-private partnerships. By addressing these challenges and harnessing the potential of WSNs, it is possible to revolutionize healthcare delivery and improve health outcomes in low-income markets.

Keywords: wireless sensor networks (WSNs), healthcare delivery, low-income markets, remote patient monitoring, health data security

Procedia PDF Downloads 36
26179 Semantic Data Schema Recognition

Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia

Abstract:

The subject covered in this paper aims at assisting users in their data quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source. This enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.

Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns

Procedia PDF Downloads 418
26178 Tree Dress and the Internet of Living Things

Authors: Vibeke Sorensen, Nagaraju Thummanapalli, J. Stephen Lansing

Abstract:

Inspired by the indigenous people of Borneo, Indonesia and their traditional bark cloth, artist and professor Vibeke Sorensen executed a “digital unwrapping” of several trees in Southeast Asia using a digital panorama camera and digitally “stitched” them together for printing onto sustainable silk and fashioning into the “Tree Dress”. This dress is a symbolic “un-wrapping” and “re-wrapping” of the tree’s bark onto a person as a second skin. The “digital bark” is directly responsive to the real tree through embedded and networked electronics that connect in real-time to sensors at the physical site of the living tree. LEDs and circuits inserted into the dress display the continuous measurement of the O2 / CO2, temperature, humidity, and light conditions at the tree. It is an “Internet of Living Things” (IOLT) textile that can be worn to track and interact with it. The computer system connecting the dress and the tree converts the gas emission data at the site of the real tree into sound and music as sonification. This communicates not only the scientific data but also translates it into a poetic representation. The wearer of the garment can symbolically identify with the tree, or “become one” with it by adorning its “skin.” In this way, the wearer also becomes a human agent for the tree, bringing its actual condition to direct perception of the wearer and others who may engage it. This project is an attempt to bring greater awareness to issues of deforestation by providing a direct access to living things separated by physical distance, and hopefully, to increase empathy for them by providing a way to sense individual trees and their daily existential condition through remote monitoring of data. Further extensions to this project and related issues of sustainability include the use of recycled and alternative plant materials such as bamboo and air plants, among others.

Keywords: IOLT, sonification, sustainability, tree, wearable technology

Procedia PDF Downloads 138
26177 Enhancing Higher Education Teaching and Learning Processes: Examining How Lecturer Evaluation Makes a Difference

Authors: Daniel Asiamah Ameyaw

Abstract:

This research investigates how lecturer evaluation makes a difference in enhancing higher education teaching and learning processes. The research questions guiding this work are, first, "What are the perspectives on the difference made by evaluating academic teachers in order to enhance higher education teaching and learning processes?" and second, "What are the implications of the findings for policy and practice?" Data for this research were collected mainly through interviews and partly through document review. Data analysis was conducted within the framework of grounded theory. The findings showed that, at the individual lecturer level, lecturer evaluation provides continuous improvement of teaching strategies and serves as a source of data for research on teaching. At the individual student level, it enhances the students' learning process, serves as a source of information for course selection by students, and makes students feel recognised in the educational process. At the institutional level, lecturer evaluation is useful in personnel and management decision-making; it assures stakeholders of quality teaching and learning by setting up standards for lecturers; and it enables institutions to identify skill requirements and needs as a basis for organising workshops. At the national level, lecturer evaluation is useful in guaranteeing the competencies of graduates, who then provide the manpower the nation needs. Besides, resource allocation to higher education institutions is based largely on the quality of the programmes being run by each institution. The researcher concluded that the findings have implications for policy and practice; therefore, higher education managers are expected to ensure that policy is implemented as planned by policy-makers so that the objectives can be successfully achieved.

Keywords: academic quality, higher education, lecturer evaluation, teaching and learning processes

Procedia PDF Downloads 143
26176 Access Control System for Big Data Application

Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud

Abstract:

Access control systems (ACs) are some of the most important components in safety areas. Inaccuracies in regulatory frameworks make personal policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software for controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the diffusion of Big Data domains, since it is crucial to secure the data provided to data consumers (DC). We present a general access control circulation strategy for the Big Data domain, describing the benefit of using designated access control for BD units and its performance, and taking into consideration the needs of BD and AC systems. We then present a generic Big Data access control system to improve the dissemination of Big Data.

Keywords: access control, security, Big Data, domain

Procedia PDF Downloads 134
26175 Effect of Transmission Distance on the Performance of Hybrid Configuration Using Non Return to Zero (NRZ) Pulse Format

Authors: Mais Wa'ad

Abstract:

The effect of transmission distance on the performance of the hybrid configuration H 10-40 Gb/s with the Non-Return to Zero (NRZ) pulse format, 100 GHz channel spacing, and a Multiplexer/De-Multiplexer Bandwidth (MUX/DEMUX BW) of 60 GHz has been investigated in this study. The laser Continuous Wave (CW) power launched into the modulator is set to 4 dBm. Eight neighboring DWDM channels selected around 1550.12 nm, carrying different data rates in the hybrid optical communication system, travel through the same optical fiber and use the same passive and active optical modules. The simulation has been done using Optiwave Inc. Optisys software. Usually, increasing distance leads to a decrease in performance; however, this is not always the case, as the simulations conducted in this work show different system performance for each channel. This is due to differences in the interaction between dispersion and non-linearity, and to the differences in residual dispersion for each channel.

Keywords: dispersion and non-linearity interaction, optical hybrid configuration, multiplexer/de multiplexer bandwidth, non-return to zero, optical transmission distance, optisys

Procedia PDF Downloads 559
26174 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment

Authors: Michael Gidey Gebru

Abstract:

Most Data Envelopment Analysis models operate in a static environment with input and output parameters given by deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always gathered precisely in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to expand crisp Data Envelopment Analysis into Data Envelopment Analysis with a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. Then, the Data Envelopment Analysis model with a fuzzy environment is solved using a multi-objective method to gauge the Decision Making Units' efficiency. Finally, the developed Data Envelopment Analysis model is illustrated with an application on real data from 50 educational institutions.
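
For orientation, the crisp input-oriented CCR multiplier model that a fuzzy DEA generalises is shown below; the triangular-fuzzy representation of each datum is indicated in the comment, while the authors' exact multi-objective formulation is not reproduced here.

```latex
% Crisp input-oriented CCR multiplier model for DMU o; in the fuzzy extension each
% datum is replaced by a triangular number $\tilde{x}_{ij}=(x^l_{ij},x^m_{ij},x^u_{ij})$.
\begin{align}
  \max_{u,v}\ \theta_o &= \sum_{r} u_r\, y_{ro} \\
  \text{s.t.}\quad & \sum_{i} v_i\, x_{io} = 1, \\
  & \sum_{r} u_r\, y_{rj} - \sum_{i} v_i\, x_{ij} \le 0 \quad \forall j, \\
  & u_r \ge 0,\ v_i \ge 0 .
\end{align}
```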

Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output

Procedia PDF Downloads 57
26173 Inclusion Body Refolding at High Concentration for Large-Scale Applications

Authors: J. Gabrielczyk, J. Kluitmann, T. Dammeyer, H. J. Jördening

Abstract:

High-level expression of proteins in bacteria often causes production of insoluble protein aggregates, called inclusion bodies (IB). They contain mainly one type of protein and offer an easy and efficient way to obtain purified protein. On the other hand, proteins in IB are normally devoid of function and therefore need a special treatment to become active. Most refolding techniques aim at diluting the solubilizing chaotropic agents. Unfortunately, optimal refolding conditions have to be found empirically for every protein. For large-scale applications, a simple refolding process with high yields and high final enzyme concentrations is still missing. The constructed plasmid pASK-IBA63b containing the sequence of fructosyltransferase (FTF, EC 2.4.1.162) from Bacillus subtilis NCIMB 11871 was transformed into E. coli BL21 (DE3) Rosetta. The bacterium was cultivated in a fed-batch bioreactor. The produced FTF was obtained mainly as IB. For refolding experiments, five different amounts of IBs were solubilized in urea buffer with protein concentrations of 0.2-8.5 g/L. Solubilizates were refolded with batch or continuous dialysis. The refolding yield was determined by measuring the protein concentration of the clear supernatant before and after the dialysis. Particle size was measured by dynamic light scattering. We tested the solubilization properties of the fructosyltransferase IBs. The particle size measurements revealed that solubilization of the aggregates is achieved at a urea concentration of 5 M or higher, which was confirmed by absorption spectroscopy. All results confirm previous investigations showing that refolding yields depend on the initial protein concentration. The yields dropped from 67% to 12% in batch dialysis and from 72% to 19% in continuous dialysis as the initial concentration rose from 0.2 to 8.5 g/L. Often-used additives such as sucrose and glycerol had no effect on refolding yields. Buffer screening indicated a significant increase in activity, and also in temperature stability, of FTF with citrate/phosphate buffer. By adding citrate to the dialysis buffer, we were able to increase the refolding yields to 82-47% in batch and 90-74% in the continuous process. Further experiments showed that, in general, the higher ionic strength of the buffers had a major impact on refolding yields; doubling the buffer concentration increased the yields up to threefold. Finally, we achieved correspondingly high refolding yields while reducing the chamber volume, and the amount of buffer needed, by 75%. The refolded enzyme had an optimal activity of 12.5 ± 0.3 × 10⁴ units/g. However, detailed experiments with the native FTF revealed a reaggregation of the molecules and a loss in specific activity depending on the enzyme concentration and particle size. For that reason, we are currently focusing on developing a process of simultaneous enzyme refolding and immobilization. The results of this study show a new approach to finding optimal refolding conditions for inclusion bodies at high concentrations. Straightforward buffer screening and an increase of the ionic strength can improve the refolding yield of the target protein by 400%. Gentle removal of the chaotrope with continuous dialysis increases the yields by an additional 65%, independent of the refolding buffer applied. In general, time is the crucial parameter for successful refolding of solubilized proteins.

Keywords: dialysis, inclusion body, refolding, solubilization

Procedia PDF Downloads 294
26172 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and actual duration of construction projects remains a continuous, open problem for the construction sector. This paper addresses the problem with modern methods and data available from past public construction projects. 39 bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project's attributes together with the actual cost and the actual duration, correlation analysis was performed and the most appropriate predictive project variables were defined. Additionally, the most efficient subgroup of variables was selected with the use of the WEKA application, through its attribute selection function. The selected variables were then used as input neurons for the neural network models. For constructing the neural network models, the application FANN Tool was used. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
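
A minimal sketch of the prediction step, using scikit-learn's MLPRegressor as a stand-in for the FANN Tool network; the two inputs follow the abstract (budgeted cost and deck concrete quantity), but the file name, scaling and architecture are assumptions.

```python
# Hypothetical data file with the 39 bridge projects; a small scaled MLP predicts
# actual cost (or actual duration) from the two selected input neurons.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

bridges = pd.read_csv("greek_bridge_projects.csv")        # hypothetical, 39 rows
X = bridges[["budgeted_cost", "deck_concrete_quantity"]]  # selected input neurons
y = bridges["actual_cost"]                                # or "actual_duration"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
net = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(8,),
                                                   max_iter=5000, random_state=0))
net.fit(X_tr, y_tr)
print(mean_squared_error(y_te, net.predict(X_te)))        # hold-out error estimate
```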

Keywords: actual cost and duration, attribute selection, bridge construction, neural networks, predicting models, FANN Tool, WEKA

Procedia PDF Downloads 134
26171 Neck Thinning Dynamics of Janus Droplets under Multiphase Interface Coupling in Cross Junction Microchannels

Authors: Jiahe Ru, Yan Pang, Zhaomiao Liu

Abstract:

Necking processes during Janus droplet generation in cross-junction microchannels are experimentally and theoretically investigated. The two dispersed phases, which are simultaneously sheared by the continuous phase, are liquid paraffin wax and 100 cSt silicone oil; an 80% glycerin aqueous solution is used as the continuous phase. According to the variation of the minimum neck width and thinning rate, the necking process is divided into two stages: two-dimensional extrusion and three-dimensional extrusion. In the two-dimensional extrusion stage, the evolution of the tip extension length for the two dispersed phases begins with the same trend, and then the length for the liquid paraffin becomes larger than that for the silicone oil. The upper and lower neck interface profiles in the Janus necking process are asymmetrical when the tip extension velocity of the paraffin oil is greater than that of the silicone oil. In the three-dimensional extrusion stage, the neck of the liquid paraffin lags behind that of the silicone oil because of its higher surface tension, and finally the necking fracture positions gradually synchronize. When the Janus droplets pinch off, the interfacial tension becomes a positive contribution that drives the neck thinning. The interface coupling of the three phases can cause asymmetric necking of the neck interface, which affects the necking time and, ultimately, the droplet volume. This paper mainly investigates the thinning dynamics of the liquid-liquid interface in confined microchannels. The revealed results could help to enhance the physical understanding of the droplet generation phenomenon.

Keywords: neck interface, interface coupling, Janus droplets, multiphase flow

Procedia PDF Downloads 128