Search results for: minimum data set
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26544


23394 A Real Time Development Study for Automated Centralized Remote Monitoring System at Royal Belum Forest

Authors: Amri Yusoff, Shahrizuan Shafiril, Ashardi Abas, Norma Che Yusoff

Abstract:

Illegal logging causes serious damage to our forests, contributing to flash floods, landslides, global warming, and other problems, and its causes are often discovered only after the damage is done. Even the Malaysian Royal Belum forest has not been spared from land clearing and illegal activity by locals, although the area has been gazetted as a protected reserve for future generations. Because the forest covers such a large area, these illegal activities are difficult to monitor and control, and critical action is needed to prevent their recurrence. This study therefore develops a remote monitoring device that captures critical real-time data, such as temperature, humidity, gas concentration, and fire and rain detection, which indicate the current natural state of the forest habitat. The device's location is determined via GPS, and its latitude, longitude, and sensor readings are transmitted in real time by SMS over the GSM network for data management and analysis. The results will benefit monitoring bodies and relevant authorities in preserving the forest's natural habitat. The collected data are also unified and analysed for comparison with an existing method.

Keywords: remote monitoring system, forest data, GSM, GPS, wireless sensor

Procedia PDF Downloads 417
23393 Exploring the Inter-firm Collaborating and Supply Chain Innovation in the Pharmaceutical Industry

Authors: Fatima Gouiferda

Abstract:

Uncertainty and competitiveness are making firms' environments more complicated. Competition is moving to the supply chain level, and firms need to collaborate and innovate to survive. In the current economy, joint efforts between organizations and the mutual development of new capabilities are key resources for gaining collaborative advantage and enhancing supply chain performance. The purpose of this paper is to explore the collaboration practices that exist in the pharmaceutical industry of Morocco and to examine how these practices affect supply chain performance. The exploration is based on the interpretivist research paradigm. Data were collected through semi-structured interviews with supply chain practitioners and analyzed with the Iramuteq software to explore the themes of the study. The findings include a descriptive analysis produced by the Iramuteq processing, as well as a content analysis of the themes extracted from the interviews.

Keywords: inter-firm relationships, collaboration, supply chain innovation, Morocco

Procedia PDF Downloads 63
23392 Beam Coding with Orthogonal Complementary Golay Codes for Signal to Noise Ratio Improvement in Ultrasound Mammography

Authors: Y. Kumru, K. Enhos, H. Köymen

Abstract:

In this paper, we report experimental results on using complementary Golay coded signals at 7.5 MHz to detect breast microcalcifications of 50 µm size. Simulations using complementary Golay coded signals show perfect consistency with the experimental results, confirming the improved signal-to-noise ratio of complementary Golay coding. To improve the detection of microcalcifications, orthogonal complementary Golay sequences with minimal cross-correlation, and hence minimal interference, are used as coded signals and compared with tone-burst pulses of equal energy in terms of resolution under weak-signal conditions. The measurements are conducted using an experimental ultrasound research scanner, the Digital Phased Array System (DiPhAS) with 256 channels, and a phased-array transducer with a 7.5 MHz center frequency; the experimental results are validated with the Field-II simulation software. In addition, to investigate the resolution advantage of coded signals, a multipurpose tissue-equivalent phantom containing a series of monofilament nylon targets, 240 µm in diameter, and cyst-like objects with an attenuation of 0.5 dB/(MHz·cm) is used in the experiments. Ultrasound images of the monofilament nylon targets were obtained to evaluate resolution. Simulation and experimental results show that closely positioned small targets can be differentiated with increased success by using coded excitation under very weak signal conditions.
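As an illustration of the property this coding scheme exploits, the following sketch (not the authors' actual 7.5 MHz implementation) builds a complementary Golay pair by the standard recursion and checks that the pair's autocorrelations sum to a delta function, i.e. all range sidelobes cancel:

```python
import numpy as np

def extend_golay(a, b):
    """One step of the standard Golay recursion: (a, b) -> (a|b, a|-b)."""
    return np.concatenate([a, b]), np.concatenate([a, -b])

# Start from the length-2 complementary pair and grow to length 8.
a, b = np.array([1, 1]), np.array([1, -1])
for _ in range(2):
    a, b = extend_golay(a, b)

# Defining property: the autocorrelations of the pair sum to a delta of
# height 2N at zero lag -- the range sidelobes cancel exactly.
acf_sum = np.correlate(a, a, "full") + np.correlate(b, b, "full")
print(acf_sum)  # zero everywhere except 2N = 16 at the zero-lag position
```

This sidelobe cancellation is what allows coded excitation to raise SNR without degrading axial resolution.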

Keywords: coded excitation, complementary golay codes, DiPhAS, medical ultrasound

Procedia PDF Downloads 263
23391 Grid and Market Integration of Large Scale Wind Farms using Advanced Predictive Data Mining Techniques

Authors: Umit Cali

Abstract:

The integration of intermittent energy sources such as wind farms into the electricity grid has become an important challenge for the operation and control of electric power systems because of the fluctuating behaviour of wind power generation. Wind power predictions improve the economic and technical integration of large amounts of wind energy into the existing electricity grid. Trading, balancing, grid operation, controllability, and safety issues all increase the importance of predicting the power output of wind farms; wind power forecasting systems therefore have to be integrated into the monitoring and control systems of the transmission system operator (TSO) and of wind farm operators and traders. Wind forecasts are relatively precise only over a horizon of a few hours and are therefore most relevant to the spot and intraday markets. In this work, predictive data mining techniques are applied to identify statistical and neural network models, or sets of models, that can predict the power output of large onshore and offshore wind farms. These advanced data analytic methods help us distill very large meteorological, oceanographic, and SCADA data sets into useful information and manageable systems. Accurate wind power forecasts benefit wind plant operators, utility operators, and utility customers: an accurate forecast allows grid operators to schedule economically efficient generation to meet customer demand. The study also gives in-depth consideration to issues such as the comparison of day-ahead and short-term wind power forecasting results, the determination of wind power prediction accuracy, and the evaluation of the energy-economic and technical benefits of wind power forecasting.
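As a minimal illustration of the kind of statistical model described, a power-curve regression can be sketched as follows; the synthetic data stand in for the large SCADA and meteorological archives the paper uses, and all numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: wind speed (m/s) from a weather forecast and
# the farm's measured output (MW), following a roughly cubic power curve.
speed = rng.uniform(3.0, 15.0, 500)
power = 0.08 * speed**3 + rng.normal(0.0, 2.0, 500)

# Minimal statistical model: regress output on a cubic polynomial of speed.
coeffs = np.polyfit(speed, power, deg=3)

def forecast_power(s):
    """Point forecast of farm output (MW) for forecast wind speed s (m/s)."""
    return np.polyval(coeffs, s)

print(forecast_power(10.0))  # near the generating value 0.08 * 10**3 = 80 MW
```

A production system would replace the polynomial with the neural network or model-set the abstract mentions, but the train-then-forecast structure is the same.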

Keywords: renewable energy sources, wind power, forecasting, data mining, big data, artificial intelligence, energy economics, power trading, power grids

Procedia PDF Downloads 518
23390 A Practice Model for Quality Improvement in Concrete Block Mini Plants Based on Merapi Volcanic Sand

Authors: Setya Winarno

Abstract:

Due to the abundance of Merapi volcanic sand in Yogyakarta, many local people use it for the mass production of concrete blocks in mini plants, although their products are low in quality. This paper presents a practice model for quality improvement in this situation, in order to meet current customer demand for good-quality construction material. The research method was a techno-economic evaluation through laboratory tests and interviews. Twenty samples of existing concrete blocks made by local producers averaged only 19.4 kg/cm2 in compressive strength, below the minimum Indonesian standard of 25 kg/cm2. Repeated laboratory testing showed that, to meet the standard, the water-cement ratio of the mix should not exceed 0.64 by weight, and the sand content should not exceed 9 parts per 1 part of Portland cement by volume. On this basis, the production cost was Rp 1,820 per concrete block, compared with the normal competitive market price of Rp 2,000. The resulting model specifies (a) a maximum water-cement ratio of 0.64, (b) a maximum sand-to-cement proportion of 9:1, (c) a basic price of about Rp 1,820, and (d) strategies to win the competitive market for mass-produced concrete blocks: focusing on quality, building relationships with consumers, responding rapidly to customer needs, innovating continuously through product diversification, promoting on social media, and managing finances strictly.
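The mix-design limits reported above can be expressed as a simple check; the function and the sample quantities below are illustrative, not from the study:

```python
def check_mix(water_kg, cement_kg, sand_parts, cement_parts=1):
    """Check a block mix against the study's limits:
    water/cement <= 0.64 by weight, sand:cement <= 9:1 by volume."""
    wc_ok = water_kg / cement_kg <= 0.64
    sand_ok = sand_parts / cement_parts <= 9
    return wc_ok and sand_ok

# A mix at exactly the limiting proportions passes the check.
print(check_mix(water_kg=32, cement_kg=50, sand_parts=9))   # 32/50 = 0.64

# A wetter, leaner mix (the kind that gave ~19.4 kg/cm2) fails it.
print(check_mix(water_kg=40, cement_kg=50, sand_parts=11))  # 40/50 = 0.80
```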

Keywords: concrete block, good quality, improvement model, diversification

Procedia PDF Downloads 515
23389 Communication of Sensors in Clustering for Wireless Sensor Networks

Authors: Kashish Sareen, Jatinder Singh Bal

Abstract:

The use of wireless sensor networks (WSNs) has grown vastly in the last decade, highlighting the crucial need for scalable, energy-efficient routing, data gathering, and aggregation protocols in large-scale environments. WSNs have emerged as an important computing platform and continue to spread into diverse areas, providing new opportunities for networking and services. However, the energy constraints and limited computing resources of the sensor nodes present major challenges in gathering data. The sensors collect data about their surroundings and forward it to a command centre through a base station. The past few years have witnessed increased interest in the potential use of WSNs, as they are very useful in target detection and other applications. Hierarchical clustering protocols have been the approach most widely used to improve overall system lifetime, scalability, and energy efficiency. This paper surveys the state of the art in hierarchical clustering approaches for large-scale WSN environments.
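As one example of the family of hierarchical clustering protocols surveyed, the election rule of the classic LEACH protocol (an illustrative choice; the abstract's DLCC/MLCC schemes differ in detail) can be sketched as:

```python
import random

def leach_threshold(p, rnd):
    """LEACH cluster-head election threshold for round `rnd`,
    with desired cluster-head fraction p (classic formula)."""
    return p / (1 - p * (rnd % int(1 / p)))

def elect_cluster_heads(node_ids, p=0.1, rnd=0, seed=42):
    """Each node independently becomes a head with probability T(n),
    so the energy-hungry head role rotates across rounds."""
    rng = random.Random(seed)
    t = leach_threshold(p, rnd)
    return [n for n in node_ids if rng.random() < t]

heads = elect_cluster_heads(range(100), p=0.1, rnd=0)
print(len(heads))  # about p * N heads in expectation, here ~10 of 100
```

Rotating the head role this way is what lets hierarchical schemes extend network lifetime compared with flat routing.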

Keywords: clustering, DLCC, MLCC, wireless sensor networks

Procedia PDF Downloads 482
23388 Democratic Political Culture of the 5th and 6th Graders under the Authority of Dusit District Office, Bangkok

Authors: Vilasinee Jintalikhitdee, Phusit Phukamchanoad, Sakapas Saengchai

Abstract:

This research studies the level of democratic political culture, and the factors that affect it, among 5th and 6th graders in schools under the authority of the Dusit District Office, Bangkok. Stratified sampling (probability sampling) and purposive sampling (non-probability sampling) were used to distribute questionnaires to 300 respondents, covering all schools under the authority of the Dusit District Office. The data were analyzed using descriptive statistics, including the arithmetic mean and standard deviation, and inferential statistics, namely the independent-samples t-test and one-way ANOVA (F-test). Data were also collected by interviewing the target groups and analyzed descriptively. The results show that, overall, the 5th and 6th graders exhibit a high level of democratic political culture. Considering each item, the statement with the highest mean was "the constitutional democratic governmental system is suitable for Thailand", and the statement with the lowest mean was "corruption (cheating and fraud) is normal in Thai society". The factors that affect democratic political culture are grade level, mother's occupation, and attention to news and political movements.
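The inferential statistics mentioned above can be illustrated with a one-way ANOVA on hypothetical questionnaire scores; the numbers below are invented for the sketch and are not the study's data:

```python
from scipy import stats

# Hypothetical mean Likert scores (1-5) for three respondent groups,
# mirroring the paper's F-test across, e.g., mothers' occupations.
group_a = [4.1, 3.8, 4.4, 4.0, 3.9]
group_b = [3.2, 3.5, 3.1, 3.6, 3.3]
group_c = [4.0, 4.2, 3.9, 4.1, 4.3]

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f_stat, p_value)  # a small p-value means the group means differ
```

An independent-samples t-test (`stats.ttest_ind`) would be used in the same way when comparing exactly two groups.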

Keywords: democratic, political culture, political movements, democratic governmental system

Procedia PDF Downloads 266
23387 Effectiveness of Catalysis in Ozonation for the Removal of Herbizide 2,4 Dichlorophenoxyacetic Acid from Contaminated Water

Authors: S. Shanthi

Abstract:

Catalyzed oxidation processes show great promise for application in many areas of wastewater treatment. Advanced oxidation processes are an emerging technology that can be applied to specific objectives in wastewater treatment. This work addresses the removal of a refractory organic compound, 2,4-dichlorophenoxyacetic acid (2,4-D), a common water pollutant. All studies were done in batch mode in a constantly stirred reactor. Alternative ozonation processes, catalysed either by transition metals or by granular activated carbon, were investigated for the degradation of organics. The catalytic ozonation schemes studied were homogeneous catalytic ozonation, based on ozone activation by transition-metal ions present in aqueous solution, and heterogeneous catalytic ozonation in the presence of granular activated carbon (GAC). The results show that heterogeneous catalytic ozonation using GAC favours the ozonation of 2,4-D, increasing the ozonation rate and giving much higher degradation of the substrate in a given time. In contrast, Fe2+ and Fe3+ ions decreased the degradation rate of 2,4-D, acting as negative catalysts. With the GAC catalyst, the solution concentration decreased significantly during the initial 5 minutes of contact as the pollutant was first adsorbed; thereafter the substrate began to oxidize, and ozonation became the dominant treatment process. The exhausted GAC was found to be regenerated in situ. The maximum reduction of the substrate was achieved in the shortest time when the GAC catalyst was employed.
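Assuming pseudo-first-order kinetics, a common simplification for ozonation of a trace organic, the faster degradation with the GAC catalyst can be sketched with hypothetical rate constants (the values below are illustrative, not measured in the study):

```python
import math

def remaining_fraction(k, t):
    """Fraction of 2,4-D remaining after time t (min) under assumed
    pseudo-first-order kinetics: C/C0 = exp(-k * t)."""
    return math.exp(-k * t)

# Hypothetical rate constants (1/min): GAC-catalysed ozonation vs ozone alone.
k_gac, k_o3 = 0.15, 0.05

for t in (0, 10, 20, 30):
    print(t, round(remaining_fraction(k_o3, t), 3),
             round(remaining_fraction(k_gac, t), 3))
```

A threefold larger rate constant translates into a much shorter time to reach any given percentage reduction, consistent with the trend the abstract reports.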

Keywords: ozonation, homogeneous catalysis, heterogeneous catalysis, granular activated carbon

Procedia PDF Downloads 250
23386 Saudi Human Awareness Needs: A Survey in How Human Causes Errors and Mistakes Leads to Leak Confidential Data with Proposed Solutions in Saudi Arabia

Authors: Amal Hussain Alkhaiwani, Ghadah Abdullah Almalki

Abstract:

Human error has increasingly become a major factor in security breaches affecting confidential data; most cyber data breaches are caused by human errors. With a single individual's mistake, an attacker can gain access to the entire network and bypass the implemented access controls without immediate detection, and unaware employees are vulnerable to social engineering cyber-attacks. Providing security awareness to people is part of an organization's protection process: cyber risks cannot be reduced by technology alone, and human security awareness significantly reduces risk by encouraging changes in staff cyber behaviour. In this paper, we focus on human awareness and the level of continuing security education required; we review human errors and introduce a proposed solution to prevent breaches from recurring. Saudi Arabia has recently faced many attacks using different social engineering methods. As Saudi Arabia has become a target for many countries and individuals, a defense mechanism that begins with awareness is needed to preserve privacy and protect confidential data against possible targeted attacks.

Keywords: cybersecurity, human aspects, human errors, human mistakes, security awareness, Saudi Arabia, security program, security education, social engineering

Procedia PDF Downloads 160
23385 The Quality of Food and Drink Product Labels Translation from Indonesian into English

Authors: Rudi Hartono, Bambang Purwanto

Abstract:

The translation quality of food and drink labels from Indonesian into English is poor: the translations are inaccurate, unnatural, and difficult to read. Such label translations can be found on the can packaging of food and drink products produced and marketed by several companies in Indonesia. If this problem is left unchecked, it will lead to misunderstanding of the translation results and confuse consumers. This study was conducted to analyze the translation errors on food and drink product labels and to formulate solutions for better translation quality. The research design was evaluation research with a holistic criticism approach. The data were words, phrases, and sentences translated from Indonesian into English and printed on food and drink product labels. The data were processed using the Interactive Model of Analysis, which involves three main steps: collecting, classifying, and verifying data. The data were then analyzed using content analysis to assess the accuracy, naturalness, and readability of the translations. The results show that the translation quality of food and drink product labels from Indonesian into English reaches 60% for accuracy, 50% for naturalness, and 60% for readability. These findings call for an effective strategy for translating food and drink product labels in the future.

Keywords: translation quality, food and drink product labels, a holistic criticism approach, interactive model, content analysis

Procedia PDF Downloads 374
23384 Influence of Counterface and Environmental Conditions on the Lubricity of Multilayer Graphene Coatings Produced on Nickel by Chemical Vapour Deposition

Authors: Iram Zahra

Abstract:

The friction and wear properties of multilayer graphene (MLG) coatings on a nickel substrate were investigated at the macroscale, and the different failure mechanisms operating at the nickel-graphene interface were evaluated. The multilayer graphene coatings were produced on nickel using the atmospheric chemical vapour deposition (CVD) technique. Wear tests were performed on a pin-on-disk tribometer under dry air and lubricated with saltwater solution, distilled water, and mineral oil; the counterparts were made of stainless steel, chromium, and silicon nitride. The test parameters, rotational speed, wear-track diameter, temperature, relative humidity, and load, were 60 rpm, 6 mm, 22 °C, 45%, and 2 N, respectively. To analyse the friction and wear behaviour, coefficient of friction (COF) versus time curves were plotted, and the sliding surfaces of the samples and counterparts were examined under an optical microscope. The results indicated that graphene-coated nickel, in mineral-oil lubrication and under dry conditions, gave the minimum average COF (0.05) and wear-track width (~151 µm) against the three types of counterparts. In contrast, uncoated nickel samples showed the maximum wear-track width (~411 µm) and COF (0.5). Overall, the graphene-coated samples showed a COF two times lower and wear three times lower than the bare nickel samples, and mechanical failures were significantly fewer in the graphene-coated case. These findings suggest that multilayer graphene coatings drastically decrease wear and friction on nickel at the macroscale under various lubricating conditions and against different counterparts.

Keywords: friction, lubricity, multilayer graphene, sliding, wear

Procedia PDF Downloads 140
23383 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from the safety and economic points of view, as any slope failure leads to loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficient. A supervised machine learning approach has been utilized for making accurate and reliable predictions of slope stability based on the value of the Factor of Safety. Numerous cases were analyzed using the popular Finite Element Method, and the data thus obtained were used as training data for the supervised machine learning models. The input data were trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data were used for measuring the performance and accuracy of the models. Although all models performed well on the test dataset, Random Forest stands out due to its high accuracy of greater than 95%, providing a valuable tool that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
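A minimal sketch of this workflow, substituting a toy analytic surrogate for the paper's FEM-derived Factor of Safety data (the features, coefficients, and thresholds below are all assumptions for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical inputs: degree of saturation (%), rainfall intensity (mm/h),
# horizontal seismic coefficient.
X = np.column_stack([
    rng.uniform(0, 100, n),
    rng.uniform(0, 50, n),
    rng.uniform(0, 0.3, n),
])

# Toy surrogate for the Factor of Safety: stability worsens with
# saturation, rainfall, and seismic loading.
fos = 1.8 - 0.006 * X[:, 0] - 0.008 * X[:, 1] - 1.5 * X[:, 2]
y = (fos >= 1.0).astype(int)  # 1 = stable, 0 = unstable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))  # high held-out accuracy on this easy problem
```

In the paper's setup, each row would instead be one FEM run, with the label taken from the computed Factor of Safety.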

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 101
23382 A Patent Trend Analysis for Hydrogen Based Ironmaking: Identifying the Technology’s Development Phase

Authors: Ebru Kaymaz, Aslı İlbay Hamamcı, Yakup Enes Garip, Samet Ay

Abstract:

The use of hydrogen as a fuel is important for decreasing carbon emissions. For the steel industry, reducing carbon emissions has become one of the most important agendas globally, and because of the Paris Agreement requirements, the European steel industry is working on green steel production. Although many literature reviews have analyzed hydrogen-based ironmaking from a technological perspective, very few studies have focused on patents covering the decarbonization of the steel industry. Hence, this study focuses on the technological progress of hydrogen-based ironmaking and on understanding the main trends through patent data. All available patent data were collected from Questel Orbit, and a trend analysis of more than 900 patent documents was carried out using Questel Orbit Intellixir to analyze this large body of data for scientific intelligence.

Keywords: hydrogen based ironmaking, DRI, direct reduction, carbon emission, steelmaking, patent analysis

Procedia PDF Downloads 145
23381 A Study on the Importance and Contributions of Transforming from OEM to ODM in Malaysian Furniture Industry

Authors: Nurul Ain Haron, Saiful Hazmi Bachek, Hafez Zainudin

Abstract:

This study aims to establish the importance and contribution of Original Design Manufacturing (ODM) in the Malaysian furniture industry and to close the gap between the players in the industry. The study confirms that today's furniture industry favours an Original Equipment Manufacturing (OEM) basis, leaving the local industry so short of furniture-design expertise that it cannot compete. The study methods consisted of literature reviews, observation, questionnaires, and interview sessions. The results show that the public has little or no knowledge of the Original Design Manufacturing (ODM) concept and its impact on the industry's future. Manufacturers, however, prefer the OEM concept because of its easy and fast investment returns without the need for a product-design process, while interviews with the relevant authority revealed convincing commitments to promote awareness through training and seminars. The findings show that the push to achieve ODM status needs extensive cooperation from many parties: the authorities, furniture manufacturers, designers, and a public that still perceives locally made goods as inferior. The current mindset of OEM manufacturers needs to change for the industry's future. As Malaysia's standard of living constantly rises, so will salary demands; if these issues are not resolved, international OEM buyers will shift to lower-cost manufacturers such as China or Taiwan, and when the orders stop, today's OEM manufacturers will no longer exist.

Keywords: design manufacturing, furniture design, original design manufacturing, original equipment manufacturing

Procedia PDF Downloads 445
23380 Solving the Economic Load Dispatch Problem Using Differential Evolution

Authors: Alaa Sheta

Abstract:

Economic Load Dispatch (ELD) is one of the vital optimization problems in power system planning. Solving an ELD problem means finding the best mixture of power outputs across all generating units of the power system network such that the total fuel cost is minimized while the operating limits are satisfied across all dispatch phases. Many optimization techniques have been proposed to solve this problem. A well-known one is Quadratic Programming (QP). QP is very simple and fast, but, like other gradient-based methods, it can become trapped in local minima and cannot handle complex nonlinear functions. A number of metaheuristic algorithms have also been used, such as Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). In this paper, another metaheuristic search algorithm, Differential Evolution (DE), is used to solve the ELD problem in power system planning. The practicality of the proposed DE-based algorithm is verified on three- and six-generator test cases, and the results are compared to existing results based on QP, GAs, and PSO. The results show that Differential Evolution is superior in finding a combination of power outputs that satisfies the problem constraints and minimizes the total fuel cost. DE was found to converge quickly to the optimal generation loads and to handle the nonlinearity of the ELD problem. The proposed DE solution minimizes the cost of generated power, minimizes the total transmission power loss, and maximizes the reliability of the power provided to the customers.
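A hedged sketch of DE applied to a small ELD instance; the three-generator cost coefficients and limits below are illustrative stand-ins, not the paper's test cases, and the power-balance constraint is handled with a simple penalty term:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical 3-generator system: quadratic fuel cost a + b*P + c*P^2 ($/h).
a = np.array([500.0, 400.0, 200.0])
b = np.array([5.3, 5.5, 5.8])
c = np.array([0.004, 0.006, 0.009])
p_min = [100.0, 100.0, 50.0]   # MW limits per unit
p_max = [450.0, 350.0, 225.0]
demand = 800.0                 # MW; transmission losses neglected in this sketch

def total_cost(p):
    """Fuel cost plus a penalty for violating the power balance."""
    return np.sum(a + b * p + c * p**2) + 1e3 * abs(np.sum(p) - demand)

result = differential_evolution(total_cost, bounds=list(zip(p_min, p_max)),
                                seed=1, maxiter=2000, tol=1e-10)
print(result.x)    # dispatch for each unit (MW), summing to the demand
print(result.fun)  # total cost, near the equal-incremental-cost optimum
```

For these coefficients the equal-incremental-cost solution is P = (400, 250, 150) MW at about $6,682/h, which DE recovers without needing gradients.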

Keywords: economic load dispatch, power systems, optimization, differential evolution

Procedia PDF Downloads 282
23379 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Predication of Future Data

Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill

Abstract:

Health-care management systems are of great interest because they offer simple and fast management of all aspects relating to a patient, not only medical ones. Moreover, there are more and more pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques; with an ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets using algorithms and techniques drawn from statistics, machine learning, and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted by forecasting the independent variables, and forecast control should be used to establish whether the accuracy of a forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences for a new product; these models provide a platform on which product developers can choose the engineering characteristics that will satisfy consumer preferences before developing the product. Recent analyses show that such fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model against a linear regression model on exponential health-care data.
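A minimal sketch of fitting an exponential regression model, the kind of model the abstract proposes to test, on a hypothetical health-care time series (all data and parameters below are invented for illustration):

```python
import numpy as np

# Hypothetical exponentially growing series, e.g. monthly case counts.
t = np.arange(12)
rng = np.random.default_rng(3)
y = 50 * np.exp(0.18 * t) * np.exp(rng.normal(0.0, 0.03, 12))

# Exponential regression y = A * exp(B * t), fitted by ordinary least
# squares on log(y) (a standard linearisation of the exponential model).
B, logA = np.polyfit(t, np.log(y), 1)
A = np.exp(logA)

def forecast(t_future):
    """Extrapolate the fitted exponential model to a future time."""
    return A * np.exp(B * t_future)

print(A, B)          # close to the generating parameters 50 and 0.18
print(forecast(12))  # one-step-ahead prediction
```

Comparing this model's residuals with those of a plain linear fit on the same data is the essence of the "exponential vs linear" test the abstract proposes.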

Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function

Procedia PDF Downloads 279
23378 Evaluation of Symptoms, Laboratory Findings, and Natural History of IgE Mediated Wheat Allergy

Authors: Soudeh Tabashi, Soudabeh Fazeli Dehkordy, Masood Movahedi, Nasrin Behniafard

Abstract:

Introduction: Food allergy has increased over the last three decades. Since wheat is a major constituent of daily meals in many regions of the world, wheat allergy is one of the most important allergies, ranking among the eight most common types of food allergy. Our information about the epidemiology and etiology of food allergies is limited; therefore, in this study we sought to evaluate the symptoms and laboratory findings in children with wheat allergy. Materials and methods: Twenty-three patients aged up to 18 years with a diagnosis of IgE-mediated wheat allergy were enrolled in this study. Using a questionnaire, we collected their information and organized it into four categories: demographic data, signs and symptoms, comorbidities, and laboratory data. Patients were then followed up for 6 months, and their laboratory data were compared. Results: Most of the patients (82%) presented symptoms of wheat allergy in the first year of life. The skin and the respiratory system were the most commonly involved organs, with incidences of 86% and 78%, respectively. Most patients with wheat allergy were also sensitive to other foods, sensitivity to egg being the most common (47%). In 57% of patients, IgE levels decreased during the 6-month follow-up period. Conclusion: We do not have enough data on the epidemiology of wheat allergy and its response to therapy, and to the best of our knowledge no study has addressed this issue in Iran so far. This study is the first source of information about IgE-mediated wheat allergy in Iran, and it can provide an opening for future studies of wheat allergy and its treatment.

Keywords: wheat allergy, food allergy, IgE

Procedia PDF Downloads 194
23377 Development of a Low-Cost Smart Insole for Gait Analysis

Authors: S. M. Khairul Halim, Mojtaba Ghodsi, Morteza Mohammadzaheri

Abstract:

Gait analysis is essential for diagnosing musculoskeletal and neurological conditions; however, current methods are often complex and expensive. This paper introduces a methodology for analysing gait parameters using a smart insole with a built-in accelerometer. The system measures stance time, swing time, step count, and cadence, and wirelessly transmits the data to a user-friendly IoT dashboard for centralized processing. This setup enables remote monitoring and advanced data analytics, making it a versatile tool for medical diagnostics and everyday use. Integration with IoT enhances the portability and connectivity of the device, allowing secure, encrypted data access over the Internet. This feature supports telemedicine and enables personalized treatment plans tailored to individual needs. Overall, the approach provides a cost-effective (about 25 GBP), accurate, and user-friendly solution for gait analysis, facilitating remote tracking and customized therapy.
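A hedged sketch of one step of such an analysis, deriving step count and cadence from a simulated accelerometer trace via peak detection; the sampling rate, signal shape, and thresholds are assumptions, not the paper's firmware:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100  # Hz, assumed insole accelerometer sampling rate
t = np.arange(0, 10, 1 / fs)

# Hypothetical vertical acceleration (g): one heel-strike peak per step,
# at a steady 2 steps per second.
accel = 1.0 + 0.8 * np.sin(2 * np.pi * 2.0 * t)

# Treat each prominent peak as a heel strike; the minimum peak distance
# rejects spurious double-counts within 0.3 s of a step.
peaks, _ = find_peaks(accel, height=1.5, distance=int(0.3 * fs))
step_count = len(peaks)
cadence = step_count / (t[-1] - t[0]) * 60  # steps per minute

print(step_count, round(cadence))  # 20 steps, ~120 steps/min for this trace
```

Stance and swing times would come from the same trace by timing the intervals between heel-strike and toe-off events rather than just counting peaks.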

Keywords: gait analysis, IoT, smart insole, accelerometer sensor

Procedia PDF Downloads 17
23376 A Geospatial Analysis of Diminishing Himalayan Ice Under Influence of Anthropomorphism: A Case Study of Himalayan Ice From 1990 to 2020 in Pakistan

Authors: Ali Akber Khan

Abstract:

In the 21st century, freshwater resources, especially ice cover, are of grave significance, as ice holds most of the world's total freshwater. The Himalayas are one of the biggest sources of fresh water for Pakistan. The Himalayan and neighbouring mountain regions studied include Swat, Chitral, Upper Dir, Lower Dir, Mardan, Swabi, Haripur, Abbottabad, Muzaffarabad, Neelum, and Mansehra in Pakistan. The study examines ice resources from 1990 to 2020 and shows a decrease in snow-covered area from 72,187.54 sq. km in 1990 to 66,061.17 sq. km in 2020, a total ice-cover loss of 6,126.37 sq. km over these 30 years due to environmental variability and climatic change. From 2010 to 2020 alone, the loss of ice-covered area was 3,479.24 sq. km. The mean maximum temperatures from 2000 to 2010 in December, January, and February were 7.4 °C, 4.2 °C, and 7.8 °C, respectively, while from 2011 to 2022 the mean maximum temperatures registered in December, January, and February were 6.9 °C, 4.1 °C, and 6.6 °C, respectively. Investigation of anthropogenic factors in the region shows population growth: among 22 cities and towns of the Himalayan region and neighbouring mountains, the population rise ranged from a minimum of 14.39% to a maximum of 329.46%, and 12 towns grew by more than 100% from 1990 to 2023. This examination contributes to a detailed understanding of the relationships among natural factors, population dynamics, and snow-cover variation, and informs adaptive measures for sustained and resilient development in these districts.
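The reported figures can be checked with simple arithmetic, which also yields the relative loss and the mean annual rate over the 1990-2020 interval:

```python
# Ice-cover figures reported in the study (sq. km).
cover_1990 = 72187.54
cover_2020 = 66061.17

loss = cover_1990 - cover_2020          # total loss over the interval
pct_loss = loss / cover_1990 * 100      # relative loss in percent
annual_rate = loss / 30                 # mean loss per year, 1990-2020

print(round(loss, 2), round(pct_loss, 2), round(annual_rate, 2))
# 6126.37 sq. km lost, about 8.49% of the 1990 cover, ~204 sq. km per year
```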

Keywords: snow, ice, Himalayas, Pakistan, climate change, population

Procedia PDF Downloads 47
23375 Load Balancing Technique for Energy - Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: it is required to distribute the dynamic workload across multiple nodes so that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This motivates new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time, and improving performance, but none of them have considered energy consumption and carbon emission. In this paper, we introduce a technique for energy efficiency. This energy-efficient load balancing technique can improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
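As a toy illustration of energy-aware placement (this is not the technique proposed in the paper), the sketch below assigns each incoming task to the node whose projected energy cost after accepting the task is lowest; the node fields such as `watts_per_unit` are hypothetical:

```python
def assign_tasks(tasks, nodes):
    """Greedy energy-aware load balancing (illustrative sketch only).

    Each node carries a current load and a power coefficient; each
    incoming task is placed on the node whose projected energy cost
    after accepting the task is lowest, spreading work so that no
    single node is overloaded.
    """
    for task in tasks:
        best = min(nodes, key=lambda n: (n["load"] + task) * n["watts_per_unit"])
        best["load"] += task
    return nodes

# Two hypothetical nodes: n2 draws twice the power per unit of load
nodes = [{"name": "n1", "load": 0.0, "watts_per_unit": 1.0},
         {"name": "n2", "load": 0.0, "watts_per_unit": 2.0}]
assign_tasks([4.0, 4.0, 4.0], nodes)
```

The greedy rule naturally favours the power-efficient node until its accumulated load makes the inefficient node cheaper, which is the intuition behind trading off utilization against energy consumption.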

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 449
23374 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients

Authors: Soha A. Bahanshal, Byung G. Kim

Abstract:

Identification of patients at high risk of hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a prediction model to predict hospital readmission for diabetic patients within 30 days of discharge. The core of the prediction model is a modified k-Nearest Neighbor algorithm called the Hybrid Fuzzy Weighted k-Nearest Neighbor algorithm. The prediction is performed on a patient dataset consisting of more than 70,000 patients with 50 attributes. We applied data preprocessing using different techniques in order to handle data imbalance and to fuzzify the data to suit the prediction algorithm. The model so far achieves a classification accuracy of 80%, compared to other models that use only k-Nearest Neighbor.
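The classifier at the core of the model is a modification of k-Nearest Neighbor. A minimal sketch of a distance-weighted fuzzy vote, in the spirit of the classical fuzzy kNN of Keller et al. (the paper's hybrid algorithm is more elaborate and includes preprocessing and fuzzification steps not shown here), might look like:

```python
import math

def fuzzy_weighted_knn(train, query, k=3, m=2.0):
    """Distance-weighted fuzzy kNN vote (illustrative sketch).

    `train` is a list of (features, label) pairs. Each of the k nearest
    neighbours votes for its label with weight 1/d^(2/(m-1)); the class
    memberships are normalised to sum to one, and the class with the
    highest membership is returned alongside the membership map.
    """
    dists = sorted((math.dist(x, query), y) for x, y in train)[:k]
    weights = {}
    for d, y in dists:
        w = 1e12 if d == 0 else d ** (-2.0 / (m - 1.0))
        weights[y] = weights.get(y, 0.0) + w
    total = sum(weights.values())
    memberships = {y: w / total for y, w in weights.items()}
    return max(memberships, key=memberships.get), memberships

# Hypothetical 2-D example: two "no readmission" and two "readmission" points
train = [((0.0, 0.0), "no"), ((0.0, 1.0), "no"),
         ((5.0, 5.0), "yes"), ((6.0, 5.0), "yes")]
label, memberships = fuzzy_weighted_knn(train, (0.2, 0.2))
```

The fuzzy memberships, unlike a hard majority vote, express how confidently a patient resembles the readmitted group, which is useful when the classes are imbalanced.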

Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission

Procedia PDF Downloads 186
23373 The Comparison of Joint Simulation and Estimation Methods for the Geometallurgical Modeling

Authors: Farzaneh Khorram

Abstract:

This paper endeavors to construct a block model to assess grinding energy consumption (CCE) and pinpoint blocks with the highest potential for energy usage during the grinding process within a specified region. Leveraging geostatistical techniques, particularly joint estimation or simulation based on geometallurgical data from various mineral processing stages, our objective is to forecast CCE across the study area. The dataset encompasses variables obtained from 2,754 drill samples and a block model comprising 4,680 blocks. The initial analysis encompassed exploratory data examination, variography, multivariate analysis, and the delineation of geological and structural units. Subsequent analysis involved the assessment of contacts between these units and the estimation of CCE via cokriging, considering its correlation with SPI. The selection of blocks exhibiting maximum CCE holds paramount importance for cost estimation, production planning, and risk mitigation. The study conducted exploratory data analysis on lithology, rock type, and failure variables, revealing seamless boundaries between geometallurgical units. Simulation methods, such as plurigaussian and turning bands, demonstrated more realistic outcomes compared to cokriging, owing to the inherent characteristics of geometallurgical data and the limitations of kriging methods.

Keywords: geometallurgy, multivariate analysis, plurigaussian, turning band method, cokriging

Procedia PDF Downloads 70
23372 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach

Authors: Eric Mxolisi Mkhondo

Abstract:

Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to the naked eye. They also tend to cause unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in a system is tolerance analysis, a quantitative tool for predicting the tolerance variations which are defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts under environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system's specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary component geometries and sizes while conforming to functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to operating temperatures. The method is used to evaluate the nominal condition and the worst-case conditions at the maximum and minimum dimensions of assembled components. These three conditions are evaluated under specific operating temperatures (-40 °C, -18 °C, 4 °C, 26 °C, 48 °C, and 70 °C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.
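A worst-case stack-up with a simple thermal-expansion term can be sketched as follows. This is illustrative only; the part lengths, tolerances, coefficients of thermal expansion, and the 50 K temperature rise below are hypothetical values, not figures from the zoom-lens case study:

```python
def worst_case_stack(parts, delta_t=0.0):
    """Worst-case tolerance stack-up with a simple thermal term.

    Each part is (nominal_mm, tol_mm, cte_per_K). The nominal dimension
    is the sum of nominals; the worst-case limits add every tolerance
    in the unfavourable direction, plus the thermal expansion term
    alpha * L * delta_T for each part.
    """
    nominal = sum(L for L, t, a in parts)
    tol = sum(t for L, t, a in parts)
    thermal = sum(a * L * delta_t for L, t, a in parts)
    return nominal + thermal - tol, nominal + thermal, nominal + thermal + tol

parts = [(10.0, 0.02, 23e-6),   # hypothetical aluminium spacer
         (5.0, 0.01, 12e-6)]    # hypothetical steel shim
lo, nom, hi = worst_case_stack(parts, delta_t=50.0)  # e.g. 26 C -> 76 C
```

Evaluating the same stack at each operating temperature in the list above shows how the nominal, minimum, and maximum conditions shift with thermal deformation.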

Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism

Procedia PDF Downloads 165
23371 Deriving Generic Transformation Matrices for Multi-Axis Milling Machine

Authors: Alan C. Lin, Tzu-Kuan Lin, Tsong Der Lin

Abstract:

This paper proposes a new method to find the equations of the transformation matrix relating the rotation angles of the two rotational axes and the coordinates of the three linear axes of an orthogonal multi-axis milling machine. This approach provides intuitive physical meanings for the rotation angles of multi-axis machines, which can be used to evaluate the accuracy of the conversion from CL (cutter location) data to NC data.
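For an A-C configuration (one common orthogonal layout; the paper derives generic matrices, so this is only one illustrative instance, not the authors' derivation), the elementary rotation matrices and their application to the tool axis might be written as:

```python
import math

def rot_a(angle):
    """Rotation about the X axis (A axis) as a 4x4 homogeneous matrix."""
    c, s = math.cos(angle), math.sin(angle)
    return [[1, 0, 0, 0],
            [0, c, -s, 0],
            [0, s, c, 0],
            [0, 0, 0, 1]]

def rot_c(angle):
    """Rotation about the Z axis (C axis) as a 4x4 homogeneous matrix."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0, 0],
            [s, c, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]

def matmul(a, b):
    """Product of two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(m, p):
    """Apply a 4x4 homogeneous transform to a 3-D point or direction."""
    v = p + [1.0]
    return [sum(m[i][k] * v[k] for k in range(4)) for i in range(3)]

# Tool axis [0, 0, 1] tilted by A = 45 deg, then rotated by C = 30 deg
tool = apply(matmul(rot_c(math.radians(30)), rot_a(math.radians(45))),
             [0.0, 0.0, 1.0])
```

Inverting such a product, i.e. solving for A and C given a desired tool-axis direction, is exactly the step in CL-to-NC conversion whose accuracy the paper's method helps evaluate.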

Keywords: CAM, multi-axis milling machining, transformation matrix, rotation angles

Procedia PDF Downloads 482
23370 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models

Authors: Manisha Mukherjee, Diptarka Saha

Abstract:

Reliable forecasts of univariate time series data are often necessary in several contexts, and ARIMA models are quite popular among practitioners in this regard. Hence, choosing correct parameter values for ARIMA is a challenging yet imperative task. Thus, a stepwise algorithm is introduced to provide automatic and robust estimates for the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. This process focuses on improving the overall quality of the estimates, and it alleviates the problems induced by the unidimensional nature of currently used methods such as auto.arima. The fast and automated search of the parameter space also ensures reliable estimates of the parameters that possess several desirable qualities, consequently resulting in higher test accuracy, especially in the case of noisy data. After vigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods but also completely obviates the need for human intervention due to its automated nature.
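The stepwise idea can be skeletonized as a local search over the six-dimensional parameter grid. The sketch below is a generic hill-climb, not the authors' algorithm; the `score` callable stands in for fitting a seasonal ARIMA and returning an information criterion such as AICc (lower is better):

```python
import itertools

def stepwise_search(score, start=(1, 0, 1, 0, 0, 0), max_order=2):
    """Skeleton of a stepwise (p, d, q, P, D, Q) search (illustrative).

    From the current best parameter tuple, every one-step neighbour
    (each parameter +/- 1 within [0, max_order]) is scored; the search
    moves whenever a neighbour improves and stops at a local optimum.
    """
    best, best_score = start, score(start)
    improved = True
    while improved:
        improved = False
        for i, delta in itertools.product(range(6), (-1, 1)):
            cand = list(best)
            cand[i] += delta
            cand = tuple(cand)
            if all(0 <= v <= max_order for v in cand):
                s = score(cand)
                if s < best_score:
                    best, best_score, improved = cand, s, True
    return best, best_score

# Toy score: squared distance to a known optimum, standing in for AICc
target = (2, 1, 1, 1, 1, 0)
best, best_s = stepwise_search(lambda t: sum((a - b) ** 2
                                             for a, b in zip(t, target)))
```

Because each move refits only a handful of neighbouring models rather than the full grid, the search stays fast while exploring all six parameters jointly instead of one dimension at a time.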

Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function

Procedia PDF Downloads 166
23369 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty

Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos

Abstract:

Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adapt to human land use as it unfolds. A cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and a development threat index from the NASA Socioeconomic Data and Applications Center. This simulation is used to model uncertainty in the problem. This research leverages state-of-the-art techniques in the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers like Gurobi or CPLEX. Numerical results based on real data for the jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN used in practice for biodiversity conservation. Our method may better help guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.
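A single update step of a cellular-automaton development model can be sketched as below. This is a toy version for intuition only; the paper's simulation is driven by climate data, land characteristics, and the NASA SEDAC development threat index, and the threshold rule here is hypothetical:

```python
def step_development(grid, threat, threshold=2):
    """One synchronous step of a toy cellular-automaton land-use model.

    grid[r][c] is 1 if a cell is developed. An undeveloped cell becomes
    developed when its threat index reaches `threshold` and at least one
    of its eight neighbours is already developed, modelling the spatial
    spread of human land use.
    """
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                continue
            nbrs = sum(grid[rr][cc]
                       for rr in range(max(0, r - 1), min(rows, r + 2))
                       for cc in range(max(0, c - 1), min(cols, c + 2))
                       if (rr, cc) != (r, c))
            if nbrs >= 1 and threat[r][c] >= threshold:
                nxt[r][c] = 1
    return nxt

# One developed seed cell; development spreads into high-threat neighbours
grid = [[1, 0, 0], [0, 0, 0], [0, 0, 0]]
threat = [[0, 3, 0], [3, 0, 0], [0, 0, 0]]
nxt = step_development(grid, threat)
```

Running many such trajectories yields an ensemble of possible development futures, which is the uncertainty set the robust optimization model plans against.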

Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning

Procedia PDF Downloads 210
23368 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will change drastically due to access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things developments that further automate farm operations. Dairy farms have embraced technological innovations and accumulated vast amounts of permanent data streams during the past decade; however, the integration of this information to improve whole-farm management and decision-making does not yet exist. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real-time for practical and relevant environmental and economic actions. The systems developed, based on machine learning and artificial intelligence, need to be connected for useful output, a better understanding of the whole farming enterprise, and its environmental impact. Evolutionary computing can be very effective in finding optimal combinations of decision options and, ultimately, in determining strategy. The system of the future should be able to manage a dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming, as well as improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide insight into the state-of-the-art of big data applications and evolutionary computing in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: big data, evolutionary computing, cloud, precision technologies

Procedia PDF Downloads 189
23367 Hydrothermal Energy Application Technology Using Dam Deep Water

Authors: Yooseo Pang, Jongwoong Choi, Yong Cho, Yongchae Jeong

Abstract:

The climate crisis, including environmental problems related to energy supply, is an emerging issue, so the use of renewable energy is essential to solve these problems, which are principally addressed by the Paris Agreement, the international treaty on climate change. The government of the Republic of Korea announced that the key long-term goal of its low-carbon strategy is "carbon neutrality by 2050". Attention is focused on the role of internet data centers (IDCs), which manage large amounts of data, such as artificial intelligence (AI) and big data workloads arising from the 4th industrial revolution. The cooling system market for IDCs was worth about 9 billion US dollars in 2020, and 15.6% growth a year is expected in Korea. It is important to control the temperature in an IDC with an efficient air conditioning system, and hydrothermal energy is one of the best options for saving energy in the cooling system. In order to save energy and optimize the operating conditions, the application of a dam deep-water air conditioning system has been considered. Deep water drawn from a specific level of the dam can supply a constant water temperature year-round. The amount of energy saved will be tested and analyzed with a pilot plant that has a 100 RT cooling capacity. A further target of this project is a PUE (Power Usage Effectiveness) of 1.2, which is the key parameter for checking the efficiency of the cooling system.
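The PUE target quoted above follows the standard definition (total facility power divided by IT equipment power). A trivial check, with hypothetical loads:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power over IT power.

    A PUE of 1.0 would mean every watt reaches the IT load; the project
    target stated above is 1.2.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical example: 120 kW total facility draw for 100 kW of IT load
target_met = pue(120.0, 100.0) <= 1.2
```

Free cooling with dam deep water reduces the chiller contribution to the numerator, which is how the system drives PUE toward the 1.2 target.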

Keywords: hydrothermal energy, HVAC, internet data center, free-cooling

Procedia PDF Downloads 81
23366 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam

Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen

Abstract:

In order to simulate and reproduce the operational characteristics of a dam visually, it is necessary to capture the displacement at different measurement points and analyze the observed movement data promptly to forecast dam safety. The accuracy of forecasts is further improved by applying machine learning methods to the data analysis process. In this study, the horizontal displacement monitoring data of the Ialy hydroelectric dam (Vietnam) were analysed with three machine learning algorithms: Gaussian processes (GP), multi-layer perceptron neural networks (MLP), and the M5-Rules algorithm, for modelling and forecasting the horizontal displacement of the dam. The database used in this research was built by collecting time series data from 2006 to 2021 and was divided into two parts: a training dataset and a validating dataset. The final results show that all three algorithms have high performance for both training and model validation, but the MLP is the best model. Their usability is further investigated by comparison with benchmark models created by multi-linear regression. The results show that the performance obtained from the GP, MLP, and M5-Rules models is much better; therefore, these three models should be used to analyze and predict the horizontal displacement of the dam.

Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perception neural networks

Procedia PDF Downloads 210
23365 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets

Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe

Abstract:

Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with the necessary access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research that leverages existing high-performance computing resources and analysis techniques currently available or in development. It builds these into The Ark, an open-source web-based system designed to manage medical data. 
SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e. importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets, and enabling cutting edge analysis approaches that have previously been out of reach for many medical researchers.

Keywords: biomedical research, genomics, information systems, software

Procedia PDF Downloads 270