Search results for: stream computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1589

1229 D-Wave Quantum Computing Ising Model: A Case Study for Forecasting of Heat Waves

Authors: Dmytro Zubov, Francesco Volponi

Abstract:

In this paper, the D-Wave quantum computing Ising model is used for forecasting positive extremes of daily mean air temperature. Forecast models are designed with two to five qubits, which represent 2-, 3-, 4-, and 5-day historical data respectively. The Ising model’s real-valued weights and dimensionless coefficients are calculated using daily mean air temperatures from 119 places around the world, as well as sea level data (Aburatsu, Japan). In comparison with current methods, this approach is better suited to predicting heat wave values because it does not require the estimation of a probability distribution from scarce observations. The proposed quantum computing forecast algorithm is simulated on a traditional computer architecture, with combinatorial optimization of the Ising model parameters, for the Ronald Reagan Washington National Airport dataset with 1-day lead time on the learning sample (1975-2010). Analysis of the forecast accuracy (ratio of successful predictions to total number of predictions) on the validation sample (2011-2014) shows that the Ising model with three qubits has 100% accuracy, which is quite significant compared to other methods. However, the number of identified heat waves is small (only one out of nineteen in this case). The other models, with 2, 4, and 5 qubits, have 20%, 3.8%, and 3.8% accuracy respectively. The presented three-qubit forecast model is applied to the prediction of heat waves at five other locations: Aurel Vlaicu, Romania (accuracy 28.6%); Bratislava, Slovakia (21.7%); Brussels, Belgium (33.3%); Sofia, Bulgaria (50%); Akhisar, Turkey (21.4%). These predictions are not ideal, but they are not zero either; they can be used independently or together with predictions generated by other methods. The loss of human life, as well as environmental, economic, and material damage, from extreme air temperatures could be reduced if some heat waves are predicted. Even a small success rate implies a large socio-economic benefit.
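
As an illustration of the core idea, the sketch below clamps spins that encode recent daily temperature anomalies and reads the forecast off the minimum-energy configuration of a tiny Ising model, found by exhaustive search as in a classical simulation of the annealer. The field weights `h`, couplings `J`, and the sign-based clamping rule are illustrative assumptions, not the paper's fitted values.

```python
import itertools

def ising_energy(spins, h, J):
    """Energy of an Ising configuration: E = -sum_i h_i s_i - sum_{i<j} J_ij s_i s_j."""
    n = len(spins)
    e = -sum(h[i] * spins[i] for i in range(n))
    e -= sum(J[i][j] * spins[i] * spins[j] for i in range(n) for j in range(i + 1, n))
    return e

def forecast(history, h, J):
    """Clamp qubits to the signs of recent temperature anomalies and read the
    forecast off the minimum-energy value of the single free (forecast) qubit."""
    clamped = [1 if t > 0 else -1 for t in history]        # n-day history -> n spins
    best = min(itertools.product([-1, 1], repeat=1),
               key=lambda s: ising_energy(clamped + list(s), h, J))
    return best[0] == 1                                     # True -> heat-wave day predicted

# illustrative 3-qubit model: 2 history spins + 1 forecast spin (hypothetical weights)
h = [0.1, 0.3, 0.2]
J = [[0, 0.5, 0.4], [0, 0, 0.9], [0, 0, 0]]
print(forecast([1.2, 2.5], h, J))
```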

Keywords: heat wave, D-wave, forecast, Ising model, quantum computing

Procedia PDF Downloads 476
1228 Impact of Environmental Changes on Blood Parameters in the Pelophylax ridibundus

Authors: Murat Tosunoglu, Cigdem Gul, Nurcihan Hacioglu, Nurdan Tepeova

Abstract:

Amphibian and reptilian species are influenced by pollution and habitat destruction, and the blood parameters of amphibian species are particularly affected by negative environmental conditions. The studied frog samples, 36 clinically normal Pelophylax ridibundus individuals, were captured along the Biga Stream between April and June 2014. When comparing our findings with Turkish legislation (the water pollution control regulation), the first locality of the Biga Stream was classified in terms of total coliform as "high quality water" (coliform: 866.66 MPN/100 mL), while the second locality was classified as "contaminated water" (coliform: 53266.66 MPN/100 mL). Blood samples of the live specimens were obtained in the laboratory within one day of capture. The blood samples were taken from the etherized frogs by means of ventriculus punctures, via heparinized hematocrit capillaries. Hematological and biochemical values for high quality water and contaminated water, respectively, are as follows: red blood cell count (444210.52-426846.15 per cubic millimeter of blood), white blood cell count (4215.78-4684.61 per cubic millimeter of blood), hematocrit value (29.25-29.43%), hemoglobin concentration (7.76-7.22 g/dl), mean corpuscular volume (637.64-719.99 fl), mean corpuscular hemoglobin (184.78-174.75 pg), mean corpuscular hemoglobin concentration (29.44-24.82%), glucose (103.74-124.13 mg/dl), urea (87.68-81.72 mg/L), cholesterol (148.20-197.39 mg/dl), creatinine (0.29-0.28 mg/dl), uric acid (10.26-7.55 mg/L), albumin (1.13-1.39 g/dl), calcium (11.45-9.70 mg/dl), triglyceride (135.23-155.85 mg/dl), total protein (4.26-3.73 g/dl), phosphorus (6.83-17.86 mg/dl), and magnesium (0.95-1.06 mg/dl). Some of the hematological parameters of P. ridibundus specimens are reported for the first time in this study. No water-quality-dependent variation was observed in the clinical hematology parameters measured.

Keywords: Pelophylax ridibundus, hematological parameters, biochemistry, freshwater quality

Procedia PDF Downloads 347
1227 Audio Information Retrieval in Mobile Environment with Fast Audio Classifier

Authors: Bruno T. Gomes, José A. Menezes, Giordano Cabral

Abstract:

With the popularity of smartphones, mobile apps have emerged to meet diverse needs; however, the resources at their disposal are limited, either by the hardware, due to low computing power, or by the software, which does not have the same robustness as the desktop environment. For example, automatic audio classification (AC) tasks, a subarea of musical information retrieval (MIR), require fast processing and a good success rate, but the mobile platform has limited computing power and the best AC tools are only available for desktop. To solve these problems, the fast classifier adapts the most widespread MIR technologies to mobile environments, seeking a balance between speed and robustness. In the end, we found that it is possible to enjoy the best of MIR in mobile environments. This paper presents the results obtained and the difficulties encountered.
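
As a concrete example of the kind of lightweight pipeline such a classifier implies, the sketch below extracts compact MFCC features and trains a small nearest-neighbour model that is cheap enough for mobile inference. The file names and labels are hypothetical; this is a minimal sketch of the MIR technologies referenced, not the authors' actual classifier.

```python
import numpy as np
import librosa
from sklearn.neighbors import KNeighborsClassifier

def mfcc_features(path, n_mfcc=13):
    """Mean MFCC vector: a compact, cheap-to-compute audio descriptor."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# hypothetical labelled clips: (file, genre)
train = [("rock1.wav", "rock"), ("jazz1.wav", "jazz"), ("rock2.wav", "rock")]
X = np.array([mfcc_features(f) for f, _ in train])
y = [label for _, label in train]

clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)   # tiny model, fast inference
print(clf.predict([mfcc_features("query.wav")]))
```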

Keywords: audio classification, audio extraction, mobile environment, musical information retrieval

Procedia PDF Downloads 515
1226 A Novel Approach to Design and Implement Context Aware Mobile Phone

Authors: G. S. Thyagaraju, U. P. Kulkarni

Abstract:

Context-aware computing refers to a general class of computing systems that can sense their physical environment and adapt their behaviour accordingly. Context-aware computing makes systems aware of situations of interest, enhances services to users, automates systems, and personalizes applications. Context-aware services have been introduced into mobile devices such as PDAs and mobile phones. In this paper, we present a novel approach to realizing a context-aware mobile phone. The context-aware mobile phone (CAMP) proposed in this paper senses the user's situation automatically and provides the services required by the user's context. The proposed system is developed using artificial intelligence techniques, namely Bayesian networks, fuzzy logic, and rough-set-theory-based decision tables: a Bayesian network classifies incoming calls (high-priority, low-priority, and unknown calls), fuzzy linguistic variables and membership degrees define the context situations, and decision-table-based rules drive service recommendation. To exemplify and demonstrate the effectiveness of the proposed methods, the context-aware mobile phone is tested for a college campus scenario including different locations such as the library, classroom, meeting room, administrative building, and college canteen.
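
To make the fuzzy side of such a pipeline concrete, the sketch below fuzzifies two context inputs with triangular membership functions and resolves them through a small decision table. The variables, membership breakpoints, and profile names are illustrative assumptions, not the paper's actual rule base.

```python
def triangular(x, a, b, c):
    """Triangular membership function for a fuzzy linguistic variable."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_context(noise_db, hour):
    """Fuzzify two context inputs and look the result up in a decision table."""
    quiet = triangular(noise_db, 0, 20, 45)
    lecture_time = triangular(hour, 8, 11, 14)
    # decision table: (quiet place, lecture hour) -> phone profile
    if min(quiet, lecture_time) > 0.5:
        return "silent"          # e.g. library or classroom during class
    if quiet > 0.5:
        return "vibrate"
    return "ring"

print(classify_context(noise_db=15, hour=10))   # -> silent
```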

Keywords: context aware mobile, fuzzy logic, decision table, Bayesian probability

Procedia PDF Downloads 343
1225 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Analyzing large-scale recurrent event data currently poses many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart failure hospitalizations.
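
A minimal sketch of the combine step is shown below, assuming inverse-variance weights (a common choice; the paper's exact weighting scheme and frailty-model likelihood are not reproduced here).

```python
import numpy as np

def divide_and_conquer_mle(data, n_subsets, fit):
    """Split the data, fit each subset, and combine the subset estimators
    with inverse-variance weights."""
    subsets = np.array_split(np.random.permutation(data), n_subsets)
    estimates, weights = [], []
    for s in subsets:
        theta_hat, var_hat = fit(s)          # subset MLE and its variance
        estimates.append(theta_hat)
        weights.append(1.0 / var_hat)
    w = np.array(weights)
    return np.dot(w, estimates) / w.sum()

# toy example: estimate a mean; fit() returns (MLE, variance of the MLE)
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=100_000)
print(divide_and_conquer_mle(data, 10, lambda s: (s.mean(), s.var() / len(s))))
```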

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 141
1224 Towards a Resources Provisioning for Dynamic Workflows in the Cloud

Authors: Fairouz Fakhfakh, Hatem Hadj Kacem, Ahmed Hadj Kacem

Abstract:

Cloud computing offers a new model of service provisioning for workflow applications, thanks to its elasticity and its pay-per-use model. However, it presents various challenges that need to be addressed in order for it to be efficiently utilized. The resource provisioning problem for workflow applications has been widely studied. Nevertheless, existing works did not consider changes to workflow instances while they are being executed. This functionality has become a major requirement for dealing with unusual situations and evolution. This paper presents a first step towards resource provisioning for dynamic workflows. In fact, we propose a provisioning algorithm which minimizes the overall workflow execution cost while meeting a deadline constraint. We then extend it to support the dynamic addition of tasks. Experimental results show that our proposed heuristic achieves a significant reduction in resource cost by using a consolidation process.

Keywords: cloud computing, resources provisioning, dynamic workflow, workflow applications

Procedia PDF Downloads 261
1223 Inclusion and Changes of a Research Criterion in the Institute for Quality and Accreditation of Computing, Engineering and Technology Accreditation Model

Authors: J. Daniel Sanchez Ruiz

Abstract:

The paper explains why and how a research criterion was included within an accreditation system for undergraduate engineering programs, in spite of this not being a common practice among accreditation agencies at a global level. The paper is divided into three parts. The first presents the context and the motivations that led the Institute for Quality and Accreditation of Computing, Engineering and Technology Programs (ICACIT) to add a research criterion. The second describes the criterion adopted and the feedback received during the 2017 accreditation cycle. In the third, the author proposes changes to the accreditation criteria that respond pertinently to the results-based accreditation model and the national context. The author seeks to reconcile an outcome-based accreditation model, aligned with the standards established by the International Engineering Alliance, with the particular context of higher education in Peru.

Keywords: accreditation, engineering education, quality assurance, research

Procedia PDF Downloads 265
1222 Optimization of Topology-Aware Job Allocation on a High-Performance Computing Cluster by Neural Simulated Annealing

Authors: Zekang Lan, Yan Xu, Yingkun Huang, Dian Huang, Shengzhong Feng

Abstract:

Jobs on high-performance computing (HPC) clusters can suffer significant performance degradation due to inter-job network interference. The topology-aware job allocation problem (TJAP) decides how to dedicate nodes to specific applications so as to mitigate inter-job network interference. In this paper, we study the window-based TJAP on a fat-tree network, aiming at minimizing communication hop cost, a defined inter-job interference metric. The window-based approach to scheduling repeats periodically, taking the jobs in the queue and solving an assignment problem that maps jobs to the available nodes. Two special allocation strategies are considered, i.e., the static continuity assignment strategy (SCAS) and the dynamic continuity assignment strategy (DCAS). For the SCAS, a 0-1 integer program is developed. For the DCAS, an approach called neural simulated annealing (NSA) is proposed; it is an extension of simulated annealing (SA) that learns a repair operator and employs it in a guided heuristic search. The efficacy of NSA is demonstrated with a computational study against SA and SCIP. The results of numerical experiments indicate that both the model and the algorithm proposed in this paper are effective.
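
For orientation, the sketch below shows plain simulated annealing applied to the allocation step: pick k free nodes for one job while minimizing a pairwise hop-cost metric. The hop matrix and the move operator are illustrative assumptions; the paper's NSA additionally learns the repair operator that proposes moves.

```python
import math, random

def hop_cost(assignment, hops):
    """Total pairwise communication-hop cost of a job's node assignment."""
    return sum(hops[a][b] for i, a in enumerate(assignment)
                          for b in assignment[i + 1:])

def simulated_annealing(free_nodes, k, hops, T0=10.0, cooling=0.995, steps=5000):
    """Pick k of the free nodes for one job (assumes k < len(free_nodes)),
    minimizing hop cost with a classic SA accept/reject loop."""
    current = random.sample(free_nodes, k)
    best, T = current[:], T0
    for _ in range(steps):
        cand = current[:]
        cand[random.randrange(k)] = random.choice(
            [n for n in free_nodes if n not in cand])
        d = hop_cost(cand, hops) - hop_cost(current, hops)
        if d < 0 or random.random() < math.exp(-d / T):   # Metropolis acceptance
            current = cand
            if hop_cost(current, hops) < hop_cost(best, hops):
                best = current[:]
        T *= cooling
    return best

hops = [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]  # hypothetical hop matrix
print(simulated_annealing(free_nodes=[0, 1, 2, 3], k=2, hops=hops))
```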

Keywords: high-performance computing, job allocation, neural simulated annealing, topology-aware

Procedia PDF Downloads 83
1221 Optimization of a Bioremediation Strategy for an Urban Stream of Matanza-Riachuelo Basin

Authors: María D. Groppa, Andrea Trentini, Myriam Zawoznik, Roxana Bigi, Carlos Nadra, Patricia L. Marconi

Abstract:

In the present work, a remediation bioprocess based on a local isolate of the microalga Chlorella vulgaris immobilized in alginate beads is proposed. This process was shown to be effective for the reduction of several chemical and microbial contaminants present in the Cildáñez stream, a watercourse that is part of the Matanza-Riachuelo Basin (Buenos Aires, Argentina). The bioprocess, involving the culture of the microalga under autotrophic conditions in a stirred-tank bioreactor supplied with a marine propeller for 6 days, allowed a significant reduction of Escherichia coli and total coliform numbers (over 95%), as well as of ammoniacal nitrogen (96%), nitrates (86%), nitrites (98%), and total phosphorus (53%) contents. Pb content was also significantly diminished after the bioprocess (95%). Standardized cytotoxicity tests using Allium cepa seeds and Cildáñez water pre- and post-remediation were also performed. The germination rate and mitotic index of onion seeds imbibed in Cildáñez water subjected to the bioprocess were similar to those observed in seeds imbibed in distilled water, and significantly superior to those registered when untreated Cildáñez water was used for imbibition. Our results demonstrate the potential of this simple and cost-effective technology to remove urban-water contaminants, offering as an additional advantage the possibility of easy biomass recovery, which may become a source of alternative energy.

Keywords: bioreactor, bioremediation, Chlorella vulgaris, Matanza-Riachuelo Basin, microalgae

Procedia PDF Downloads 221
1220 Enabling UDP Multicast in Cloud IaaS: An Enterprise Use Case

Authors: Patrick J. Kerpan, Ryan C. Koop, Margaret M. Walker, Chris P. Swan

Abstract:

User Datagram Protocol (UDP) multicast is a vital part of data center networking that is being left out of major cloud computing providers' network infrastructure. Enterprise users rely on multicast, and particularly UDP multicast, to create and connect vital business operations. For example, UDP makes a variety of business functions possible, from simultaneous content media updates and High-Performance Computing (HPC) grids to video call routing for massive open online courses (MOOCs). Essentially, this technological slighting of UDP multicast has a huge effect on whether companies choose to use (or not to use) public cloud infrastructure as a service (IaaS). Allowing the ‘chatty’ UDP multicast protocol inside a cloud network could have a serious impact on the performance of the cloud as a whole, so cloud IaaS providers solve the issue by disallowing all UDP multicast. But what about enterprise use cases for multicast applications in organizations that want to move to the cloud? To re-enable multicast traffic, enterprises can build a layer 3-7 network over the top of a data center, private cloud, or public cloud. An overlay network simply creates a private, sealed network on top of the existing network. Overlays give complete control of the network back to enterprise cloud users, along with the freedom to manage their network beyond the constraints of the cloud provider’s firewall. The same logic applies for users who wish to use IPsec or BGP network protocols inside of, or connected into, an overlay network in cloud IaaS.
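
For reference, the minimal sender/receiver pair below shows the standard UDP multicast pattern that such an overlay re-enables; it runs on any LAN (or overlay) that permits multicast. The group address and port are arbitrary examples.

```python
import socket, struct

GROUP, PORT = "224.1.1.1", 5007   # any address in 224.0.0.0/4 is multicast

def sender(message: bytes):
    """Publish one datagram to every subscriber of the group."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    s.sendto(message, (GROUP, PORT))

def receiver():
    """Join the group and block until a datagram arrives."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    print(s.recvfrom(1024))
```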

Keywords: cloud computing, protocols, UDP multicast, virtualization

Procedia PDF Downloads 569
1219 Effects of Watershed Erosion on Stream Channel Formation

Authors: Tiao Chang, Ivan Caballero, Hong Zhou

Abstract:

Streams naturally carry water and sediment by maintaining channel dimensions, pattern, and profile over time, and watershed erosion is a natural process that has contributed sediment to streams over time. The formation of channel dimensions is complex. This study relates quantifiable and consistent channel dimensions at the bankfull stage to the corresponding watershed erosion estimated by the Revised Universal Soil Loss Equation (RUSLE). Twelve sites, whose drainage areas range from 7 to 100 square miles, in the Hocking River Basin of Ohio were selected for bankfull geometry determinations, including width, depth, cross-sectional area, bed slope, and drainage area. The twelve sub-watersheds were chosen to obtain a good overall representation of the Hocking River Basin. It is of interest to determine how these bankfull channel dimensions are related to the soil erosion of the corresponding sub-watersheds. The RUSLE was applied to estimate erosion in the twelve selected sub-watersheds where the bankfull geometry measurements were conducted. These quantified erosion estimates are used to investigate correlations with bankfull channel dimensions, including discharge, channel width, channel depth, cross-sectional area, and pebble distribution. It is found that drainage area, bankfull discharge, and cross-sectional area correlate strongly with watershed erosion. Furthermore, bankfull width and depth are moderately correlated with watershed erosion, while the particle size, D50, of channel bed sediment is not well correlated with watershed erosion.
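
The RUSLE estimate the study relies on is the standard multiplicative model A = R · K · LS · C · P; the sketch below computes it for one sub-watershed with hypothetical factor values.

```python
def rusle(R, K, LS, C, P):
    """RUSLE: A = R * K * LS * C * P
    A  - average annual soil loss (t/ha/yr)
    R  - rainfall-runoff erosivity factor
    K  - soil erodibility factor
    LS - slope length and steepness factor
    C  - cover-management factor
    P  - support practice factor"""
    return R * K * LS * C * P

# hypothetical factor values for one sub-watershed
print(rusle(R=170.0, K=0.28, LS=1.6, C=0.12, P=1.0))
```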

Keywords: watershed, stream, sediment, channel

Procedia PDF Downloads 266
1218 Study of Ageing in the Marine Environment of Bonded Composite Structures by Ultrasonic Guided Waves. Comparison of the Case of a Conventional Carbon-epoxy Composite and a Recyclable Resin-Based Composite

Authors: Hamza Hafidi Alaoui, Damien Leduc, Mounsif Ech Cherif El Kettani

Abstract:

This study is dedicated to the evaluation of the ageing of turbine blades in sea conditions, based on ultrasonic Non-Destructive Testing (NDT) methods, and is being developed within the framework of the European Interreg TIGER project. The Tidal Stream Industry Energiser project, known as TIGER, is the biggest ever Interreg project, driving collaboration and cost reduction through tidal turbine installations in the UK and France. The TIGER project will drive the growth of tidal stream energy to become a greater part of the energy mix, with significant benefits for coastal communities. In the bay of Paimpol-Bréhat (Brittany), different samples of composite material and bonded composite/composite structures have been immersed at the same time near a turbine. The studied samples are either conventional carbon-epoxy composite samples or composite samples based on a recyclable resin (called recyclamine). One of the objectives of the study is to compare the ageing of the two types of structure. A sample of each structure is picked up every 3 to 6 months, analyzed using ultrasonic guided waves and bulk waves, and compared to reference samples. In order to classify the damage level as a function of time spent under the sea, the measurements have been compared to a rheological model based on the Finite Element Method (FEM). Ageing of the composite material, as well as that of the adhesive, is identified. The aim is to improve the quality of the turbine blade structure in terms of longevity and reduced maintenance needs.

Keywords: non-destructive testing, ultrasound, composites, guided waves

Procedia PDF Downloads 197
1217 Effects of Large Woody Debris on the Abundance and Diversity of Freshwater Invertebrates and Vertebrates

Authors: M. J. Matulino, Carissa Ganong, Mark Mills, Jazmine Harry

Abstract:

Large woody debris (LWD), defined as wooden debris with a diameter of at least 10 cm and a length of at least 2 m, serves as a crucial resource and habitat for aquatic organisms. While research on the ecological impacts of LWD has been conducted in temperate streams, LWD's influence on tropical stream biodiversity remains understudied, making this investigation particularly valuable for future conservation efforts. The Sura River in La Selva Biological Station includes both LWD and open-channel sites. We sampled paired LWD and open-channel sites using minnow traps, Promar traps, and dip nets. Vertebrates were identified to species level, while macroinvertebrates were identified to order level. We quantified abundance, richness, and Shannon diversity at each site. We captured a total of 467 individuals, including 2 turtles, 17 fishes, 1 freshwater crab, 39 shrimp, and 408 other macroinvertebrates. Total abundance was significantly higher in LWD sites. Species richness was marginally higher in LWD sites, but the Shannon diversity index did not differ significantly between habitats. Shrimp (Macrobrachium olfersi) length was significantly greater in LWD areas. Increased food resources and microhabitat availability could contribute to the higher abundance, richness, and organismal size in LWD environments. This study fills a critical gap by investigating LWD effects in a tropical environment, providing valuable insights for conservation efforts and the preservation of aquatic biodiversity.
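
The diversity comparison rests on the standard Shannon index H' = -Σ pᵢ ln pᵢ; the sketch below computes it for two hypothetical catches (the counts are invented, not the study's data).

```python
import math
from collections import Counter

def shannon_diversity(individuals):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over taxa proportions."""
    counts = Counter(individuals)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# hypothetical catches from one LWD site and one open-channel site
lwd_site = ["Decapoda"] * 39 + ["Ephemeroptera"] * 120 + ["Odonata"] * 25
open_site = ["Decapoda"] * 10 + ["Ephemeroptera"] * 40

print(round(shannon_diversity(lwd_site), 3), round(shannon_diversity(open_site), 3))
```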

Keywords: large woody debris (LWD), aquatic organisms, ecological impacts, tropical stream biodiversity, conservation efforts

Procedia PDF Downloads 64
1216 A Review of Fractal Dimension Computing Methods Applied to Wear Particles

Authors: Manish Kumar Thakur, Subrata Kumar Ghosh

Abstract:

Various types of particles found in lubricant may be characterized by their fractal dimension. Some of the available methods are the yard-stick (structured walk) method and the box-counting method. This paper presents a review of the developments and progress in fractal dimension computing methods as applied to characterizing the surfaces of wear particles. An overview of these methods, their implementation, their advantages, and their limits is also presented. It is accepted that wear particles carry major information about the wear and friction of materials, and morphological analysis of wear particles from a lubricant is a very effective way of performing machine condition monitoring. Fractal dimension methods are used to characterize the morphology of the found particles and are very useful in analyzing the complexity of irregular shapes. The aim of this review is to bring together the fractal methods applicable to wear particles.
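
As a reference point, the sketch below implements the box-counting method on a binary outline image: count occupied boxes N(s) at several box sizes s, then estimate the dimension D from the slope of log N(s) versus log s. The toy image is illustrative.

```python
import numpy as np

def box_counting_dimension(image, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary particle outline:
    count occupied boxes N(s) at each box size s, then fit
    log N(s) = -D * log s + c, so D is the box-counting dimension."""
    counts = []
    for s in sizes:
        h, w = image.shape
        boxes = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if image[i:i + s, j:j + s].any():
                    boxes += 1
        counts.append(boxes)
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# toy binary image: a diagonal line standing in for a particle boundary
img = np.eye(64, dtype=bool)
print(box_counting_dimension(img))   # close to 1 for a line-like object
```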

Keywords: fractal dimension, morphological analysis, wear, wear particles

Procedia PDF Downloads 460
1215 Web Service Architectural Style Selection in Multi-Criteria Requirements

Authors: Ahmad Mohsin, Syda Fatima, Falak Nawaz, Aman Ullah Khan

Abstract:

Selection of an appropriate architectural style is vital to the success of a target web service under development. The nature of architecture design and selection for service-oriented computing applications is quite different from that of traditional software, and web services have complex and rigorous architectural styles to choose from. Because of this, selecting an appropriate architectural style for web service development has become a complex decision for architects. Architectural style selection is a multi-criteria decision and demands a lot of experience in service-oriented computing. Decision support systems (DSS) are good solutions for simplifying the selection of a particular architectural style. Our research suggests a new DSS-based approach for selecting architectural styles while developing a web service, catering to functional requirements (FRs) and non-functional requirements (NFRs). The proposed DSS helps architects select the right web service architectural pattern according to the domain and non-functional requirements. In this paper, a rule-based DSS has been developed using CLIPS (C Language Integrated Production System) to support decisions over multi-criteria requirements. The DSS takes architectural characteristics, domain requirements, and the software architect's preferences for NFRs as input for the different architectural styles in use today in service-oriented computing. A weighted sum model is applied to prioritize quality attributes and domain requirements, and scores are calculated over the multiple criteria to choose the final architectural style.
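
The scoring step is the classical weighted sum model; the sketch below ranks two candidate styles against prioritized criteria. The criteria, weights, and scores are hypothetical, and the rule-based CLIPS layer is not reproduced.

```python
def weighted_sum(scores, weights):
    """Weighted Sum Model: rank each architectural style by
    sum(weight_j * score_j) over the prioritized criteria."""
    return {style: sum(weights[c] * s[c] for c in weights)
            for style, s in scores.items()}

# hypothetical normalized scores (0-1) against NFR criteria
weights = {"performance": 0.4, "scalability": 0.35, "security": 0.25}
scores = {
    "REST":    {"performance": 0.8, "scalability": 0.9, "security": 0.6},
    "SOAP/WS": {"performance": 0.5, "scalability": 0.6, "security": 0.9},
}
ranking = weighted_sum(scores, weights)
print(max(ranking, key=ranking.get), ranking)
```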

Keywords: software architecture, web-service, rule-based, DSS, multi-criteria requirements, quality attributes

Procedia PDF Downloads 338
1214 Artificial Intelligent-Based Approaches for Task Offloading, Resource Allocation and Service Placement of Internet of Things Applications: State of the Art

Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib

Abstract:

In order to support the continued growth and critical latency requirements of IoT applications, and to overcome various obstacles of traditional data centers, mobile edge computing (MEC) has emerged as a promising solution that extends cloud data processing and decision-making to edge devices. By adopting a MEC structure, IoT applications can be executed locally, on an edge server, on different fog nodes, or in distant cloud data centers. However, we are often faced with wanting to optimize conflicting criteria, such as minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage, and bandwidth) of mobile edge devices while keeping performance high (reducing response time, increasing throughput and service availability) at the same time. Achieving one goal may affect the other, making task offloading (TO), resource allocation (RA), and service placement (SP) complex processes. Studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. The paper provides a survey of recent multi-objective optimization (MOO) approaches to TO, SP, and RA in edge computing environments, particularly artificial intelligence (AI) ones, to satisfy the various objectives, constraints, and dynamic conditions related to IoT applications.

Keywords: mobile edge computing, multi-objective optimization, artificial intelligence approaches, task offloading, resource allocation, service placement

Procedia PDF Downloads 88
1213 Municipal Solid Waste (MSW) Composition and Generation in Nablus City, Palestine

Authors: Issam A. Al-Khatib

Abstract:

In order to achieve a significant reduction in the amount of waste flowing into landfills, it is important to first understand the composition of the municipal solid waste generated. Hence, a detailed analysis of municipal solid waste composition was conducted in Nablus city. The aim is to provide data on the potentially recyclable fractions in the actual waste stream, with a focus on the plastic fraction. Waste-sorting campaigns were therefore conducted on mixed waste containers from five districts in Nablus city. The districts vary in terms of infrastructure and average income. The target is to obtain representative data about the potential quantity and quality of household plastic waste. The study measured the composition of municipal solid waste collected and transported by the Nablus municipality. The analysis was done by categorizing the samples into eight primary fractions: organic and food waste, paper and cardboard, glass, metals, textiles, plastic, a fine fraction (<10 mm), and others. The results reveal that the MSW stream in Nablus city has a significant bio- and organic waste fraction (about 68% of the total MSW). The second largest fraction is paper and cardboard (13.6%), followed by plastics (10.1%), textiles (3.2%), glass (1.9%), metals (1.8%), the fine fraction (0.5%), and other waste (0.3%). After this complete and detailed characterization of the MSW collected in Nablus, and taking into account the content of biodegradable organic matter, composting could be a solution for the city: the surrounding areas of Nablus have agricultural activities and could be a natural outlet for the compost product. Different waste management options could also be practiced in the future in addition to composting, such as energy recovery and recycling, which would make it possible to reduce the substantial amounts disposed of at landfills.

Keywords: developing countries, composition, management, recyclable, waste

Procedia PDF Downloads 69
1212 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises. These enterprises generate massive amounts of off-line and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage more than a few months of transactional data history. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massive, complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated per subscriber. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a subscriber base of 50+ million, with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations belonging to a call (transformations), and aggregation of all the call records of a subscriber.

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 152
1211 Parallel Random Number Generation for the Modern Supercomputer Architectures

Authors: Roman Snytsar

Abstract:

Pseudo-random numbers are often used in scientific computing, for example in Monte Carlo simulations or quantum-inspired optimization. Requirements for a parallel random number generator running in the modern multi-core vector environment are more stringent than those for sequential random number generators. As well as passing the usual quality tests, the output of the parallel random number generator must be verifiable and reproducible throughout the concurrent execution. We propose a family of vectorized Permuted Congruential Generators. Implementations are available for multiple modern vector computer architectures. Besides demonstrating good single-core performance, the generators scale easily across many processor cores and multiple distributed nodes. We provide performance and parallel speedup analysis and comparisons between the implementations.
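
A minimal, NumPy-vectorized sketch of the PCG32 (XSH-RR) algorithm that such a family builds on is shown below: each lane carries its own state and stream increment, so the sequence each lane produces is reproducible regardless of how lanes are scheduled across cores. This is the textbook PCG32 recurrence, not the authors' optimized implementation.

```python
import numpy as np

MULT = np.uint64(6364136223846793005)   # PCG32 LCG multiplier

def pcg32_streams(states, incs, n):
    """Vectorized PCG32 (XSH-RR): every lane advances its own stream."""
    states = states.astype(np.uint64).copy()
    incs = incs.astype(np.uint64) | np.uint64(1)           # increments must be odd
    out = np.empty((n, states.size), dtype=np.uint32)
    for i in range(n):
        old = states.copy()
        states = states * MULT + incs                       # LCG state transition
        xorshifted = (((old >> np.uint64(18)) ^ old) >> np.uint64(27)).astype(np.uint32)
        rot = (old >> np.uint64(59)).astype(np.uint32)      # rotation in [0, 31]
        out[i] = (xorshifted >> rot) | \
                 (xorshifted << ((np.uint32(32) - rot) & np.uint32(31)))
    return out

# four independent lanes, eight draws each
print(pcg32_streams(np.arange(4), np.arange(4) * 2 + 1, 8))
```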

Keywords: pseudo-random numbers, quantum optimization, SIMD, parallel computing

Procedia PDF Downloads 95
1210 A Machine Learning Based Framework for Education Levelling in Multicultural Countries: UAE as a Case Study

Authors: Shatha Ghareeb, Rawaa Al-Jumeily, Thar Baker

Abstract:

In Abu Dhabi, there are many different education curriculums; the private schools and quality assurance sector supervises many private schools serving many nationalities. As there are many different education curriculums in Abu Dhabi to meet expats' needs, there are different requirements for registration and success, as well as different age groups for starting education in each curriculum. In fact, each curriculum has a different number of years, assessment techniques, reassessment rules, and exam boards. Currently, students who transfer between curriculums are not placed in the right year group, because the start and end dates of the academic year and the date-of-birth cutoffs for each year group differ between curriculums. As a result, we find students who are either too young or too old for their year group, which creates gaps in their learning and performance. In addition, there is no way of storing student data throughout the academic journey so that schools can track the student learning process. In this paper, we propose a computational framework applicable in multicultural countries, such as the UAE, in which multiple education systems are implemented. The ultimate goal is to use cloud and fog computing technology, integrated with artificial intelligence techniques of machine learning, to aid in a smooth transition when assigning students to their year groups, and to provide levelling and differentiation information for students who relocate from one education curriculum to another, whilst also having the ability to store and access student data from anywhere throughout the academic journey.

Keywords: admissions, algorithms, cloud computing, differentiation, fog computing, levelling, machine learning

Procedia PDF Downloads 114
1209 The Results of Longitudinal Water Quality Monitoring of the Brandywine River, Chester County, Pennsylvania by High School Students

Authors: Dina L. DiSantis

Abstract:

Strengthening a sense of responsibility while relating global sustainability concepts, such as water quality and pollution, to a local water system can be achieved by teaching students to conduct and interpret water quality monitoring tests. When students conduct their own research, they become better stewards of the environment. Providing outdoor learning and place-based opportunities for students helps connect them to the natural world. By conducting stream studies and collecting data, students are able to better understand that the natural environment is a place where everything is connected. Students have been collecting physical, chemical, and biological data along the West and East Branches of the Brandywine River in Pennsylvania for over ten years. The stream studies are part of the advanced placement environmental science and aquatic science courses that are offered as electives to juniors and seniors at the Downingtown High School West Campus in Downingtown, Pennsylvania. Physical data collected include temperature, turbidity, width, depth, velocity, and volume of flow or discharge. The chemical tests conducted are dissolved oxygen, carbon dioxide, pH, nitrates, alkalinity, and phosphates. Macroinvertebrates are collected with a kick net, identified, and then released. Students collect the data from several locations while traveling by canoe. In the classroom, students prepare a water quality data analysis and interpretation report based on their collected data. A summary of the results from the students' longitudinal water quality data collection, as well as the strengths and weaknesses of student data collection, will be presented.

Keywords: place-based, student data collection, sustainability, water quality monitoring

Procedia PDF Downloads 132
1208 Analysis of Solid Waste Management Practices and the Implications for Human Health and the Environment: A Case Study of Kayamandi Informal Settlement

Authors: Peter Iyobosa Asemota

Abstract:

This study on solid waste management practices addressed aspects of the environmental and health impacts resulting from poor management of solid waste. The study was occasioned by the observed rate and volume of illegal and indiscriminate dumping of solid waste materials, especially in informal settlements. The main focus was to establish the impact of waste management practices on human health and the environment. The study therefore presents a critical analysis of the state of solid waste management in the study area and the implications for human health and the environment. The study was carried out in Kayamandi informal settlement within Stellenbosch municipality. The sustainable management of solid waste is very important in order to minimize the environmental and public health risks associated with improper solid waste management. There is no denying that the problems of waste management will become critical as time goes on because of improper and inefficient waste management practices; towns and cities exhibit the burdens of waste management, a characteristic feature of most African cities. The study critically assesses the implementation of waste management practices by the residents of the informal settlement; identifies the factors affecting the municipality's operation of the solid waste management system; and identifies factors militating against the implementation of waste management policies and legislation. Furthermore, a waste assessment study was carried out to assess the generation and composition of the waste stream and to determine the attitudes and behavior of the residents with regard to waste management practices. Findings from the study revealed that Kayamandi is not different from other informal settlements with regard to waste management. People are of the opinion that solid waste management is the sole responsibility of municipal authorities, and as such, the government should be responsible for bearing the cost of solid waste management.

Keywords: environment, waste, waste composition, waste stream, policy, waste categories, sanitary landfill, waste collection, integrated solid waste management

Procedia PDF Downloads 668
1207 A Knowledge-As-A-Service Support Framework for Ambient Learning in Kenya

Authors: Lucy W. Mburu, Richard Karanja, Simon N. Mwendia

Abstract:

Over recent years, learners have experienced a constant need to access on-demand knowledge, fully aligned with the paradigm of cloud computing. Motivated by the global sustainable development goal to ensure inclusive and equitable learning opportunities, this research has developed a framework hinged on the knowledge-as-a-service architecture that utilizes knowledge from ambient learning systems. Through statistical analysis and decision tree modeling, the study discovers influential variables for ambient learning among university students. The main aim is to generate a platform for disseminating and exploiting the available knowledge to aid the learning process and, thus, to improve educational support on the ambient learning system. The research further explores how collaborative effort can be used to form a knowledge network that allows access to heterogeneous sources of knowledge, which benefits knowledge consumers, such as the developers of ambient learning systems.

Keywords: actionable knowledge, ambient learning, cloud computing, decision trees, knowledge as a service

Procedia PDF Downloads 136
1206 Computing Customer Lifetime Value in E-Commerce Websites with Regard to Returned Orders and Payment Method

Authors: Morteza Giti

Abstract:

As online shopping becomes increasingly popular, computing customer lifetime value in order to know customers better is also gaining importance. Two distinct factors that can affect the value of a customer in the context of online shopping are the number of returned orders and the payment method. Returned orders are those which have been shipped but not collected by the customer and are returned to the store. Payment method refers to the way customers choose to pay for an order, of which there are usually two: pre-pay and cash-on-delivery. In this paper, a novel model called RFMSP is presented to calculate customer lifetime value, taking these two parameters into account. The RFMSP model is based on the common RFM model with two extra parameters: S represents the order status and P indicates the payment method. As a case study for this model, the purchase history of customers in an online shop is used to compute customer lifetime value over a period of twenty months.
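
To show the shape of such a model, the sketch below builds a hypothetical RFMSP table, normalizes the five parameters, and segments customers with k-means. The columns and values are invented, and the paper's AHP-based weighting step is not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans

# hypothetical per-customer table: recency (days), frequency, monetary,
# S = share of orders returned (status), P = 1 for pre-pay, 0 for cash-on-delivery
customers = np.array([
    # R,   F,  M,    S,   P
    [ 10, 12, 950, 0.00, 1],
    [120,  2,  80, 0.50, 0],
    [ 45,  6, 400, 0.10, 1],
    [200,  1,  30, 1.00, 0],
], dtype=float)

# normalize each RFMSP column to [0, 1] so no single parameter dominates
mins, maxs = customers.min(axis=0), customers.max(axis=0)
X = (customers - mins) / (maxs - mins)
X[:, 0] = 1 - X[:, 0]   # low recency and low return share are *good*
X[:, 3] = 1 - X[:, 3]

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)            # customer segments, to be interpreted by lifetime value
```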

Keywords: RFMSP model, AHP, customer lifetime value, k-means clustering, e-commerce

Procedia PDF Downloads 298
1205 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates the R and Hadoop environments, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages available on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing that increases the number of map tasks as the size of the data increases. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that fully-distributed mode was faster than pseudo-distributed mode, and that fully-distributed mode became faster as the number of data nodes increased.

Keywords: big data, Hadoop, Parallel regression analysis, R, Rhipe

Procedia PDF Downloads 483
1204 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty

Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih

Abstract:

In order to cope with uncertainty and the resulting inability to meet customers' requests, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid an increase in holding cost due to an excessive SSL, or shortage cost due to too low an SSL. This paper uses soft-computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, using dynamic fuzzy logic to obtain the best SSL as output. In this model, demand stability, raw material availability, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best safety stock level. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study in the dairy industry, with a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current safety stock level, which is calculated using the traditional approach. The importance of the proposed model is demonstrated by the significant reduction in safety stock level.
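
As a sketch of the inference idea, the code below fuzzifies the three inputs with triangular membership functions and combines a small rule base into a crisp SSL via a weighted average (a simplified Sugeno-style inference). The membership breakpoints, rules, and SSL values (in days of demand) are invented, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership degree of x in a fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def safety_stock_level(demand_stability, material_availability, on_hand):
    """Three fuzzified inputs (each in [0, 1]) -> crisp SSL in days of demand."""
    unstable = tri(demand_stability, 0.0, 0.0, 0.5)
    stable = tri(demand_stability, 0.3, 1.0, 1.7)
    scarce = tri(material_availability, 0.0, 0.0, 0.5)
    low_stock = tri(on_hand, 0.0, 0.0, 0.5)

    # rule base: (activation, recommended SSL in days of demand)
    rules = [
        (min(unstable, scarce), 12.0),   # risky supply and demand -> high SSL
        (min(unstable, low_stock), 9.0),
        (stable, 3.0),                   # calm environment -> lean SSL
    ]
    num = sum(w * ssl for w, ssl in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(safety_stock_level(0.2, 0.3, 0.4))   # -> 11.0 days
```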

Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization

Procedia PDF Downloads 104
1203 Quantitative Evaluation of Mitral Regurgitation by Using Color Doppler Ultrasound

Authors: Shang-Yu Chiang, Yu-Shan Tsai, Shih-Hsien Sung, Chung-Ming Lo

Abstract:

Mitral regurgitation (MR) is a heart disorder in which the mitral valve does not close properly when the heart pumps out blood; it is the most common form of valvular heart disease in the adult population. The diagnostic echocardiographic finding of MR is straightforward due to well-known clinical evidence. In determining MR severity, quantification of sonographic findings is useful for clinical decision making. Clinically, the vena contracta is a standard for MR evaluation: it is the point in a blood stream where the diameter of the stream is least and the velocity is maximal. The quantification of the vena contracta, i.e., the vena contracta width (VCW) at the mitral valve, can serve as a numeric measurement for severity assessment. However, manually delineating the VCW may not be accurate enough, as the result highly depends on operator experience. Therefore, this study proposed an automatic method to quantify the VCW to evaluate MR severity. In color Doppler ultrasound, blood flowing towards the probe appears as a red or yellow area, whose brightness represents the flow rate. In the experiment, colors were first transformed into HSV (hue, saturation, and value) to align closely with the way human vision perceives red and yellow. Using an ellipse to fit the high-flow-rate area in the left atrium, the angle between the mitral valve and the ultrasound probe was calculated to obtain the vertical shortest diameter as the VCW. Taking the manual measurement as the standard, the method achieved differences of only 0.02 (0.38 vs. 0.36) to 0.03 (0.42 vs. 0.45) cm. The results showed that the proposed automatic VCW extraction can be efficient and accurate for clinical use. The process also has the potential to reduce intra- or inter-observer variability in measuring subtle distances.
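
A minimal sketch of the segmentation-and-fit stage is given below, assuming an OpenCV pipeline: threshold the red/yellow Doppler overlay in HSV, fit an ellipse to the largest high-flow region, and take the short axis as the VCW. The HSV thresholds and pixel calibration are hypothetical, and the paper's probe-angle correction is not reproduced.

```python
import cv2
import numpy as np

def vena_contracta_width(frame_bgr, px_per_cm):
    """Segment the high-velocity jet in a color-Doppler frame via HSV
    thresholds, fit an ellipse, and return its short axis in cm."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # hypothetical thresholds for the red/yellow high-flow-rate overlay
    mask = cv2.inRange(hsv, (0, 120, 120), (35, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # assumes a jet is present; fitEllipse needs a contour of >= 5 points
    jet = max(contours, key=cv2.contourArea)
    (_, _), (major, minor), _ = cv2.fitEllipse(jet)   # ellipse axes in pixels
    return min(major, minor) / px_per_cm              # VCW = shortest diameter
```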

Keywords: mitral regurgitation, vena contracta, color doppler, image processing

Procedia PDF Downloads 354
1202 Distributed Actor System for Traffic Simulation

Authors: Han Wang, Zhuoxian Dai, Zhe Zhu, Hui Zhang, Zhenyu Zeng

Abstract:

In traditional microscopic traffic simulation, various approaches have been suggested to implement single-agent behaviors such as lane changing and the intelligent driver model. However, when it comes to very large metropolitan areas, microscopic traffic simulation requires more resources and becomes time-consuming, so macroscopic traffic simulation aggregates trends of interest rather than individual vehicle traces. In this paper, we describe the architecture and implementation of an actor system for traffic simulation, which exploits the distributed architecture of modern-day cloud computing. The results demonstrate that our architecture achieves high performance and outperforms all the other traditional microscopic software in all tasks. To the best of our knowledge, this is the first system that enables single-agent behavior in macroscopic traffic simulation. We thus believe it contributes a new type of system for traffic simulation, one that can provide individual vehicle behaviors within macroscopic traffic simulation.

Keywords: actor system, cloud computing, distributed system, traffic simulation

Procedia PDF Downloads 173
1201 Performance Evaluation of Parallel Surface Modeling and Generation on Actual and Virtual Multicore Systems

Authors: Nyeng P. Gyang

Abstract:

Even though past, current, and future trends suggest that multicore and cloud computing systems are increasingly prevalent, this class of parallel systems is nonetheless underutilized in general, and barely used for research on employing parallel Delaunay triangulation for parallel surface modeling and generation in particular. The performances of actual/physical and virtual/cloud multicore systems at executing various algorithms, which implement various parallelization strategies of the incremental insertion technique of the Delaunay triangulation algorithm, were evaluated. T-tests were run on the data collected in order to determine whether differences in various performance metrics (including execution time, speedup, and efficiency) were statistically significant. Results show that the actual machine is approximately twice as fast as the virtual machine at executing the same programs for the various parallelization strategies. The results, which furnish the scalability behaviors of the various parallelization strategies, also show that some of the differences between the performances of these systems, during different runs of the algorithms, were statistically significant. A few pseudo-superlinear speedup results, computed from the raw data collected, are not true superlinear speedup values. These pseudo-superlinear speedup values, which arise from one way of computing speedups, disappear and give way to asymmetric speedups, which are the accurate kind of speedups observed in the experiments performed.
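
For reference, the standard definitions behind such an evaluation are speedup S(p) = T1/Tp and efficiency E(p) = S(p)/p, with a two-sample t-test for significance; the sketch below applies them to hypothetical timings (not the paper's measurements).

```python
from scipy.stats import ttest_ind

def speedup(t_serial, t_parallel):
    """S(p) = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, p):
    """E(p) = S(p) / p."""
    return speedup(t_serial, t_parallel) / p

# hypothetical wall-clock times (seconds) for 1..8 cores on one machine
times = {1: 96.0, 2: 50.1, 4: 27.3, 8: 16.8}
for p, t in times.items():
    print(p, round(speedup(times[1], t), 2), round(efficiency(times[1], t, p), 2))

# hypothetical repeated-run times: actual vs. virtual machine at p = 4
actual, virtual = [27.3, 27.9, 26.8], [55.0, 54.1, 56.2]
print(ttest_ind(actual, virtual))   # is the difference statistically significant?
```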

Keywords: cloud computing systems, multicore systems, parallel Delaunay triangulation, parallel surface modeling and generation

Procedia PDF Downloads 182
1200 Rainfall and Flood Forecast Models for Better Flood Relief Plan of the Mae Sot Municipality

Authors: S. Chuenchooklin, S. Taweepong, U. Pangnakorn

Abstract:

This research was conducted in the Mae Sot Watershed, which is located in the Moei River Basin, part of the Upper Salween River Basin, in Tak Province, Thailand. The Mae Sot Municipality is the largest urbanized area in Tak Province and is situated in the midstream of the Mae Sot Watershed. It usually faces flash flood problems after heavy rain; poor flood management has been reported since the economy bloomed rapidly in recent years. Its catchment can be classified as an ungauged basin, with a lack of rainfall data and no reported stream gaging stations. It was hit by its most severe flood event in 2013, the worst case studied for all the communities in this municipality. Moreover, other problems faced in this watershed include a shortage of water supply for domestic consumption and agricultural use, deterioration of water quality, and landslides. The research aimed to increase capability building and strengthen the participation of local community leaders and related agencies in better water management in the urban area, starting with data collection and the illustration of an appropriate application of a short-period rainfall forecasting model, with the aim of a better flood relief plan and management through the hydrologic modeling system and river analysis system programs. The authors applied global rainfall data via the Integrated Data Viewer (IDV) program from Unidata, with the aim of forecasting rainfall 7-10 days in advance during the rainy season, instead of relying on real-time records. The IDV product, which can present forecast rainfall at a time step of 3-6 hours, was introduced to the communities. The result can be used as input to either the hydrologic modeling system (HEC-HMS) or the soil and water assessment tool (SWAT) to synthesize flood hydrographs for flood forecasting. The authors applied the river analysis system (HEC-RAS) to present flood flow behaviors in the reach of the Mae Sot stream through the downtown of Mae Sot City, as flood extents and water surface levels at every cross-sectional profile of the stream. Both the HMS and RAS models were tested in 2013 with observed rainfall and inflow-outflow data from the Mae Sot Dam. The HMS results fit the observed data at the dam and were applied as the upstream boundary discharge to RAS in order to simulate flood extents; tested in the field, the results were found satisfactory. The IDV rainfall forecast data were compared to observed data and found to be fair. Nevertheless, IDV is an appropriate tool for ungauged catchments, to be used together with flood hydrograph and river analysis models for efficient future flood relief planning and management.

Keywords: global rainfall, flood forecast, hydrologic modeling system, river analysis system

Procedia PDF Downloads 334