Search results for: high dimensional data
38869 Promoting Couple HIV Testing among Migrants for HIV Prevention: Learnings from Integrated Counselling and Testing Centre (ICTC) in Odisha, India
Authors: Sunil Mekale, Debasish Chowdhury, Sanchita Patnaik, Amitav Das, Ashok Agarwal
Abstract:
Background: Odisha is a low HIV prevalence state in India (ANC-HIV positivity of 0.42% as per HIV sentinel surveillance 2010-2011); however, it is an important source-migration state, with 3.2% of male migrants reporting to be PLHIV. The USAID-Public Health Foundation of India PIPPSE project is piloting a source-destination corridor programme between Odisha and Gujarat. In Odisha, the focus has been on developing a comprehensive strategy to reach out-migrants and their spouses in their place of origin, based on their availability. The project has made concerted attempts to identify vulnerable districts with high out-migration and high positivity rates. Description: 48 out of 97 ICTCs were selected from the nine districts with the highest out-migration through multistage sampling. A retrospective descriptive analysis of HIV-positive male migrants and their spouses over two years (April 2013-March 2015) was conducted; a total of 3,645 HIV-positive records were analysed. Findings: Male migrants (23.3%) and spouses of male migrants (11%) together accounted for 34.2% of those detected HIV positive in the ICTCs, while making up almost 50% of total ICTC attendees. More than 70% of the HIV-positive male migrants and their spouses were less than 45 years old. Conclusions: A couple HIV testing approach may be considered for male migrants and their spouses. ICTC data analysis could help identify locations with high HIV positivity among male migrants and their spouses.
Keywords: HIV testing, migrants, spouse of migrants, Integrated Counselling and Testing Centre (ICTC)
Procedia PDF Downloads 379
38868 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles
Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang
Abstract:
With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many areas. Automatic lane-line extraction and modeling are the most essential steps in generating a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, a multi-region Otsu thresholding method is applied, which selects the intensity threshold of the laser data that maximizes the variance between background and road markings. The extracted road-marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed system has been tested under urban and expressway conditions in Hefei, China. The experimental results on these datasets show that our method achieves excellent extraction and clustering performance, and the fitted lines reach a high positional accuracy, with an error of less than 10 cm.
Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering
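The multi-region variant is not specified in the abstract, but the core Otsu step it builds on fits in a few lines; everything below (bin count, the synthetic intensity mixture) is illustrative, not from the paper:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold that maximizes between-class variance (Otsu)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()          # probability mass per bin
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                            # weight of the "background" class
    w1 = 1.0 - w0                                # weight of the "marking" class
    mu = np.cumsum(p * centers)                  # cumulative class mean
    mu_total = mu[-1]
    valid = (w0 > 0) & (w1 > 0)
    # between-class variance: w0 * w1 * (mu0 - mu1)^2
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * w0 - mu) ** 2 / (w0 * w1)
    sigma_b[~valid] = 0.0
    return centers[np.argmax(sigma_b)]

# Toy bimodal intensities: dim road surface vs. bright lane markings
rng = np.random.default_rng(0)
intensity = np.concatenate([rng.normal(30, 5, 5000), rng.normal(200, 10, 500)])
t = otsu_threshold(intensity)
print(t)  # lands between the two intensity modes
```

The multi-region idea would then apply this routine per tile of the raster rather than once globally, so local lighting and surface differences get their own thresholds.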
Procedia PDF Downloads 128
38867 A Study on Prediction Model for Thermally Grown Oxide Layer in Thermal Barrier Coating
Authors: Yongseok Kim, Jeong-Min Lee, Hyunwoo Song, Junghan Yun, Jungin Byun, Jae-Mean Koo, Chang-Sung Seok
Abstract:
Thermal barrier coating (TBC) is applied to gas turbine components to protect them from extremely high temperatures. Since the metallic substrate cannot endure the severe conditions inside gas turbines, delamination of the TBC can cause failure of the system. Thus, the delamination life of the TBC is one of the most important issues in designing components that operate at high temperature. Thermal stress caused by the thermally grown oxide (TGO) layer is known as one of the major failure mechanisms of TBC. This stress mainly occurs at the interface between the TGO layer and the ceramic top coat, and it is strongly influenced by the thickness and shape of the TGO layer. In this study, isothermal oxidation tests were conducted on coin-type TBC specimens prepared by the air plasma spray (APS) method. After isothermal oxidation at various temperature and time conditions, the thickness and shape (rumpling) of the TGO were investigated, and the test data were processed by numerical analysis. Finally, the test data were arranged into a mathematical prediction model with two variables (temperature and exposure time) that can predict the thickness and rumpling shape of the TGO.
Keywords: thermal barrier coating, thermally grown oxide, thermal stress, isothermal oxidation, numerical analysis
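The abstract does not give the fitted model form. Oxide growth of this kind is commonly described by power-law kinetics with an Arrhenius temperature term, so the sketch below uses that common two-variable form; the constants A, Q, and n are made up for illustration, not the paper's fitted values:

```python
import numpy as np

# Hypothetical parameters (illustrative only; not from the paper)
A = 2.0e3      # pre-exponential factor, um/h^n
Q = 120e3      # activation energy, J/mol
R = 8.314      # gas constant, J/(mol K)
n = 0.5        # parabolic growth exponent

def tgo_thickness(T_kelvin, t_hours):
    """Power-law oxidation model: h = A * exp(-Q / (R T)) * t^n."""
    return A * np.exp(-Q / (R * T_kelvin)) * t_hours ** n

h1 = tgo_thickness(1373.0, 100.0)   # 1100 C, 100 h exposure
h2 = tgo_thickness(1373.0, 400.0)   # same temperature, 4x exposure
print(h1, h2)  # parabolic law: 4x time gives 2x thickness
```

Fitting A, Q, and n against measured thicknesses at several temperature/time conditions would reproduce the kind of prediction model the abstract describes.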
Procedia PDF Downloads 342
38866 Predictive Modelling of Aircraft Component Replacement Using Imbalanced Learning and Ensemble Method
Authors: Dangut Maren David, Skaf Zakwan
Abstract:
Adequate monitoring of vehicle components to obtain high uptime is the goal of predictive maintenance; the major challenge faced by businesses in industry is the significant cost associated with delays in service delivery due to system downtime. Most of these businesses are interested in predicting such problems and proactively preventing them before they occur, which is the core advantage of Prognostic Health Management (PHM) applications. The recent emergence of Industry 4.0, or the industrial internet of things (IIoT), has led to the need to monitor system activities and enhance system-to-system or component-to-component interactions; this has resulted in the generation of large volumes of data known as big data. Analysis of big data is increasingly important; however, due to the complexity inherent in such datasets, notably imbalanced classes, it becomes extremely difficult to build accurate, high-precision models. Data-driven predictive modeling for condition-based maintenance (CBM) has recently drawn growing research interest from both academia and industry. The large volumes of data generated by industrial processes inherently come with different degrees of complexity, which pose a challenge for analytics. In particular, class imbalance exists pervasively in industrial datasets and can degrade the performance of learning algorithms, yielding poor classifier accuracy. Misclassification of faults can result in unplanned breakdowns, leading to economic loss.
In this paper, an advanced approach for handling the imbalanced classification problem is proposed, and a prognostic model for predicting aircraft component replacement is developed by exploiting historical aircraft data. The approach is based on a hybrid ensemble method that improves the prediction of the minority class during learning; we also investigate the impact of our approach on the multiclass imbalance problem. We validate its feasibility and effectiveness using real-world aircraft operation and maintenance datasets spanning over 7 years. Our approach shows better performance than similar approaches, and the results also demonstrate its strength in handling multiclass imbalanced datasets compared to other baseline classifiers.
Keywords: prognostics, data-driven, imbalance classification, deep learning
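The paper's exact hybrid method is not given in the abstract, but the general ensemble idea for imbalance, train each member on the full minority class plus an equal-size majority subsample and vote, can be sketched with a toy dataset and a trivial base learner (both stand-ins, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic imbalanced data: 950 "healthy" records vs 50 "replace component"
X_maj = rng.normal(0.0, 1.0, size=(950, 4))
X_min = rng.normal(2.5, 1.0, size=(50, 4))
X = np.vstack([X_maj, X_min])
y = np.concatenate([np.zeros(950), np.ones(50)])

def nearest_centroid_fit(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(model, X):
    classes = sorted(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes], axis=1)
    return np.array(classes)[np.argmin(d, axis=1)]

def balanced_ensemble(X, y, n_estimators=15):
    """Each member sees all minority rows plus an equal-size random
    majority subsample; prediction is a majority vote over members."""
    min_idx = np.flatnonzero(y == 1)
    maj_idx = np.flatnonzero(y == 0)
    members = []
    for _ in range(n_estimators):
        sub = rng.choice(maj_idx, size=min_idx.size, replace=False)
        idx = np.concatenate([min_idx, sub])
        members.append(nearest_centroid_fit(X[idx], y[idx]))
    def predict(Xq):
        votes = np.stack([nearest_centroid_predict(m, Xq) for m in members])
        return (votes.mean(axis=0) > 0.5).astype(float)
    return predict

predict = balanced_ensemble(X, y)
recall = (predict(X_min) == 1).mean()
print(recall)  # minority-class recall stays high despite the 19:1 imbalance
```

A naive learner trained on the raw 19:1 data would be biased toward the majority class; the per-member undersampling is what restores sensitivity to the rare "replace" class.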
Procedia PDF Downloads 174
38865 Single Layer Carbon Nanotubes Array as an Efficient Membrane for Desalination: A Molecular Dynamics Study
Authors: Elisa Y. M. Ang, Teng Yong Ng, Jingjie Yeo, Rongming Lin, Zishun Liu, K. R. Geethalakshmi
Abstract:
By stacking carbon nanotubes (CNT) one on top of another, single-layer CNT arrays can perform water-salt separation with ultra-high permeability and selectivity. Such an outer-wall CNT slit membrane is named the transverse flow CNT membrane. By adjusting the slit size between neighboring CNTs, the membrane can be configured to sieve out different solutes, right down to the separation of monovalent salt ions from water. Molecular dynamics (MD) simulation results show that the permeability of the transverse flow CNT membrane is more than two times that of conventional axial-flow CNT membranes, and orders of magnitude higher than that of current reverse osmosis membranes. In addition, MD simulations with different CNT sizes showed that the variance in desalination performance with CNT size is small. This insensitivity of the transverse flow CNT membrane's performance to CNT size is a distinct advantage over axial-flow CNT membrane designs. Not only does the membrane operate well under constant-pressure desalination, but MD simulations further indicate that oscillatory operation can enhance its desalination performance further, making it suitable for operations such as electrodialysis reversal. While there are still challenges to overcome, particularly in the physical fabrication of such a membrane, it is hoped that this versatile membrane design can bring the idea of using low-dimensional structures for desalination closer to reality.
Keywords: carbon nanotubes, membrane desalination, transverse flow carbon nanotube membrane, molecular dynamics
Procedia PDF Downloads 196
38864 Exploring Career Guidance Program for Students with Special Needs
Authors: Rahayu Azkiya
Abstract:
Career guidance is an integral part of education that aims to help students understand their interests, talents, and potential and provide direction in choosing an appropriate career path. With roughly 17 million people with disabilities in 2022, the employment of this group has become a focal point, and career guidance is crucial for people with special needs. Therefore, this study explores how a career guidance program is implemented and what challenges teachers face. The study employs a qualitative case study in a senior high school for special needs (SMLB) in Depok, Indonesia, with the data analyzed through thematic analysis. Data were obtained through interviews with two teachers who focus on physically impaired and deaf students. The results show that (1) the school has implemented career guidance well: students are assessed in their first year to identify their talents and interests, and in the second and third years they are trained to master their abilities; (2) teachers still face many challenges in implementing career guidance programs, such as a shortage of human resources for both students and teachers, high curriculum demands, and basic facilities that hinder student progress. The research therefore shows that every child is unique, so schools must meet the standards of student needs and re-evaluate the various challenges that teachers and students still face. This research is expected to serve as analysis material for the government's policy towards special needs schools in Indonesia.
Keywords: Students with Special Needs, Career Guidance Program, Implementation, Challenges
Procedia PDF Downloads 51
38863 Disruptive Innovation in Low-Income Countries: The Role of the Sharing Economy in Shaping the People Transportation Market in Nigeria
Authors: D. Tappi
Abstract:
In the past decades, the idea of innovation has moved from being considered the result of development to being seen as its means. Innovation and its diffusion are indeed keys to the development and economic catch-up of a country. However, the process of diffusing existing innovations in low-income countries has proven to depend on often inadequate infrastructure and institutions. This paper examines the role of disruptive innovation in bridging the technology gap between high- and low-income countries, overcoming gaps in infrastructure and institutions. In particular, the focus of this paper is the role of disruptive innovation in people transportation in Nigeria. Uber, Taxify, and Smartcab are covering a small and interesting market that was underserved, between the high-end private-driver market, personal car ownership, and the low-priced traditional cabs and Keke (tricycles). Indeed, the small Nigerian middle class and the international community have found in the sharing people-transportation market a safe, reasonably priced means of transportation in Nigeria's big cities. This study uses mainly qualitative data collection, in the form of semi-structured interviews with major players and users, and quantitative data analysis, in the form of a survey among users, to assess the role of these new transportation modes in shaping the market and even creating a new niche. The paper shows how the new sharing economy in people transportation is creating new solutions to old problems as well as new challenges for both existing market players and institutions. By doing so, it shows how disruptive innovations applied to low-income countries can not only overcome the lack of infrastructure but could also help bridge the technology gap between those and high-income countries. This contribution shows that it is precisely because the market presents these obstacles that disruptive innovations can succeed in countries such as Nigeria.
Keywords: development, disruptive innovation, sharing economy, technology gap
Procedia PDF Downloads 120
38862 Reduction of Impulsive Noise in OFDM System using Adaptive Algorithm
Authors: Alina Mirza, Sumrin M. Kabir, Shahzad A. Sheikh
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM), with its high data rate, high spectral efficiency, and ability to mitigate the effects of multipath, is highly suitable for wireless applications. Impulsive noise distorts OFDM transmission, and therefore methods must be investigated to suppress it. In this paper, a State Space Recursive Least Squares (SSRLS) algorithm-based adaptive impulsive noise suppressor for OFDM communication systems is proposed, and a comparison with another adaptive algorithm is conducted. The state-space, model-dependent recursive parameters of the proposed scheme enable it to achieve lower steady-state mean squared error (MSE), lower bit error rate (BER), and faster convergence than some existing algorithms.
Keywords: OFDM, impulsive noise, SSRLS, BER
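The SSRLS update equations are not given in the abstract; the sketch below instead uses the textbook exponentially weighted RLS on a synthetic noise-path identification problem to show the kind of adaptive estimation involved (all signals and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# A desired signal observed through an unknown FIR path; RLS estimates the
# path so the interference can be subtracted. This is plain RLS, not the
# state-space variant (SSRLS) used in the paper.
N, order, lam = 2000, 4, 0.99
w_true = np.array([0.8, -0.4, 0.2, 0.1])        # unknown path to identify
x = rng.standard_normal(N)                      # reference input
d = np.convolve(x, w_true)[:N] + 0.01 * rng.standard_normal(N)

w = np.zeros(order)
P = np.eye(order) * 1000.0                      # inverse correlation estimate
errors = []
for n in range(order, N):
    u = x[n - order + 1:n + 1][::-1]            # regressor, most recent first
    k = P @ u / (lam + u @ P @ u)               # RLS gain vector
    e = d[n] - w @ u                            # a priori error
    w = w + k * e
    P = (P - np.outer(k, u @ P)) / lam
    errors.append(e * e)

print(np.mean(errors[:50]), np.mean(errors[-50:]))  # error shrinks as w converges
```

The fast, noise-floor-level convergence seen here is the behavior the paper claims to improve upon further with its state-space model.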
Procedia PDF Downloads 458
38861 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality
Authors: Ouafa Sakka, Henri Barki, Louise Cote
Abstract:
Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data. On the other hand, as equivocality stems from ambiguity and a lack of understanding of the task at hand, addressing it is thought to require rich communication between those involved. Past research has provided only weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team and the richness and frequency of formal communications among team members to discuss the project's follow-up reports. Data on 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality, in which project managers were provided with detailed and updated information on project costs and schedules. In addition, our findings show that information about user needs and technical aspects of the project is less useful for managing projects where uncertainty and equivocality are high. Further, while the strongest positive effect of interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality were high, its weakest effect occurred when both were low.
Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development
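The Fisher test for correlation differences mentioned above has a compact closed form; a minimal sketch, with illustrative correlations and subgroup sizes rather than the paper's estimates:

```python
import math

def fisher_z_diff(r1, n1, r2, n2):
    """Two-sided test for the difference between two independent
    correlation coefficients via Fisher's r-to-z transform."""
    z1 = math.atanh(r1)                          # r-to-z transform
    z2 = math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    # two-sided p-value from the standard normal CDF
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# e.g. information use vs. performance correlation in a high-uncertainty
# subgroup against a low-uncertainty subgroup (numbers are illustrative)
z, p = fisher_z_diff(0.62, 48, 0.21, 45)
print(round(z, 2), round(p, 4))
```

A significant z here would indicate that the strength of the association genuinely differs between the two project conditions, which is exactly the moderation claim the study tests.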
Procedia PDF Downloads 294
38860 Measuring Engagement Equation in Educational Institutes
Authors: Mahfoodh Saleh Al Sabbagh, Venkoba Rao
Abstract:
There is plenty of research, in both academic and consultancy circles, about the importance and benefits of employee engagement and customer engagement and how they give an organization an opportunity to reduce variability and improve performance. Customer engagement is directly related to the engagement level of the organization's employees, so it is important to measure both. This research, drawing from the work on Human Sigma by Fleming and Asplund, attempts to assess the engagement levels of customers and employees - the human systems of a business - in an educational setting. The student is important to an educational institute and is a customer to be served efficiently and effectively. Considering students as customers and faculty as employees serving them, in-depth interviews were conducted to analyze the relationship between faculty and student engagement in two leading colleges in Oman, one from the private sector and one from the public sector. The study relied mainly on secondary data sources to understand the concept of engagement; however, the search of secondary sources was extensive, to compensate for the limited primary data. The results indicate that high faculty engagement is likely to lead to high student engagement. Engaged students were excited about learning, loved the feeling of being cared for as persons by their faculty, and advocated the organization to others. The interaction truly represents an opportunity to build an emotional connection to the organization. This study could be of interest to organizations interested in building and maintaining engagement with employees and customers.
Keywords: customer engagement, consumer psychology, strategy, educational institutes
Procedia PDF Downloads 472
38859 A Network Economic Analysis of Friendship, Cultural Activity, and Homophily
Authors: Siming Xie
Abstract:
In social networks, the term homophily refers to the tendency of agents with similar characteristics to link with one another, and it is robustly observed across many contexts and dimensions. The starting point of my research is the observation that the "type" of an agent is not a single exogenous variable. Agents, despite their differences in race, religion, and other hard-to-alter characteristics, may share interests and engage in activities that cut across those predetermined lines. This research aims to capture the interactions of homophily effects in a model where agents have two-dimensional characteristics (i.e., race and a personal hobby such as basketball, which one either likes or dislikes) and where meeting opportunities are biased in favor of same-type friendships. A novel feature of my model is a matching process with biased meeting probabilities on different dimensions, which helps explain the structuring process in multidimensional networks without missing layer interdependencies. The main contribution of this study is a welfare-based matching process for agents with multidimensional characteristics. In particular, this research shows that biases in meeting opportunities on one dimension lead to the emergence of homophily on the other dimension. The objective is to determine the pattern of homophily in network formation, which will shed light on our understanding of segregation and its remedies. By constructing a two-dimensional matching process, this study describes agents' homophilous behavior in a multidimensional social network and constructs a game in which minorities and majorities play different strategies. It also shows that the optimal strategy is determined by relative group size: society suffers more from social segregation if the two racial groups are of similar size. The research also has policy implications: cultivating shared characteristics among agents helps diminish social segregation, but only if the minority group is small enough. This research includes both theoretical models and empirical analysis. After setting out the friendship formation model, the author first uses MATLAB to perform iterative calculations, then derives the corresponding mathematical proofs of these results, and finally shows that the model is consistent with empirical evidence from high school friendships. The anonymized data come from the National Longitudinal Study of Adolescent Health (Add Health).
Keywords: homophily, multidimension, social networks, friendships
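A quick simulation illustrates the central mechanism claimed above: meeting bias on one dimension (race) produces homophily on the other (hobby) when the two are correlated. The bias and correlation levels below are arbitrary choices for illustration, not the model's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000                     # number of meetings to simulate

def draw_agent():
    race = rng.integers(0, 2)
    # hobby is correlated with race (70% aligned); illustrative only
    hobby = race if rng.random() < 0.7 else 1 - race
    return race, hobby

def same_hobby_rate(meeting_bias):
    """Fraction of meetings between same-hobby agents, when with
    probability `meeting_bias` an agent meets someone of the same race."""
    hits = 0
    for _ in range(n):
        r1, h1 = draw_agent()
        if rng.random() < meeting_bias:
            r2 = r1                               # biased, same-race meeting
        else:
            r2 = rng.integers(0, 2)               # uniform random meeting
        h2 = r2 if rng.random() < 0.7 else 1 - r2
        hits += (h1 == h2)
    return hits / n

baseline = same_hobby_rate(0.0)   # no racial bias in meetings
biased = same_hobby_rate(0.8)     # strong racial bias in meetings
print(baseline, biased)  # hobby homophily emerges from racial meeting bias alone
```

No agent in this toy world prefers same-hobby partners, yet the biased meeting technology alone raises the same-hobby tie rate above its unbiased level.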
Procedia PDF Downloads 170
38858 Depth-Averaged Velocity Distribution in Braided Channel Using Calibrating Coefficients
Authors: Spandan Sahu, Amiya Kumar Pati, Kishanjit Kumar Khatua
Abstract:
Rivers are the backbone of human civilization as well as one of the most important components of nature. In this paper, a method for predicting the lateral depth-averaged velocity distribution in a two-flow braided compound channel is proposed. Experiments were conducted to study the boundary shear stress at the tip of the two flow paths. The cross-section of the channel is divided into several panels to study the flow phenomena in both the main channel and the floodplain. It can be inferred from the study that the flow coefficients are affected by boundary shear stress. In this study, the analytical solution of Shiono and Knight (SKM) for the lateral distributions of depth-averaged velocity and bed shear stress has been adopted. The SKM is based on hydraulic parameters that characterize the bed friction factor (f), lateral eddy viscosity, and depth-averaged flow. When applying the SKM to different panels, the equations are solved subject to continuity conditions at the panel boundaries. The boundary shear stress data obtained from the experiments are compared with the CES software, which is based on a quasi-one-dimensional Reynolds-Averaged Navier-Stokes (RANS) approach.
Keywords: boundary shear stress, lateral depth-averaged velocity, two-flow braided compound channel, velocity distribution
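For a single panel of constant depth, the SKM admits a closed-form lateral velocity profile; the sketch below drops the secondary-flow term and uses illustrative values for the calibrating coefficients f and lambda (the paper's fitted values are not given in the abstract):

```python
import numpy as np

# One-panel analytical SKM profile for a rectangular channel of width B and
# uniform depth H; secondary flows are neglected for brevity.
g, S = 9.81, 1e-3          # gravity (m/s^2), bed slope
H, B = 0.2, 1.0            # depth, width (m)
f, lam = 0.02, 0.07        # friction factor, dimensionless eddy viscosity (illustrative)

k = 8.0 * g * S * H / f                          # particular solution for Ud^2
gamma = np.sqrt(2.0 / lam) * (f / 8.0) ** 0.25 / H

def Ud(y):
    """Depth-averaged velocity at lateral position y (y = 0 at centerline),
    with no-slip at both walls and zero gradient at the centerline."""
    u2 = k * (1.0 - np.cosh(gamma * y) / np.cosh(gamma * B / 2.0))
    return np.sqrt(np.maximum(u2, 0.0))

y = np.linspace(-B / 2, B / 2, 101)
u = Ud(y)
print(u.max())  # peak velocity at the centerline; zero at the walls
```

In a multi-panel application, profiles like this are stitched together by matching Ud and its lateral gradient at each panel boundary, which is where the calibrating coefficients enter.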
Procedia PDF Downloads 129
38857 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach
Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier
Abstract:
The main purpose of time series analysis is to learn about the dynamics behind some time-ordered measurement data. Two approaches are used in the literature to gain a better knowledge of the dynamics contained in observational data sequences. The first concerns time series decomposition, an important analysis step that allows patterns and behaviors to be extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components that are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space, by mapping the time series into a set of lag vectors in Rᵐ using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-banded components. The method takes its origin from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to form a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to observational time series of total-column ozone measurements. The results obtained are compared with those provided by the previously mentioned decomposition methods. We also compare the quality of the reconstructions of the observed dynamics obtained from the SSD and MOD methods.
Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis
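The embedding step that SSD shares with MOD is simple to sketch: map the series into lag vectors, then (as in SSA) inspect the singular spectrum of the resulting trajectory matrix. The toy seasonal series below stands in for the ozone columns; the lag and dimension choices are illustrative:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Method of delays: row i of the trajectory matrix is
    [x(i), x(i+tau), ..., x(i+(m-1)tau)], a point in R^m."""
    n = len(x) - (m - 1) * tau
    return np.stack([x[i:i + n] for i in range(0, m * tau, tau)], axis=1)

# Toy 'ozone-like' series: an annual cycle plus noise (illustrative only)
t = np.arange(1000)
x = np.sin(2 * np.pi * t / 365.0) \
    + 0.05 * np.random.default_rng(3).standard_normal(1000)

X = delay_embed(x, m=3, tau=90)      # trajectory in R^3
print(X.shape)

# First SSA/SSD-style step: the singular spectrum of the trajectory matrix
s = np.linalg.svd(X, compute_uv=False)
print(s)  # the dominant oscillation stands out from the noise floor
```

The two large singular values correspond to the sine/cosine pair of the seasonal oscillation; SSD's narrow-band components are built by grouping and reconstructing from such dominant subspaces.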
Procedia PDF Downloads 104
38856 Big Data Analysis with RHadoop
Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim
Abstract:
It is almost impossible to store or analyze big data, which grows exponentially, with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool, and combined with Hadoop it supports big data analysis based on distributed processing. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of actual data. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with the lm function and the biglm package based on big memory. The results showed that our RHadoop was faster than the other packages, owing to parallel processing that increases the number of map tasks as the data size grows.
Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop
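RHadoop's own API is in R (the rmr2 package); the Python sketch below shows the same map-reduce decomposition of multiple regression, where each data block emits its sufficient statistics X'X and X'y and the reduce step solves the normal equations once. The block layout and data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic regression data, split into blocks as a distributed store would hold it
beta_true = np.array([2.0, -1.0, 0.5])
blocks = []
for _ in range(8):
    X = np.column_stack([np.ones(1000), rng.standard_normal((1000, 2))])
    y = X @ beta_true + 0.1 * rng.standard_normal(1000)
    blocks.append((X, y))

def map_block(block):
    """Map step: each block contributes its sufficient statistics."""
    X, y = block
    return X.T @ X, X.T @ y

def reduce_stats(stats):
    """Reduce step: sum the statistics and solve the normal equations once."""
    XtX = sum(s[0] for s in stats)
    Xty = sum(s[1] for s in stats)
    return np.linalg.solve(XtX, Xty)

beta_hat = reduce_stats([map_block(b) for b in blocks])
print(beta_hat)  # close to beta_true
```

Because only the small X'X and X'y matrices cross the network, this decomposition scales with the number of map tasks rather than the number of rows, which is the effect the speedups above rely on.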
Procedia PDF Downloads 437
38855 Organizational Innovations of the 20th Century as High Tech of the 21st: Evidence from Patent Data
Authors: Valery Yakubovich, Shuping Wu
Abstract:
Organization theorists have long claimed that organizational innovations are nontechnological, in part because they are unpatentable. The claim rests on the assumption that organizational innovations are abstract ideas embodied in persons and contexts rather than in context-free practical tools. However, over the last three decades, organizational knowledge has been increasingly embodied in digital tools which, in principle, can be patented. To provide the first empirical evidence regarding the patentability of organizational innovations, we trained two machine learning algorithms to identify a population of 205,434 patent applications for organizational technologies (OrgTech) and, among them, 141,285 applications that use organizational innovations accumulated over the 20th century. Our event history analysis of the probability of patenting an OrgTech invention shows that ideas from organizational innovations decrease the probability of patent allowance unless they describe a practical tool. We conclude that the present-day digital transformation places organizational innovations in the realm of high tech and turns the debate about organizational technologies into the challenge of designing practical organizational tools that embody big ideas about organizing. We outline an agenda for patent-based research on OrgTech as an emerging phenomenon.
Keywords: organizational innovation, organizational technology, high tech, patents, machine learning
Procedia PDF Downloads 122
38854 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
To address memorization overfitting in the MAML meta-learning algorithm, a method for generating mutually exclusive tasks based on data augmentation is proposed. The method generates a mutually exclusive (mutex) task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth in computation, this paper also proposes a key-data extraction method that extracts only part of the data to generate mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively alleviate memorization overfitting in the MAML algorithm.
Keywords: data augmentation, mutex task generation, meta-learning, text classification
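A minimal sketch in the spirit of the idea above (the paper's exact construction is not given in the abstract): the same features are relabeled with a random label permutation per task, so no fixed feature-to-label mapping can be memorized across tasks:

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy 3-class dataset: features clustered by class
X = np.concatenate([rng.normal(c, 0.1, size=(20, 2)) for c in range(3)])
y = np.repeat(np.arange(3), 20)

def make_mutex_task(X, y, n_classes=3):
    """Relabel the same features with a random label permutation, so that
    the feature-to-label mapping differs between tasks (mutually exclusive)."""
    perm = rng.permutation(n_classes)
    return X, perm[y]

tasks = [make_mutex_task(X, y) for _ in range(20)]
labelings = {tuple(t[1].tolist()) for t in tasks}
print(len(labelings))  # several distinct labelings of identical features
```

A meta-learner fed these tasks must adapt from each task's support set instead of memorizing labels, which is the mechanism against memorization overfitting; the key-data extraction step would apply this relabeling only to a selected subset of the data.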
Procedia PDF Downloads 94
38853 CFD-DEM Modelling of Liquid Fluidizations of Ellipsoidal Particles
Authors: Esmaeil Abbaszadeh Molaei, Zongyan Zhou, Aibing Yu
Abstract:
The applications of liquid fluidization have increased in many industries, such as particle classification, backwashing of granular filters, crystal growth, leaching and washing, and bioreactors, owing to highly efficient liquid-solid contact, favorable mass and heat transfer, high operational flexibility, and reduced back-mixing of phases. In most of these multiphase operations, the particle properties, i.e., size, density, and shape, may change during the process because of attrition, coalescence, or chemical reactions. Previous studies, whether experimental or numerical, have mainly focused on liquid-solid fluidized beds containing spherical particles; the role of particle shape in the hydrodynamics of liquid fluidized beds is still not well known. Here, a three-dimensional Discrete Element Model (DEM) is coupled with Computational Fluid Dynamics (CFD) to study the influence of particle shape on particle and liquid flow patterns in liquid-solid fluidized beds. In the simulations, ellipsoidal particles are used to study the shape factor, since they can represent a wide range of particle shapes, from oblate through spherical to prolate. Different particle shapes, from oblate (disk-shaped) to elongated (rod-shaped), are selected to investigate the effect of aspect ratio on flow characteristics such as the general particle and liquid flow patterns, pressure drop, and particle orientation. First, the model is verified against experimental observations; then further detailed analyses are made. It was found that spherical particles showed a uniform particle distribution in the bed, which resulted in a uniform pressure drop along the bed height. For particles with aspect ratios less than one (disk-shaped), however, some particles were carried into the freeboard region, the interface between the bed and the freeboard was not easy to determine, and a few particles even tended to leave the bed. Prolate particles, on the other hand, behaved differently: they caused an unstable interface, and some flow channeling was observed at low liquid velocities. Because of the non-uniform particle flow patterns for particles with aspect ratios lower (oblate) and higher (prolate) than one, the pressure drop distribution in the bed was not as uniform as that found for spherical particles.
Keywords: CFD, DEM, ellipsoid, fluidization, multiphase flow, non-spherical, simulation
Procedia PDF Downloads 310
38852 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network
Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan
Abstract:
Data aggregation is a helpful technique for reducing the data communication overhead in wireless sensor networks. One of the important tasks in data aggregation is the positioning of the aggregator points. A lot of work has been done on data aggregation, but the efficient positioning of the aggregator points has received little attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network and propose an algorithm to select the aggregator positions for a scenario where aggregator nodes are more powerful than sensor nodes.
Keywords: aggregation point, data communication, data aggregation, wireless sensor network
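The proposed algorithm's details are not given in the abstract; a common baseline for this kind of placement problem is a greedy facility-location heuristic over the candidate (more powerful) nodes, sketched here on synthetic positions:

```python
import numpy as np

rng = np.random.default_rng(2)

# 100 sensor nodes and 8 candidate aggregator positions (the more powerful
# nodes), scattered over a 100 m x 100 m field; all positions are synthetic
sensors = rng.uniform(0, 100, size=(100, 2))
candidates = rng.uniform(0, 100, size=(8, 2))

def select_aggregators(sensors, candidates, k=3):
    """Greedy selection: repeatedly add the candidate that most reduces
    the total sensor-to-nearest-aggregator distance."""
    dist = np.linalg.norm(sensors[:, None, :] - candidates[None, :, :], axis=2)
    chosen = []
    best = np.full(len(sensors), np.inf)   # distance to nearest chosen aggregator
    for _ in range(k):
        gains = [np.inf if j in chosen else np.minimum(best, dist[:, j]).sum()
                 for j in range(len(candidates))]
        j = int(np.argmin(gains))
        chosen.append(j)
        best = np.minimum(best, dist[:, j])
    return chosen, best.sum()

chosen, cost3 = select_aggregators(sensors, candidates, k=3)
_, cost1 = select_aggregators(sensors, candidates, k=1)
print(chosen, cost3 < cost1)  # more aggregators, lower total distance
```

Total sensor-to-aggregator distance is a rough proxy for transmission energy, so each greedy addition trades one extra powerful node for a measurable drop in the communication overhead the abstract targets.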
Procedia PDF Downloads 16038851 Energy Efficiency Factors in Toll Plazas
Authors: S. Balubaid, M. Z. Abd Majid, R. Zakaria
Abstract:
Energy efficiency is one of the most important issues for green buildings and their sustainability, not only because of environmental impacts but also because of significantly high energy costs. The aim of this study is to identify the potential actions required at toll plazas that lead to energy reduction. The data were obtained through a set of questionnaires and interviews with targeted respondents, including employees at toll plazas and the architects and engineers directly involved in the design of highway projects. The data were analyzed using descriptive statistics. The findings of this study are the critical elements that influence energy usage and the factors that lead to energy wastage. Finally, potential actions are recommended to reduce energy consumption in toll plazas. Keywords: energy efficiency, toll plaza, energy consumption
Procedia PDF Downloads 54738850 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available to model data collected with reference to location in space, from the classical spatial econometric approaches to the recent developments in spatial econometrics for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to look for new possible directions in the processing of count data in a spatial hierarchical Bayesian econometric context. Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data
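The spatial lag term central to these models is built from a spatial weight matrix W, usually row-standardized so that Wy gives the weighted average of neighbouring observations. A minimal numpy sketch of that building block (the function names are illustrative, not from the paper):

```python
import numpy as np

def row_standardize(W):
    """Row-standardize a binary contiguity matrix so each row sums to one."""
    W = np.asarray(W, dtype=float)
    row_sums = W.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # isolated regions keep an all-zero row
    return W / row_sums

def spatial_lag(W, y):
    """Spatial lag Wy: for each region, the average count over its neighbours."""
    return row_standardize(W) @ np.asarray(y, dtype=float)
```

In a spatial-lag count model this quantity typically enters the log-mean of a Poisson or negative binomial likelihood, with the hierarchical Bayesian machinery handling the induced simultaneity.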
Procedia PDF Downloads 59438849 A Coupling Study of Public Service Facilities and Land Price Based on Big Data Perspective in Wuxi City
Authors: Sisi Xia, Dezhuan Tao, Junyan Yang, Weiting Xiong
Abstract:
Against the background of Chinese urbanization shifting from incremental development to stock development, the completeness of urban public service facilities is essential to urban spatial quality. As public service facilities form a huge and complicated system, clarifying the various types of internal rules associated with the land market price is key to optimizing the spatial layout. This paper takes Wuxi City as a representative sample location and establishes a digital analysis platform using urban land price data and several high-precision big data acquisition methods. On this basis, it analyzes the coupling relationship between different public service categories and land price, summarizing the coupling patterns between the distribution of urban public facilities and fluctuations in urban land price. Finally, the internal mechanism within each of the two elements is explored, providing a reference for the optimal layout of public service facilities in urban planning. Keywords: public service facilities, land price, urban spatial morphology, big data
Procedia PDF Downloads 21538848 A NoSQL Based Approach for Real-Time Managing of Robotics's Data
Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir
Abstract:
This paper deals with the continual progression of data, for which new data management solutions have emerged: NoSQL databases. They span several areas, such as personalization, profile management, real-time big data, content management, catalogues, customer views, mobile applications, the Internet of Things, digital communication, and fraud detection. Nowadays, these database management systems are on the rise. They store data very well, and with the trend of big data, storage demands new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. The implementation of NoSQL for robotics wrestles all the data robots acquire into a usable form, because with ordinary approaches to robotics data we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated by experimental studies and a running example used as a use case. Keywords: NoSQL databases, database management systems, robotics, big data
Procedia PDF Downloads 35538847 Comparison between Ultra-High-Performance Concrete and Ultra-High-Performance-Glass Concrete
Authors: N. A. Soliman, A. F. Omran, A. Tagnit-Hamou
Abstract:
Finely ground waste glass has been successfully used by the authors to develop and patent an ecological ultra-high-performance concrete (UHPC), named ultra-high-performance glass concrete (UHPGC). Following this successful development in the laboratory, the current research presents a comparison between traditional UHPC and UHPGC produced using a large-scale pilot-plant mixer, in terms of rheological, mechanical, and durability properties. The rheology of the UHPGCs was improved due to the non-absorptive nature of the glass particles. The mechanical performance of UHPGC was comparable and very close to that of traditional UHPC due to the pozzolanic reactivity of the amorphous waste glass. The UHPGC has also shown excellent durability: negligible permeability (chloride-ion penetration of about 20 Coulombs in the RCPT test), high abrasion resistance (volume loss index less than 1.3), and almost no freeze-thaw deterioration even after 1,000 freeze-thaw cycles. The enhancement in the strength and rigidity of the UHPGC mixture can be attributed to the inclusion of glass particles, which have very high strength and elastic modulus. Keywords: ground glass pozzolan, large-scale production, sustainability, ultra-high performance glass concrete
Procedia PDF Downloads 15738846 Micromechanical Modelling of Ductile Damage with a Cohesive-Volumetric Approach
Authors: Noe Brice Nkoumbou Kaptchouang, Pierre-Guy Vincent, Yann Monerie
Abstract:
The present work addresses the modelling and simulation of crack initiation and propagation in ductile materials that fail by void nucleation, growth, and coalescence. One current research framework for crack propagation is the cohesive-volumetric approach, in which crack growth is modelled as the decohesion of two surfaces in a continuum material. In this framework, the material behaviour is characterized by two constitutive relations: a volumetric constitutive law relating stress and strain, and a traction-separation law across a two-dimensional surface embedded in the three-dimensional continuum. Several cohesive models have been proposed for the simulation of crack growth in brittle materials. The application of cohesive models to crack growth in ductile materials, on the other hand, is still a relatively open field. One idea developed in the literature is to identify the traction-separation law for a ductile material from the behaviour of a continuously deforming unit cell failing by void growth and coalescence. Following this method, the present study proposes a semi-analytical cohesive model for ductile materials based on a micromechanical approach. The strain localization band prior to ductile failure is modelled as a cohesive band, and the Gurson-Tvergaard-Needleman (GTN) plasticity model is used to describe the behaviour of the cohesive band and to derive a corresponding traction-separation law. The numerical implementation of the model uses the non-smooth contact dynamics (NSCD) method, where cohesive models are introduced as mixed boundary conditions between volumetric finite elements. The present approach is applied to the simulation of crack growth in a nuclear ferritic steel.
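The abstract invokes the GTN model without stating its yield surface. In its standard form, the GTN yield function reads

```latex
\Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_y}\right)^{2}
     + 2\,q_1 f^{*}\cosh\!\left(\frac{3\,q_2\,\sigma_m}{2\,\sigma_y}\right)
     - 1 - \left(q_1 f^{*}\right)^{2} = 0,
```

where \(\sigma_{\mathrm{eq}}\) is the von Mises equivalent stress, \(\sigma_m\) the mean (hydrostatic) stress, \(\sigma_y\) the matrix flow stress, \(f^{*}\) the effective void volume fraction (accelerated beyond the coalescence threshold), and \(q_1, q_2\) the Tvergaard fitting parameters. The paper's specific calibration for the nuclear ferritic steel is not given in the abstract.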
The model provides an alternative way to simulate crack propagation, combining the numerical efficiency of cohesive models with a traction-separation law derived directly from a porous continuum model. Keywords: ductile failure, cohesive model, GTN model, numerical simulation
Procedia PDF Downloads 14938845 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis
Authors: C. B. Le, V. N. Pham
Abstract:
In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide different information about the data; therefore, linking multiple sources is essential to improving clustering performance. In practice, however, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge of working with it. Ensembling is a versatile machine learning approach in which learning techniques can work in parallel on big data, and clustering ensembles have been shown to outperform any single standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis: the fuzzy optimized multi-objective clustering ensemble (FOMOCE) method. First, a clustering ensemble mathematical model is introduced, based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. Experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods. Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering
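A standard building block for any clustering ensemble, including multi-source ones, is the co-association matrix: the fraction of base clusterings in which two samples land in the same cluster. The sketch below illustrates this generic mechanism only; it is not the FOMOCE algorithm, and the function names are invented for illustration.

```python
import numpy as np

def co_association(labelings):
    """Co-association matrix: entry (i, j) is the fraction of base clusterings
    in which samples i and j share a cluster. Input shape: (n_clusterings, n_samples)."""
    labelings = np.asarray(labelings)
    m, n = labelings.shape
    C = np.zeros((n, n))
    for lab in labelings:
        C += (lab[:, None] == lab[None, :])
    return C / m

def consensus_clusters(C, threshold=0.5):
    """Simple consensus: connected components of the graph whose edges are
    co-association values at or above the threshold."""
    n = C.shape[0]
    seen, groups = set(), []
    for i in range(n):
        if i in seen:
            continue
        stack, comp = [i], []
        while stack:
            j = stack.pop()
            if j in seen:
                continue
            seen.add(j)
            comp.append(j)
            stack.extend(k for k in range(n) if C[j, k] >= threshold and k not in seen)
        groups.append(sorted(comp))
    return groups
```

FOMOCE replaces this hard-threshold consensus with fuzzy multi-objective optimization and "dark knowledge" extraction, per the abstract.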
Procedia PDF Downloads 18938844 Deployed Confidence: The Testing in Production
Authors: Shreya Asthana
Abstract:
Testers only know for certain that a feature they tested on staging works in production after the release has gone live. Sometimes something breaks in production, and testers find out through a bug raised by an end user. Panic mode starts when your staging test results do not reflect current production behaviour, and you begin doubting your testing skills when a user finally reports a bug to you. Testers can deploy their confidence on release day by testing in production. Once you start testing in production, you will see better test-result accuracy, because tests run on real-time data, and execution is a little faster than on staging thanks to the elimination of bad data. Feature flagging, canary releases, and data cleanup can help to achieve this testing technique. This paper makes it easier to understand the steps to achieve production testing before making a feature live, and to modify a company's testing procedure so testers can provide a bug-free experience to end users. This study is beneficial because too many people think that testing should be done on staging but not in production, and it is high time to pull people out of their old testing mindset and into a new testing world. At the end of the day, all that matters is whether the features work in production or not. Keywords: bug free production, new testing mindset, testing strategy, testing approach
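Feature flagging, the first technique the abstract names, boils down to gating a code path so it can be tested in production by a small, stable subset of users before general release. A minimal sketch (class and method names are illustrative, not from any particular flag service):

```python
import hashlib

class FeatureFlags:
    """Minimal illustrative feature-flag store: each flag is rolled out to a
    stable percentage of users (canary-style), so testers can exercise the new
    path in production while everyone else stays on the old one."""

    def __init__(self):
        self._rollout = {}  # flag name -> rollout percentage, 0..100

    def set_rollout(self, flag: str, percent: int) -> None:
        self._rollout[flag] = percent

    def is_enabled(self, flag: str, user_id: str) -> bool:
        percent = self._rollout.get(flag, 0)
        # Hash flag+user so the same user always lands in the same bucket.
        digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
        bucket = int(digest, 16) % 100
        return bucket < percent
```

The deterministic hash is the key design choice: a tester's account can sit permanently inside the rollout bucket, making production testing repeatable, while a bad release is contained to the rollout percentage.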
Procedia PDF Downloads 7738843 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria
Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur
Abstract:
The Cultural Park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetlands that must be preserved and managed in both time and space. Managing such a large area is complex and requires large amounts of data, most of which are spatially localized (DEMs, satellite images, socio-economic information...), so the use of conventional and traditional methods is quite difficult. Remote sensing, owing to its efficiency in environmental applications, has become an indispensable solution for this kind of study. Remotely sensed imaging data have proved very useful over the last decade and can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we attempt to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites, both high-resolution multispectral imagers with 30 m resolution. We used images acquired over several areas of interest in the Cultural Park of the Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The results obtained show good accuracy in distinguishing wetland areas from the other land use themes through careful exploitation of the spectral indices. Keywords: multispectral data, EO1, Landsat, wetlands, Ahaggar, Algeria
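The abstract does not list which spectral indices were used. A common water/wetness index built from exactly this kind of band combination is McFeeters' NDWI, computed from the green and near-infrared bands; the sketch below is a generic illustration, not the paper's specific extraction algorithm, and the threshold is an assumption.

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index: (G - NIR) / (G + NIR).
    Open water and wet surfaces tend toward positive values, since water
    reflects in the green band but absorbs strongly in the near-infrared."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + 1e-12)  # epsilon avoids division by zero

def water_mask(green, nir, threshold=0.0):
    """Binary wetland/water mask from an assumed NDWI threshold."""
    return ndwi(green, nir) > threshold
```

Applied per pixel to EO-1 or Landsat reflectance bands, the mask's true-pixel fraction gives the wetland fraction of land use mentioned in the abstract.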
Procedia PDF Downloads 37738842 Natural Frequency Analysis of Spinning Functionally Graded Cylindrical Shells Subjected to Thermal Loads
Authors: Esmaeil Bahmyari
Abstract:
The natural frequency analysis of functionally graded (FG) rotating cylindrical shells subjected to thermal loads is studied based on three-dimensional elasticity theory. The material properties are assumed to be temperature dependent and are graded in the thickness direction according to a simple power-law distribution. The governing equations and the appropriate boundary conditions, which include the effects of initial thermal stresses, are derived employing Hamilton's principle. The initial thermo-mechanical stresses are obtained by solving the thermo-elastic equilibrium equations. As an efficient and accurate numerical tool, the differential quadrature method (DQM) is adopted to solve the thermo-elastic equilibrium equations and the free vibration equations, and the natural frequencies are obtained. The high accuracy of the method is demonstrated by comparison with existing solutions in the literature. Finally, parametric studies are performed to demonstrate the effects of boundary conditions, temperature rise, material grading index, and the thickness-to-length and aspect ratios on the natural frequencies of the rotating cylindrical shells. Keywords: free vibration, DQM, elasticity theory, FG shell, rotating cylindrical shell
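The core of the DQM is replacing derivatives with weighted sums of the function values at grid points, f'(x_i) ≈ Σ_j D_ij f(x_j). A minimal sketch of Shu's explicit weighting coefficients on the Chebyshev-Gauss-Lobatto grid commonly used in DQM (the grid choice and function names here are illustrative, not taken from the paper):

```python
import numpy as np

def cgl_points(n):
    """Chebyshev-Gauss-Lobatto grid on [0, 1], a common DQM sampling."""
    return 0.5 * (1.0 - np.cos(np.pi * np.arange(n) / (n - 1)))

def dqm_weights(x):
    """First-order DQM weighting matrix D (Shu's formulation) on grid x, so
    that f'(x_i) is approximated by sum_j D[i, j] * f(x_j). Exact for
    polynomials up to degree n-1."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # c[i] = product over k != i of (x_i - x_k)
    c = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = c[i] / (c[j] * (x[i] - x[j]))
        D[i, i] = -D[i].sum()  # rows sum to zero: derivative of a constant is 0
    return D
```

Applying the same construction along the thickness direction turns the shell's equilibrium and free vibration equations into an algebraic eigenvalue problem for the natural frequencies.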
Procedia PDF Downloads 8438841 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems
Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong
Abstract:
For design optimization with high-dimensional, expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the Differential Evolution (DE) algorithm for solving computationally intensive black-box problems. The proposed methodology, called Radial Basis Meta-Model Assisted Differential Evolution (RBF-DE), is a global optimization algorithm based on meta-modelling techniques. A meta-model-assisted DE is proposed to solve computationally expensive optimization problems. The Radial Basis Function (RBF) model is used as a surrogate to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters, such as the differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and real-life practical applications and problems. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms. It is capable of converging to acceptable and good solutions in terms of accuracy, number of evaluations, and time needed to converge. Keywords: differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization
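The surrogate at the heart of RBF-DE is an RBF interpolant fitted through the already-evaluated design points, so that most candidate solutions can be screened cheaply without calling the expensive objective. A minimal Gaussian-kernel sketch (the kernel choice, epsilon value, and function names are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant through sampled points (X, y):
    solve Phi w = y, where Phi[i, j] = exp(-(eps * ||x_i - x_j||)^2)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    weights = np.linalg.solve(np.exp(-(eps * r) ** 2), y)
    return X, weights, eps

def rbf_predict(model, Xq):
    """Cheap surrogate prediction at query points Xq."""
    X, weights, eps = model
    Xq = np.asarray(Xq, dtype=float)
    r = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=-1)
    return np.exp(-(eps * r) ** 2) @ weights
```

Inside a surrogate-assisted DE loop, trial vectors are ranked by `rbf_predict`, and only the most promising ones are passed to the true, expensive objective; those new evaluations are then folded back into the fit.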
Procedia PDF Downloads 39738840 A CFD Analysis of Flow through a High-Pressure Natural Gas Pipeline with an Undeformed and Deformed Orifice Plate
Authors: R. Kiš, M. Malcho, M. Janovcová
Abstract:
This work presents a numerical analysis, using CFD methods, of natural gas flowing through a high-pressure pipeline and an orifice plate. The paper contains CFD calculations for the flow of natural gas in a pipe with two different orifice-plate geometries: one with the standard, undeformed shape, and one deformed by the action of the pressure differential. It shows the behaviour of the natural gas in the pipeline through the velocity profiles and pressure fields of the gas in both models, along with their differences. The entire research is motivated by eliminating inaccuracies that may appear in the measured flow of natural gas in the high-pressure pipelines of the gas industry and that are currently not addressed in the relevant standard. Keywords: orifice plate, high-pressure pipeline, natural gas, CFD analysis
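The measurement at stake is the standard orifice-plate mass-flow equation (the ISO 5167 form), which assumes the undeformed geometry; plate deformation changes the effective diameter ratio and discharge coefficient, which is exactly the inaccuracy the CFD study targets. A sketch with the discharge coefficient C and expansibility factor eps taken as illustrative constants (in the standard, C comes from the Reader-Harris/Gallagher correlation):

```python
import math

def orifice_mass_flow(d, D, dp, rho, C=0.6, eps=1.0):
    """Mass flow through an orifice plate, ISO 5167 form:
    q_m = C / sqrt(1 - beta^4) * eps * (pi/4) * d^2 * sqrt(2 * dp * rho),
    with d the orifice bore, D the pipe diameter, dp the differential
    pressure, and rho the upstream gas density. C and eps are assumed
    constant here for illustration."""
    beta = d / D  # diameter ratio
    return (C / math.sqrt(1.0 - beta ** 4) * eps
            * math.pi / 4.0 * d ** 2 * math.sqrt(2.0 * dp * rho))
```

Because q_m scales with d² and with the velocity-of-approach factor 1/sqrt(1 - beta⁴), even a small pressure-induced bulge of the plate shifts the inferred flow rate, which is why the deformed geometry merits a dedicated CFD model.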
Procedia PDF Downloads 383