Search results for: imperfect channel state information
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17758


8548 Nickel Electroplating in Post Supercritical CO2 Mixed Watts Bath under Different Agitations

Authors: Chun-Ying Lee, Kun-Hsien Lee, Bor-Wei Wang

Abstract:

The process of post-supercritical CO2 electroplating uses the electrolyte solution after it has been mixed with supercritical CO2 and released to atmospheric pressure. It utilizes the microbubbles that form when oversaturated CO2 in the electrolyte returns to the gaseous state, which gives an effect similar to pulsed electroplating. Under atmospheric pressure, the CO2 bubbles gradually diffuse. Therefore, the introduction of ultrasound and/or other agitation can potentially excite the CO2 microbubbles to achieve an electroplated surface of even higher quality. In this study, three different modes of agitation were applied during the electroplating process: magnetic stirrer agitation, ultrasonic agitation, and a combined mode (magnetic + ultrasonic), in order to obtain an optimal surface morphology and mechanical properties for the electroplated Ni coating. It was found that the combined agitation mode at a current density of 40 A/dm2 achieved the smallest grain size and lower surface roughness, and produced an electroplated Ni layer with a hardness of 320 HV, much higher than that of conventional methods, which usually falls in the range of 160 to 300 HV. However, the electroplating with combined agitation also developed a higher internal stress of 320 MPa due to the lower current efficiency of the process and the finer grain in the coating. Moreover, a new control methodology for tailoring the coating’s mechanical properties through its thickness was demonstrated by the timely introduction of ultrasonic agitation during electroplating with the post-supercritical CO2 mixed electrolyte.

Keywords: nickel electroplating, micro-bubbles, supercritical carbon dioxide, ultrasonic agitation

Procedia PDF Downloads 264
8547 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data; it works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works only with discrete data, such as soil classification or bedrock type, continuous data, such as porosity, must be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of states can be examined.
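As a hedged illustration of the binning step described above, a continuous layer such as porosity can be discretized before entering the RA model; the bin edges and values here are invented for illustration, not taken from the study.

```python
def bin_continuous(values, edges):
    """Assign each continuous value to a discrete bin index.

    Reconstructability analysis works only with discrete states, so a
    continuous layer must be discretized first. `edges` are hypothetical
    ascending bin boundaries; a value falls in bin i when it is below
    edges[i] (and in the last bin when it exceeds all edges).
    """
    bins = []
    for v in values:
        idx = 0
        while idx < len(edges) and v >= edges[idx]:
            idx += 1
        bins.append(idx)
    return bins

# Example: porosity fractions binned into low / medium / high classes.
porosity = [0.05, 0.12, 0.31, 0.48]
print(bin_continuous(porosity, edges=[0.10, 0.30]))  # [0, 1, 2, 2]
```

The discrete bin indices can then sit alongside inherently categorical layers (soil class, bedrock type) in the RA model.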

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 48
8546 Analysis of Complex Business Negotiations: Contributions from Agency-Theory

Authors: Jan Van Uden

Abstract:

The paper reviews classical agency-theory and its contributions to the analysis of complex business negotiations and proposes an approach for modifying the basic agency-model in order to examine the negotiation-specific dimensions of agency-problems. By illustrating fundamental potentials for the modification of agency-theory in the context of business negotiations, the paper highlights recent empirical research that investigates agent-based negotiations and inter-team constellations. A general theoretical analysis of complex negotiations would be based on a two-level approach. First, the modification of the basic agency-model in order to illustrate the organizational context of business negotiations (i.e., multi-agent issues, common agencies, multi-period models and the concept of bounded rationality). Second, the application of the modified agency-model to complex business negotiations to identify agency-problems and related areas of risk in the negotiation process. The paper is placed on the first level of analysis, the modification. The method builds, on the one hand, on insights from behavioral decision research (BDR) and, on the other hand, on findings from agency-theory as normative directives for the modification of the basic model. Through neoclassical assumptions concerning the fundamental aspects of agency-relationships in business negotiations (i.e., asymmetric information, self-interest, risk preferences and conflicts of interest), agency-theory helps to draw solutions for stated worst-case scenarios taken from the daily negotiation routine. As agency-theory is the only universal approach able to identify trade-offs between certain aspects of economic cooperation, the insights obtained provide a deeper understanding of the forces that shape business negotiation complexity.
The need for a modification of the basic model is illustrated by highlighting selected issues of business negotiations from an agency-theory perspective: negotiation teams require a multi-agent approach, given that decision-makers, as superior agents, are often part of the team. The diversity of competences and decision-making authority is a phenomenon that overrides the assumptions of classical agency-theory and varies greatly across certain forms of business negotiations. Further, the basic model is bound to dyadic relationships preceded by the delegation of decision-making authority and builds on a contractually created (vertical) hierarchy. As a result, horizontal dynamics within the negotiation team, which play an important role in negotiation success, are not considered in the investigation of agency-problems. Also, the trade-off between short-term relationships within the negotiation sphere and the long-term relationships of the corporate sphere calls for a multi-period perspective that takes into account the sphere-specific governance mechanisms already established (i.e., reward and monitoring systems). Within the analysis, the implementation of bounded rationality is closely related to findings from BDR to assess the impact of negotiation behavior on underlying principal-agent relationships. As empirical findings show, the disclosure and reservation of information to the agent affect his negotiation behavior as well as final negotiation outcomes. Last, in the context of business negotiations, asymmetric information is often intended by decision-makers acting as superior agents or principals, which calls for a bilateral risk approach to agency-relations.

Keywords: business negotiations, agency-theory, negotiation analysis, interteam negotiations

Procedia PDF Downloads 125
8545 Analysis of Ionosphere Anomaly Before Great Earthquake in Java in 2009 Using GPS TEC Data

Authors: Aldilla Damayanti Purnama Ratri, Hendri Subakti, Buldan Muslim

Abstract:

Ionospheric anomalies as an effect of earthquake activity are a phenomenon that is now being studied in seismo-ionospheric coupling. Generally, variation in the ionosphere caused by earthquake activity is weaker than the interference generated by other sources, such as geomagnetic storms. However, disturbances from geomagnetic storms show a more global behavior, while seismo-ionospheric anomalies occur only locally, in an area largely determined by the magnitude of the earthquake. This shows that the earthquake signature is unique, and because of this uniqueness much research has been done on it, which is expected to give clues for early warning before an earthquake. One line of research being developed at this time is the seismo-ionospheric coupling approach. This study relates the states of the lithosphere, atmosphere and ionosphere before and when an earthquake occurs. This paper chooses the vertical total electron content (VTEC) of the ionosphere as a parameter. Total Electron Content (TEC) is defined as the number of electrons in a vertical column (cylinder) with a cross-section of 1 m2 along the GPS signal trajectory through the ionosphere, at around 350 km of height. Based on the analysis of data obtained from the LAPAN agency to identify abnormal signals by statistical methods, it was found that there is an anomaly in the ionosphere, characterized by a decrease of 1 TECU in the electron content before the earthquake occurred. This decrease in VTEC is not associated with a magnetic storm and is therefore indicated as an earthquake precursor; this is supported by the Dst index, which showed no magnetic disturbance.
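The statistical screening of a VTEC series can be sketched as a running-bounds check against recent observations; the window size, threshold factor, and series below are illustrative assumptions, not the study's actual method or data.

```python
import statistics

def flag_vtec_anomalies(vtec, window=5, k=1.5):
    """Flag indices whose VTEC drops below a running lower bound.

    Each value is compared against mean - k * stdev of the preceding
    `window` observations. Both `window` and `k` are hypothetical
    parameters chosen for this sketch.
    """
    anomalies = []
    for i in range(window, len(vtec)):
        prior = vtec[i - window:i]
        lower = statistics.mean(prior) - k * statistics.pstdev(prior)
        if vtec[i] < lower:
            anomalies.append(i)
    return anomalies

# Quiet background around 20 TECU with a sharp dip at index 6.
series = [20.1, 19.9, 20.0, 20.2, 19.8, 20.0, 18.4, 20.1]
print(flag_vtec_anomalies(series))  # [6]
```

A real precursor analysis would additionally cross-check each flagged day against the Dst index, as the abstract describes, to exclude geomagnetic-storm days.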

Keywords: earthquake, DST Index, ionosphere, seismoionospheric coupling, VTEC

Procedia PDF Downloads 575
8544 Statistical Time-Series and Neural Architecture of Malaria Patients Records in Lagos, Nigeria

Authors: Akinbo Razak Yinka, Adesanya Kehinde Kazeem, Oladokun Oluwagbenga Peter

Abstract:

Time series data are sequences of observations collected over a period of time. Such data can be used to predict health outcomes, such as disease progression, mortality, hospitalization, etc. The statistical approach is based on mathematical models that capture the patterns and trends of the data, such as autocorrelation, seasonality, and noise, while neural methods are based on artificial neural networks, which are computational models that mimic the structure and function of biological neurons. This paper compared both parametric and non-parametric time series models of patients treated for malaria in Maternal and Child Health Centres in Lagos State, Nigeria. The forecast methods considered were linear regression, Integrated Moving Average, ARIMA and SARIMA modeling for the parametric approach, while a Multilayer Perceptron (MLP) and a Long Short-Term Memory (LSTM) network were used for the non-parametric models. The performance of each method was evaluated using the Mean Absolute Error (MAE), R-squared (R2) and Root Mean Square Error (RMSE) as criteria to determine the accuracy of each model. The study revealed that the best performance in terms of error was found in the MLP, followed by the LSTM and ARIMA models. In addition, the Bootstrap Aggregating technique was used to make robust forecasts when there are uncertainties in the data.
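The three evaluation criteria named above have standard definitions, which can be sketched as follows; the sample series is invented for illustration and is not the malaria data.

```python
import math

def mae(actual, pred):
    """Mean Absolute Error: average magnitude of forecast errors."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root Mean Square Error: penalizes large errors more heavily."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def r_squared(actual, pred):
    """R2: fraction of variance in the actuals explained by the forecast."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, pred))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

actual = [10.0, 12.0, 14.0, 16.0]
pred = [11.0, 12.0, 13.0, 17.0]
print(round(mae(actual, pred), 3))        # 0.75
print(round(rmse(actual, pred), 3))       # 0.866
print(round(r_squared(actual, pred), 3))  # 0.85
```

Lower MAE/RMSE and higher R2 indicate the better model, which is how the MLP, LSTM, and ARIMA results above would be ranked.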

Keywords: ARIMA, bootstrap aggregation, MLP, LSTM, SARIMA, time-series analysis

Procedia PDF Downloads 63
8543 A Real-Time Bayesian Decision-Support System for Predicting Suspect Vehicle’s Intended Target Using a Sparse Camera Network

Authors: Payam Mousavi, Andrew L. Stewart, Huiwen You, Aryeh F. G. Fayerman

Abstract:

We present a decision-support tool to assist an operator in the detection and tracking of a suspect vehicle traveling to an unknown target destination. Multiple data sources, such as traffic cameras, traffic information, weather, etc., are integrated and processed in real-time to infer a suspect’s intended destination chosen from a list of pre-determined high-value targets. Previously, we presented our work in the detection and tracking of vehicles using traffic and airborne cameras. Here, we focus on the fusion and processing of that information to predict a suspect’s behavior. The network of cameras is represented by a directional graph, where the edges correspond to direct road connections between the nodes and the edge weights are proportional to the average time it takes to travel from one node to another. For our experiments, we construct our graph based on the greater Los Angeles subset of the Caltrans’s “Performance Measurement System” (PeMS) dataset. We propose a Bayesian approach where a posterior probability for each target is continuously updated based on detections of the suspect in the live video feeds. Additionally, we introduce the concept of ‘soft interventions’, inspired by the field of Causal Inference. Soft interventions are herein defined as interventions that do not immediately interfere with the suspect’s movements; rather, a soft intervention may induce the suspect into making a new decision, ultimately making their intent more transparent. For example, a soft intervention could be temporarily closing a road a few blocks from the suspect’s current location, which may require the suspect to change their current course. The objective of these interventions is to gain the maximum amount of information about the suspect’s intent in the shortest possible time. Our system currently operates in a human-on-the-loop mode where at each step, a set of recommendations are presented to the operator to aid in decision-making. 
In principle, the system could operate autonomously, only prompting the operator for critical decisions, allowing the system to significantly scale up to larger areas and multiple suspects. Once the intended target is identified with sufficient confidence, the vehicle is reported to the authorities to take further action. Other recommendations include a selection of road closures, i.e., soft interventions, or to continue monitoring. We evaluate the performance of the proposed system using simulated scenarios where the suspect, starting at random locations, takes a noisy shortest path to their intended target. In all scenarios, the suspect’s intended target is unknown to our system. The decision thresholds are selected to maximize the chances of determining the suspect’s intended target in the minimum amount of time and with the smallest number of interventions. We conclude by discussing the limitations of our current approach to motivate a machine learning approach, based on reinforcement learning in order to relax some of the current limiting assumptions.
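The recursive posterior update over candidate targets can be sketched as a single Bayes step; the target names and likelihood values below are invented for illustration and are not derived from the PeMS graph.

```python
def update_posterior(prior, likelihoods):
    """One Bayesian update of target probabilities.

    `prior` maps each candidate target to its current probability;
    `likelihoods` maps each target to P(detection | target). In the
    full system these likelihoods would come from travel times on the
    camera graph; here they are made-up numbers.
    """
    unnorm = {t: prior[t] * likelihoods[t] for t in prior}
    z = sum(unnorm.values())
    return {t: v / z for t, v in unnorm.items()}

prior = {"A": 1 / 3, "B": 1 / 3, "C": 1 / 3}
# A detection on a road that mostly leads toward target A.
likelihoods = {"A": 0.7, "B": 0.2, "C": 0.1}
posterior = update_posterior(prior, likelihoods)
print(round(posterior["A"], 2))  # 0.7
```

Repeating this update on each new detection (or after a soft intervention changes the likelihoods) concentrates probability mass on the intended target until a decision threshold is crossed.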

Keywords: autonomous surveillance, Bayesian reasoning, decision support, interventions, patterns of life, predictive analytics, predictive insights

Procedia PDF Downloads 107
8542 Factors Influencing the Usage of ERP in Enterprise Systems

Authors: Mohammad Reza Babaei, Sanaz Kamrani

Abstract:

The main problems that arise in adopting most enterprise resource planning (ERP) strategies are organizational; complex information systems like ERP integrate the data of all business areas within the organization. The implementation of ERP is a difficult process, as it involves different types of end users. Based on the literature, we proposed a conceptual framework and examined it to find the effect of some individual, organizational, and technological factors on the usage of ERP and its impact on the end user. The results of the analysis suggest that computer self-efficacy, organizational support, training, and compatibility have a positive influence on ERP usage, which in turn has a significant influence on panoptic empowerment and individual performance.

Keywords: factor, influencing, enterprise, system

Procedia PDF Downloads 354
8541 Access to Natural Resources in the Cameroonian Part of the Logone Basin: A Driver and Mitigation Tool to Ethnical Conflicts

Authors: Bonguen Onouck Rolande Carole, Ndongo Barthelemy

Abstract:

The climate change effects on Lake Chad, coupled with population growth, have pushed large masses of people of various origins towards the lower Logone watershed in search of the benefits of environmental services, causing pressure on the environment and its resources. Economic services are therefore threatened, and the decrease in resources contributes to the deterioration of social wellbeing, resulting in conflicts among and between local communities, immigrants, displaced people, and foreigners. This paper contributes information on the drivers of ethnic conflicts in the area and on the local management mechanisms provided, which can help mitigate present or future conflicts in similar areas. It also points out the necessity of alleviating the water access deficit and encouraging good practices for the population's wellbeing. To meet this objective, in 2018, through the interface of the World Bank-Cameroon project PULCI, data were collected in the field, directly by discussing with the population and visiting infrastructure, and indirectly by a questionnaire survey. Two administrative divisions were chosen (Logone-et-Chari, Mayo-Danay), in which the targeted localities were Zina, Mazera, Lahai and Andirni near Waza Park, and Yagoua, Tekele and Pouss, respectively. Due to some sociocultural and religious reasons, some information was acquired through the traditional chiefs. A desk-study analysis based on resource access and availability, conflict history, and management mechanisms was done. As a result, the root drivers of ethnic conflicts are struggles over access to natural resources, and the possibility of conflicts increases as scarcity and vulnerabilities persist, creating more sociocultural gaps and tensions. The mitigation mechanisms, though fruitful, are limited. There is poor documentation on the topic, and the resource management policies of this basin are unsuitable and ineffective for some.
Therefore, the restoration of the environment and ecosystems, the mitigation of climate change effects, and the reduction of food insecurity are the challenges that must be met to alleviate conflicts in these localities.

Keywords: ethnic, communities, conflicts, mitigation mechanisms, natural resources, logone basin

Procedia PDF Downloads 90
8540 Re-Inhabiting the Roof: Han Slawick Covered Roof Terrace, Amsterdam

Authors: Simone Medio

Abstract:

If we observe many modern cities from above, we are typically confronted with a sea of asphalt-clad flat rooftops. In contrast to the modernist expectation of a populated flat roof, flat rooftops in modern multi-storey buildings are rarely used. On the contrary, they typify a desolate and abandoned landscape encouraging mechanical system allocation. Flat roof technology continues to be seen as a matter of fact in most multi-storey building designs, with greening as its prevalent environmental justification. This paper aims to seek a change in the approach to flat roofing. It makes a case for the opportunity at hand for architectonically resolute, sheltered, livable spaces that make better use of the environment at rooftop level. The researcher is looking for the triggers that allow that change to happen in the design process of case study buildings. The paper begins by exploring Han Slawick's covered roof terrace in Amsterdam as a simple and essential example of transforming the flat roof into a usable, inhabitable space. It investigates the design challenges and the logistic, financial and legislative hurdles faced by the architect, and the outcomes in terms of building performance and occupant use and satisfaction. The researcher uses a grounded research methodology with a direct interview process with the architect in charge of the building and the building user. Energy simulation tools and calculation of running costs are also used as further means of validating change.

Keywords: environmental design, flat rooftop persistence, roof re-habitation, tectonics

Procedia PDF Downloads 262
8539 Theoretical Study of Electronic Structure of Erbium (Er), Fermium (Fm), and Nobelium (No)

Authors: Saleh O. Allehabi, V. A. Dzubaa, V. V. Flambaum, Jiguang Li, A. V. Afanasjev, S. E. Agbemava

Abstract:

Recently developed versions of the configuration interaction method for open shells, configuration interaction with perturbation theory (CIPT) and configuration interaction with many-body perturbation theory (CI+MBPT), are used to study the electronic structure of the Er, Fm, and No atoms. Excitation energies of odd states connected to the even ground state by electric dipole transitions, the corresponding transition rates, isotope shifts, hyperfine structure, ionization potentials, and static scalar polarizabilities are calculated. A way of extracting parameters of the nuclear charge distribution beyond the nuclear root mean square (RMS) radius, e.g., the parameter of quadrupole deformation β, is demonstrated. In nuclei with spin > 1/2, the parameter β is extracted from the quadrupole hyperfine structure. With zero nuclear spin or spin 1/2, this is impossible since the quadrupole moment is zero, so a different method was developed. Measurements of at least two atomic transitions are needed to disentangle the contributions of the changes in deformation and in nuclear RMS radius to the field isotope shift. This is important for testing nuclear theory and for searching for the hypothetical island of stability. Fm and No are heavy elements approaching the superheavy region, for which the experimental data are very poor: only seven lines are known for Fm and one line for No. Since Er and Fm have similar electronic structures, calculations for Er serve as a guide to the accuracy of the calculations. Twenty-eight new levels of the Fm atom are reported.
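In a linearized sketch, disentangling the two nuclear contributions from two measured transitions amounts to solving a 2x2 linear system; the model form, electronic factors, and shifts below are invented placeholders, not computed atomic values from the study.

```python
def disentangle_shifts(F, G, delta_nu):
    """Solve two field-shift equations for two nuclear parameters.

    Assumes the linearized model delta_nu[i] = F[i] * dr2 + G[i] * db2
    for transitions i = 0, 1, where dr2 is the change in mean-square
    charge radius and db2 the change in the squared deformation
    parameter. Solved by Cramer's rule.
    """
    det = F[0] * G[1] - F[1] * G[0]
    dr2 = (delta_nu[0] * G[1] - delta_nu[1] * G[0]) / det
    db2 = (F[0] * delta_nu[1] - F[1] * delta_nu[0]) / det
    return dr2, db2

# Two transitions with different sensitivities to radius vs deformation:
dr2, db2 = disentangle_shifts(F=[2.0, 1.0], G=[0.5, 1.5], delta_nu=[1.3, 1.1])
print(round(dr2, 3), round(db2, 3))  # 0.56 0.36
```

The separation only works when the two transitions have sufficiently different F/G ratios, i.e., when the determinant is far from zero; this is why at least two (well-chosen) lines are required.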

Keywords: atomic spectra, electronic transitions, isotope effect, electron correlation calculations for atoms

Procedia PDF Downloads 143
8538 Fracture Control of the Soda-Lime Glass in Laser Thermal Cleavage

Authors: Jehnming Lin

Abstract:

The effects of a contact ball-lens on soda-lime glass in laser thermal cleavage with a CW Nd:YAG laser were investigated in this study. A contact ball-lens was adopted to generate a bending force for the crack formation in the soda-lime glass during the laser cutting process. The Nd:YAG laser beam (wavelength of 1064 nm) was focused through the ball-lens and transmitted to the soda-lime glass, which was coated with a carbon film on the surface; the bending force from the ball-lens generates a tensile stress state for cracking at the surface. The fracture was controlled by the contact ball-lens, and straight cutting was tested to demonstrate the feasibility. Experimental observations of the crack propagation at the leading edge, main section and trailing edge of the glass sheet were compared under various mechanical and thermal loadings. Further analyses of the stress under various laser powers and contact ball loadings were made to characterize the innovative technology. The results show that the distributions of the side cracks at the leading and trailing edges are mainly dependent on the boundary conditions, contact force, cutting speed and laser power. With the increase of the mechanical and thermal loadings, the region of the side cracks can be dramatically reduced with proper selection of the geometrical constraints. Therefore, the application of the contact ball-lens is a possible way to control the fracture in laser cleavage with improved cutting quality.

Keywords: laser cleavage, stress analysis, crack visualization, laser

Procedia PDF Downloads 426
8537 The Investigation of Women Civil Engineers’ Identity Development through the Lens of Recognition Theory

Authors: Hasan Sungur, Evrim Baran, Benjamin Ahn, Aliye Karabulut Ilgu, Chris Rehmann, Cassandra Rutherford

Abstract:

Engineering identity contributes to the professional and educational persistence of women engineers. A crucial factor contributing to the development of engineering identity is recognition. Those without adequate recognition often do not succeed in positively building their identities. This research draws on Honneth’s recognition theory to identify factors impacting women civil engineers’ feelings of recognition as civil engineers. A survey was composed and distributed to 330 female alumni who graduated from the Department of Civil, Construction, and Environmental Engineering at Iowa State University in the last ten years. The survey items include demographics, perceptions of the identity of civil engineering, and factors that influence the recognition of civil engineering identities, such as the views of society and family. Descriptive analysis of the survey responses revealed that perceptions of civil engineering varied widely. Participants’ definitions of civil engineering included the terms construction, design, and infrastructure. Almost half of the participants reported that the major reason to study civil engineering was their interest in the subject matter, and most reported that they were proud to be civil engineers. Many study participants reported that their parents see them as civil engineers. Treatment by institutions and in the workplace was also considered to have a significant impact on the recognition of women civil engineers. Almost half of the participants reported that they felt isolated or ignored at work because of their gender. This research emphasizes the importance of recognition for the development of the civil engineering identity of women.

Keywords: civil engineering, gender, identity, recognition

Procedia PDF Downloads 234
8536 Assessment of Seeding and Weeding Field Robot Performance

Authors: Victor Bloch, Eerikki Kaila, Reetta Palva

Abstract:

Field robots are an important tool for enhancing efficiency and decreasing the climatic impact of food production. A number of commercial field robots exist; however, since this technology is still new, the robots' advantages and limitations, as well as methods for their optimal use, are still unclear. In this study, the performance of a commercial field robot for seeding and weeding was assessed. A 2-ha research sugar beet field with 0.5 m row width was used for testing, which included robotic sowing of sugar beet and weeding five times during the first two months of growth. About three and five percent of the field were used as untreated and chemically weeded control areas, respectively. The plant detection was based on the exact plant location without image processing. The robot was equipped with six seeding and weeding tools, including passive between-row harrow hoes and active hoes cutting inside the rows between the plants, and it moved with a maximal speed of 0.9 km/h. The robot's performance was assessed by image processing. The field images were collected by an action camera with a resolution of 27 Mpixels installed on the robot at a height of 2 m and by a drone with a 16 Mpixel camera flying at 4 m height. To detect plants and weeds, a YOLO model was trained with transfer learning from two available datasets. A preliminary analysis of the entire field showed that in the areas treated by the robot, the average weed density varied across the field from 6.8 to 9.1 weeds/m² (compared with 0.8 in the chemically treated area and 24.3 in the untreated area), the average weed density inside the rows was 2.0-2.9 weeds/m (compared with 0 in the chemically treated area), and the emergence rate was 90-95%. This information about the robot's performance is highly important for the application of robotics to field tasks.
With the help of the developed method, the performance can be assessed several times during growth, according to the robotic weeding frequency. Farmers using it can know the field condition and the efficiency of the robotic treatment over the whole field. Farmers and researchers could develop optimal strategies for using the robot, such as seeding and weeding timing, robot settings, and plant and field parameters and geometry. The robot producers can obtain quantitative information from an actual working environment and improve the robots accordingly.
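As a hedged sketch of how the weed density per unit area could be derived from detector output, the labels, counts, and image ground footprint below are invented, not the study's data.

```python
def weed_density(detections, image_area_m2):
    """Weed density (weeds per square metre) from detector output.

    `detections` is a list of class labels as a YOLO-style model might
    return after image processing; `image_area_m2` is the ground
    footprint covered by the image.
    """
    weeds = sum(1 for label in detections if label == "weed")
    return weeds / image_area_m2

# Three weeds detected in an image covering half a square metre:
labels = ["plant", "weed", "weed", "plant", "weed"]
print(weed_density(labels, image_area_m2=0.5))  # 6.0
```

Averaging such per-image densities over the treated, chemically weeded, and untreated zones yields the comparative figures reported above.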

Keywords: agricultural robot, field robot, plant detection, robot performance

Procedia PDF Downloads 60
8535 Preparation and Characterization of Water-in-Oil Nanoemulsion of 5-Fluorouracil to Enhance Skin Permeation for Treatment of Skin Diseases

Authors: P. S. Rajinikanth, Shobana Mariappan, Jestin Chellian

Abstract:

The objective of the study was to prepare and characterize a water-in-oil nanoemulsion of 5-fluorouracil (5FU) to enhance skin penetration. The present study describes a nanoemulsion of 5FU using Capyrol PGMC, Transcutol HP and PEG 400 as oil, surfactant and co-surfactant, respectively. The optimized formulations were further evaluated by heating-cooling cycles, centrifugation studies, freeze-thaw cycling, particle size distribution and zeta potential measurements in order to confirm the stability of the optimized nanoemulsions. The in-vitro characterization results showed that the droplets of the prepared formulation were ~100 nm with a zeta potential of ±15 mV. In vitro skin permeation studies were conducted on albino mouse skin. A significant increase in permeability parameters was also observed for the nanoemulsion formulations (P<0.05). The steady-state flux (Jss), enhancement ratio and permeability coefficient (Kp) for the optimized nanoemulsion formulation (FU2, FU1, 1:1 Smix) were found to be 24.21 ± 2.45 μg/cm2/h, 3.28 ± 0.87 and 19.52 ± 1.87 cm/h, respectively, which were significant compared with the conventional gel. The in vitro and in vivo skin deposition studies in rats indicated that the amount of drug deposited in skin from the nanoemulsion (292.45 µg/cm2) was significantly (P<0.05) increased as compared to a conventional 5FU gel (121.42 µg/cm2). The skin irritation study using rat skin showed that the mean irritation index of the nanoemulsion was reduced significantly (P<0.05) as compared with a conventional gel containing 1% 5FU. The results from this study suggest that a water-in-oil nanoemulsion could be safely used to promote skin penetration of 5FU following topical application.
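The permeation parameters reported above follow standard definitions, sketched here with invented numbers; in particular, the donor concentration is an assumption, not a value from the study.

```python
def permeability_coefficient(jss, donor_conc):
    """Kp = Jss / C0: steady-state flux divided by donor concentration."""
    return jss / donor_conc

def enhancement_ratio(jss_test, jss_control):
    """Flux of the test formulation relative to the control (e.g. a gel)."""
    return jss_test / jss_control

# Illustrative numbers (not the study's measured values):
jss = 24.0   # steady-state flux, ug/cm^2/h
c0 = 1000.0  # hypothetical donor concentration, ug/cm^3
print(permeability_coefficient(jss, c0))       # 0.024 (cm/h)
print(round(enhancement_ratio(24.0, 8.0), 2))  # 3.0
```

An enhancement ratio around 3, as in this toy calculation, is of the same order as the value the abstract reports for the optimized formulation versus the conventional gel.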

Keywords: nano emulsion, controlled release, 5 fluorouracil, skin penetration, skin irritation

Procedia PDF Downloads 490
8534 Cost Overruns in Mega Projects: Project Progress Prediction with Probabilistic Methods

Authors: Yasaman Ashrafi, Stephen Kajewski, Annastiina Silvennoinen, Madhav Nepal

Abstract:

Mega projects, whether in construction, urban development or the energy sector, are one of the key drivers that build the foundation of wealth and modern civilization in regions and nations. Such projects require economic justification and substantial capital investment, often derived from individual and corporate investors as well as governments. Cost overruns and time delays in these mega projects demand a new approach to more accurately predict project costs and establish realistic financial plans. The significance of this paper is that the cost efficiency of megaprojects will improve and cost overruns will decrease. This research will assist project managers (PMs) in making timely and appropriate decisions about both the cost and outcomes of ongoing projects. This research, therefore, examines the oil and gas industry, where most mega projects apply the classic methods of the Cost Performance Index (CPI) and Schedule Performance Index (SPI) and rely on project data to forecast cost and time. Because these projects always overrun in cost and time, even in the early phase of the project, the probabilistic methods of Monte Carlo Simulation (MCS) and Bayesian adaptive forecasting were used to predict project cost at completion. The current theoretical and mathematical models which forecast the total expected cost and project completion date during the execution phase of an ongoing project will be evaluated. The Earned Value Management (EVM) method is unable to predict the cost at completion of a project accurately due to the lack of sufficiently detailed project information, especially in the early phase of the project. During the project execution phase, the Bayesian adaptive forecasting method incorporates predictions into the actual performance data from earned value management and revises pre-project cost estimates, making full use of the available information. The outcome of this research is to improve the accuracy of both the cost prediction and the final duration.
This research will provide a warning method to identify when current project performance deviates from planned performance and crates an unacceptable gap between preliminary planning and actual performance. This warning method will support project managers to take corrective actions on time.
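The CPI-based estimate at completion and its Monte Carlo extension described above can be sketched as follows; the project figures, the normal distribution assumed for future CPI, and the percentile summary are illustrative assumptions, not data from the study:

```python
import random

def eac_cpi(ac, ev, bac):
    """Classic EVM estimate at completion: remaining work at current cost efficiency."""
    cpi = ev / ac
    return ac + (bac - ev) / cpi

def eac_monte_carlo(ac, ev, bac, cpi_mean, cpi_sd, n=10_000, seed=1):
    """Distribution of EAC when future cost efficiency is uncertain."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        cpi = max(rng.gauss(cpi_mean, cpi_sd), 0.1)  # guard against near-zero CPI draws
        samples.append(ac + (bac - ev) / cpi)
    samples.sort()
    return {"p10": samples[n // 10], "p50": samples[n // 2], "p90": samples[9 * n // 10]}

# Hypothetical project: $40M spent, $32M of value earned, $100M budget at completion
point = eac_cpi(40.0, 32.0, 100.0)   # deterministic EAC: 40 + 68 / 0.8 = 125.0
dist = eac_monte_carlo(40.0, 32.0, 100.0, cpi_mean=0.8, cpi_sd=0.1)
```

The spread between the p10 and p90 percentiles is one way to express the "warning gap" between planned and likely performance.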

Keywords: cost forecasting, earned value management, project control, project management, risk analysis, simulation

Procedia PDF Downloads 381
8533 Health Trajectory Clustering Using Deep Belief Networks

Authors: Farshid Hajati, Federico Girosi, Shima Ghassempour

Abstract:

We present a Deep Belief Network (DBN) method for clustering health trajectories. A DBN is a deep architecture consisting of a stack of Restricted Boltzmann Machines (RBMs). In a deep architecture, each layer learns more complex features than the layers before it. The proposed method relies on the DBN for clustering without using the back-propagation learning algorithm, and it performs better than a conventional deep neural network owing to the initialization of the connecting weights. We use the Contrastive Divergence (CD) method for training the RBMs, which increases the performance of the network. The performance of the proposed method is evaluated extensively on the Health and Retirement Study (HRS) database. The University of Michigan HRS is a nationally representative longitudinal study that has surveyed more than 27,000 elderly and near-elderly Americans since its inception in 1992. Participants are interviewed every two years, and data are collected on physical and mental health, insurance coverage, financial status, family support systems, labor market status, and retirement planning. The dataset is publicly available, and we use the RAND HRS version L, an easy-to-use, cleaned-up version of the data. The sample size is 268, and the length of each trajectory is 10. The trajectories do not end with a patient's death; each represents 10 interviews of a living patient. Compared with state-of-the-art benchmarks, the experimental results show the effectiveness and superiority of the proposed method in clustering health trajectories.
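A minimal sketch of the CD-1 training step for a single RBM layer, the building block of the DBN stack described above; the toy binary trajectories, layer sizes, and learning rate are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, b, c, v0, rng, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM.
    W: (n_vis, n_hid) weights; b, c: visible/hidden biases; v0: a batch of data."""
    ph0 = sigmoid(v0 @ W + c)                        # P(h=1 | v0), positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float) # sample hidden states
    pv1 = sigmoid(h0 @ W.T + b)                      # one-step reconstruction
    ph1 = sigmoid(pv1 @ W + c)                       # negative phase
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n         # positive minus negative statistics
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return float(((v0 - pv1) ** 2).mean())           # reconstruction error

# Toy stand-in for the HRS data: 268 binary trajectories of length 10,
# where the first and last five entries each follow one latent factor
rng = np.random.default_rng(0)
z = (rng.random((268, 2)) < 0.5).astype(float)
v = np.hstack([np.repeat(z[:, :1], 5, axis=1), np.repeat(z[:, 1:], 5, axis=1)])
W = 0.01 * rng.standard_normal((10, 6))
b, c = np.zeros(10), np.zeros(6)
errors = [cd1_step(W, b, c, v, rng) for _ in range(100)]
```

Stacking such layers (each trained on the hidden activations of the previous one) yields the DBN; the clustering step operates on the top-layer representation.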

Keywords: health trajectory, clustering, deep learning, DBN

Procedia PDF Downloads 355
8532 Extraction of Text Subtitles in Multimedia Systems

Authors: Amarjit Singh

Abstract:

In this paper, a method for extracting text subtitles from large videos is proposed. Video data need to be annotated for many multimedia applications. Text is incorporated into digital video to provide useful information about that video, so the text present in a video must be detected for video understanding and indexing. This is achieved in two steps: the first is text localization, and the second is text verification. The text-detection method can be extended to text recognition, which finds applications in automatic video indexing, video annotation, and content-based video retrieval. The method has been tested on various types of videos.
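The two-step localization/verification pipeline can be illustrated on a synthetic frame; the gradient-density heuristic and the thresholds below are a simplified stand-in, since the paper does not specify its detectors:

```python
import numpy as np

def localize_text_rows(frame, grad_thresh=30, density_thresh=0.25):
    """Step 1 (localization): rows with dense horizontal gradients are text candidates."""
    grad = np.abs(np.diff(frame.astype(int), axis=1))  # horizontal edge strength
    density = (grad > grad_thresh).mean(axis=1)        # edge density per row
    return np.where(density > density_thresh)[0]

def verify_bands(rows, min_height=3):
    """Step 2 (verification): keep only contiguous bands tall enough to be subtitle text."""
    bands, start, prev = [], None, None
    for r in rows:
        if start is None:
            start = prev = r
        elif r == prev + 1:
            prev = r
        else:
            if prev - start + 1 >= min_height:
                bands.append((start, prev))
            start = prev = r
    if start is not None and prev - start + 1 >= min_height:
        bands.append((start, prev))
    return bands

# Synthetic 40x80 grayscale frame: uniform background with a striped "subtitle" band in rows 30-35
frame = np.full((40, 80), 100, dtype=np.uint8)
frame[30:36, ::2] = 255
bands = verify_bands(localize_text_rows(frame))
```

The verified bands would then be cropped and passed to an OCR stage for the recognition extension mentioned above.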

Keywords: video, subtitles, extraction, annotation, frames

Procedia PDF Downloads 590
8531 Use of Six-sigma Concept in Discrete Manufacturing Industry

Authors: Ignatio Madanhire, Charles Mbohwa

Abstract:

Efficiency in manufacturing is critical to raising the value of exports so as to trade gainfully on regional and international markets. Continuous improvement strategies available to manufacturing entities appear to be increasingly popular, but this research study established that the Six Sigma methodology has not enjoyed similar popularity. This work was therefore conducted to investigate the applicability, effectiveness, usefulness, and suitability of the Six Sigma methodology as a competitiveness option for discrete manufacturing entities. Developing a Six Sigma center in the country, with continuous improvement information, would go a long way toward benefiting the entire industry.

Keywords: discrete manufacturing, six-sigma, continuous improvement, efficiency, competitiveness

Procedia PDF Downloads 447
8530 High Piezoelectric and Magnetic Performance Achieved in the Lead-free BiFeO3-BaTiO3 Ceramics by Defect Engineering

Authors: Muhammad Habib, Xuefan Zhou, Lin Tang, Guoliang Xue, Fazli Akram, Dou Zhang

Abstract:

Defect engineering is a well-established approach for tailoring the functional properties of perovskite ceramics, and modern technology places great demand on high multiferroic properties for elevated-temperature applications. In this work, Bi-nonstoichiometric lead-free 0.67Biy-xSmxFeO3-0.33BaTiO3 ceramics (Sm-doped BF-BT with Bi excess, y = 1.03, and Bi deficiency, y = 0.975, for x = 0.00, 0.04 and 0.08) were designed for high-temperature multiferroic performance. Enhanced piezoelectric (d33 ≈ 250 pC/N and d33* ≈ 350 pm/V) and magnetic properties (Mr ≈ 0.25 emu/g) with a high Curie temperature (TC ≈ 465 °C) were obtained in the Bi-deficient pure BF-BT ceramics. With Sm doping (x = 0.04), the TC decreased to 350 °C, while d33* improved significantly to 504 pm/V and 450 pm/V for the Bi-excess and Bi-deficient compositions, respectively. The structural origin of the enhanced piezoelectric strain performance is the soft ferroelectric effect of Sm doping and a reversible phase transition from a short-range relaxor ferroelectric state to long-range order under the applied electric field. However, only a slight change in Mr, to ≈ 0.28 emu/g, occurs with Sm doping in the Bi-deficient ceramics, whereas the Bi-excess ceramics show completely paramagnetic behavior. Hence, the high magnetic properties of the Bi-deficient BF-BT ceramics are mainly attributed to the proposed double-exchange mechanism. We believe this strategy will provide a new perspective for developing lead-free multiferroic ceramics for high-temperature applications.

Keywords: BiFeO3-BaTiO3, lead-free piezoceramics, magnetic properties, defect engineering

Procedia PDF Downloads 120
8529 The Condition Testing of Damaged Plates Using Acoustic Features and Machine Learning

Authors: Kyle Saltmarsh

Abstract:

Acoustic testing possesses many benefits owing to its non-destructive nature and practicality, so there are many scenarios in which acoustic condition testing is highly feasible. A wealth of information is contained within the acoustic and vibration characteristics of structures, allowing the development of meaningful features for classifying their condition. In this paper, methods, results, and discussion are presented on the use of non-destructive acoustic testing coupled with acoustic feature extraction and machine learning techniques for the condition testing of manufactured circular steel plates subjected to varied levels of damage.
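As an illustration of the feature-extraction-plus-classifier pipeline, the sketch below computes two simple acoustic features and classifies with a nearest-centroid rule; the feature set, the synthetic ring-down tones, and the idea that a damaged plate rings at a lower frequency are all assumptions for demonstration, not the paper's actual features or model:

```python
import math

def features(signal):
    """Two simple acoustic features: RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / (len(signal) - 1)
    return (rms, zcr)

def tone(freq, n=1000, fs=8000.0, amp=1.0):
    """Synthetic stand-in for a plate's ring-down response."""
    return [amp * math.sin(2 * math.pi * freq * i / fs) for i in range(n)]

def nearest_centroid(train, x):
    """train: {label: [feature vectors]}; classify x by the closest class mean."""
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    centroids = {
        lbl: tuple(sum(col) / len(col) for col in zip(*vecs))
        for lbl, vecs in train.items()
    }
    return min(centroids, key=lambda lbl: dist(centroids[lbl], x))

# Hypothetical training data: intact plates ring near 440 Hz, damaged plates near 300 Hz
train = {
    "intact": [features(tone(440 + d)) for d in (0, 5, -5)],
    "damaged": [features(tone(300 + d)) for d in (0, 5, -5)],
}
label = nearest_centroid(train, features(tone(435)))
```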

Keywords: plates, deformation, acoustic features, machine learning

Procedia PDF Downloads 323
8528 An Inverse Approach for Determining Creep Properties from a Miniature Thin Plate Specimen under Bending

Authors: Yang Zheng, Wei Sun

Abstract:

This paper describes a new approach that can be used to relate experimental creep deformation data obtained from a miniaturized thin-plate bending specimen test to the corresponding uniaxial data, based on an inverse application of the reference stress method. The geometry of the thin plate is fully defined by the span of the support, l, the width, b, and the thickness, d. First, analytical solutions for the steady-state, load-line creep deformation rate of thin plates obeying a Norton power law were obtained under plane stress (b → 0) and plane strain (b → ∞) conditions; these show that the load-line deformation rate under plane stress is much higher than under plane strain. Since no analytical solution is available for plates with arbitrary b-values, finite element (FE) analyses were used to obtain those solutions. Based on the FE results for various b/l ratios and creep exponents, n, together with the analytical plane stress and plane strain solutions, approximate numerical solutions for the deformation rate were obtained by curve fitting. Using these solutions, the reference stress method is employed to establish conversion relationships between the applied load and the equivalent uniaxial stress, and between the creep deformations of the thin plate and the equivalent uniaxial creep strains. Finally, the accuracy of the empirical solution was assessed using a set of “theoretical” experimental data.
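The conversion from applied load to an equivalent uniaxial stress via a reference stress can be sketched as below; the geometry factor eta and all material constants are hypothetical placeholders for the FE-fitted values the paper derives, and the bending-stress form of the reference stress is an assumption for illustration:

```python
def norton_rate(sigma, B, n):
    """Uniaxial Norton power-law creep: strain rate = B * sigma**n."""
    return B * sigma ** n

def reference_stress(load, eta, b, d, span):
    """Reference stress for the bending specimen, sigma_ref = eta * P * span / (b * d**2).
    eta is a geometry-dependent factor that would come from the FE curve fits
    (the value used below is hypothetical)."""
    return eta * load * span / (b * d ** 2)

# Hypothetical specimen and material constants (illustrative only)
B, n = 1e-20, 5.0                       # Norton constants
P, span, b, d = 50.0, 10.0, 5.0, 0.5    # load [N] and geometry [mm]
sigma_ref = reference_stress(P, eta=0.3, b=b, d=d, span=span)  # stress in MPa
uniaxial_rate = norton_rate(sigma_ref, B, n)                   # equivalent uniaxial creep rate
```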

Keywords: bending, creep, thin plate, materials engineering

Procedia PDF Downloads 463
8527 Production Increase of C-Central Wells Baher Essalm-Libya

Authors: Emed Krekshi, Walid Ben Husein

Abstract:

The Bahr Essalam gas-condensate field is located off the Libyan coast and is currently produced by Mellitah Oil and Gas (MOG). Gas and condensate are produced from the Bahr Essalam reservoir through a mixture of platform and subsea wells, with the subsea wells gathered at the western manifolds and delivered to the Sabratha platform via a 22-inch pipeline. Gas is gathered and dehydrated on the Sabratha platform and then delivered to the Mellitah gas plant via an existing 36-inch gas export pipeline; the condensate separated on the platform is delivered to the Mellitah gas plant via an existing 10-inch export pipeline. The Bahr Essalam Phase II project includes two production wells (CC16 and CC17) at C-Central A, connected to the Sabratha platform via a new 10.9 km long 10”/14” production pipeline. Production rates from CC16 and CC17 have exceeded the maximum planned rate of 40 MMSCFD per well. A hydrothermal analysis was conducted to review and verify the input data, focusing on the variation of flowing wellhead pressure as a function of flow rate, and to review the available input data against the previous design inputs to determine the extent of change. The steady-state and transient simulations performed with OLGA yielded coherent results and confirmed the possibility of achieving flow rates of up to 60 MMSCFD per well without exceeding the design temperatures, pressures, and velocities.

Keywords: Bahr Essalam, Mellitah Oil and Gas, production flow rates, steady and transient

Procedia PDF Downloads 40
8526 Support of Knowledge Sharing in Manufacturing Companies: A Case Study

Authors: Zuzana Crhová, Karel Kolman, Drahomíra Pavelková

Abstract:

Knowledge is considered an important asset that can help organizations create competitive advantage. Taking care of these assets is all the more important in today's turbulent business environment, since knowledge can facilitate adaptation to constant change. The aim of this paper is to describe how knowledge sharing can be supported in manufacturing companies. Case study and grounded theory methods were used to present information gained from semi-structured interviews. The results show that knowledge sharing is supported in very similar ways across the respondent companies.

Keywords: case study, human resource management, knowledge, knowledge sharing

Procedia PDF Downloads 435
8525 Factor Influencing the Certification to ISO 9000:2008 among SME in Malaysia

Authors: Dolhadi Bin Zainudin

Abstract:

The study attempts to predict the relationship between the influencing factors in the adoption of ISO 9000:2008 and to identify how these factors play a role in achieving the ISO 9000 standard. A survey using a structured questionnaire was employed, with a total of 255 respondents from 255 small and medium enterprises participating. With regard to the influencing factors, a discriminant analysis was conducted; the results showed that three of nine critical success factors differ significantly between ISO 9000:2008-certified and non-certified companies: communication for quality, information and analysis, and organizational culture.

Keywords: ISO 9000, quality management, factors, small and medium enterprise, Malaysia, influencing factors

Procedia PDF Downloads 324
8524 Value of Willingness to Pay for a Quality-Adjusted Life Years Gained in Iran; A Modified Chained-Approach

Authors: Seyedeh-Fariba Jahanbin, Hasan Yusefzadeh, Bahram Nabilou, Cyrus Alinia

Abstract:

Background: In the absence of an established willingness to pay per additional Quality-Adjusted Life Year gained based on the preferences of Iran’s general public, the cost-effectiveness of health system interventions is unclear, making it challenging to apply economic evaluation to priority setting for health resources. Methods: We measured this cost-effectiveness threshold with the participation of 2,854 individuals from five provinces, each representing an income quintile, using a modified Time Trade-Off-based Chained Approach. In this online empirical survey, to elicit health utility values, participants were randomly assigned to one of two health scenarios, green (21121) and yellow (22222), designed from the previously validated EQ-5D-3L questionnaire. Results: Across the two health-state versions, mean values for one QALY gained (rounded) ranged from $6,740-$7,400 and $6,480-$7,120 for the aggregate and trimmed models, respectively, equivalent to 1.18-1.35 times GDP per capita. Log-linear multivariate OLS regression analysis confirmed that respondents were more likely to pay if their income, disutility, and education level were higher than those of their counterparts. Conclusions: In the health system of Iran, any intervention with an incremental cost-effectiveness ratio equal to or less than 7,402.12 USD should be considered cost-effective.
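The threshold decision rule and the log-linear OLS specification mentioned above can be sketched as follows; the regression data are synthetic and the coefficient values are illustrative, not the study's estimates:

```python
import numpy as np

def cost_effective(icer_usd, threshold=7402.12):
    """Decision rule from the study: interventions at or below the WTP threshold pass."""
    return icer_usd <= threshold

# Log-linear OLS sketch: ln(WTP) regressed on income, disutility, and education
# (all data below are synthetic; true effects are set positive, as the study found)
rng = np.random.default_rng(42)
n = 200
income = rng.uniform(1, 10, n)
disutility = rng.uniform(0.1, 0.9, n)
education = rng.integers(0, 4, n).astype(float)
ln_wtp = 7.0 + 0.15 * income + 0.8 * disutility + 0.1 * education + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), income, disutility, education])
beta, *_ = np.linalg.lstsq(X, ln_wtp, rcond=None)
# beta[1:] recovers the positive effects of income, disutility, and education on ln(WTP)
```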

Keywords: willingness to pay, QALY, chained approach, cost-effectiveness threshold, Iran

Procedia PDF Downloads 74
8523 Web Development in Information Technology with Javascript, Machine Learning and Artificial Intelligence

Authors: Abdul Basit Kiani, Maryam Kiani

Abstract:

Web developers now have the tools necessary to create online apps that are not only reliable but also highly interactive, thanks to the introduction of JavaScript frameworks and APIs. The objective here is to give a broad overview of recent advances in the area. The fusion of machine learning (ML) and artificial intelligence (AI) has expanded the possibilities for web development: modern websites now build in chatbots, intelligent recommendation systems, and personalization algorithms. In the rapidly evolving landscape of modern websites, user engagement and personalization have become key factors for success, and websites now incorporate a range of innovative technologies to meet these demands. Chatbots provide users with instant assistance and support, enhancing the overall browsing experience; these intelligent bots understand natural language and can answer frequently asked questions, offer product recommendations, and even help with troubleshooting. Recommendation systems analyze user behavior, preferences, and historical data to suggest relevant products, articles, or services tailored to each user's interests, saving users time and increasing conversions and customer satisfaction. Personalization algorithms leverage user preferences, browsing history, and demographic information to dynamically adjust a website's layout, content, and functionality to individual needs, which enhances engagement and boosts conversion rates.
In summary, the integration of chatbots, recommendation systems, and personalization algorithms into modern websites is transforming how users interact with online platforms: these advanced technologies streamline user experiences and contribute to increased customer satisfaction, improved conversions, and overall website success.

Keywords: Javascript, machine learning, artificial intelligence, web development

Procedia PDF Downloads 65
8522 Solution of Logistics Center Selection Problem Using the Axiomatic Design Method

Authors: Fulya Zaralı, Harun Resit Yazgan

Abstract:

Logistics centers are areas in which all national and international logistics, and activities related to logistics, can be carried out by various businesses. They play a key role in joining the transport stream and the operations of the transport system, so where these centers are positioned is important for their effectiveness, efficiency, and expected performance. In this study, the location selection problem for positioning a logistics center is discussed. Alternative centers are evaluated according to certain criteria, and the most appropriate center is identified using the axiomatic design method.
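One common formulation of axiomatic design's information axiom scores each alternative by its total information content, I = log2(1/p), and selects the alternative with the lowest total; the sites and success probabilities below are hypothetical, and the paper's actual criteria may differ:

```python
import math

def information_content(p):
    """Axiomatic design's information axiom: I = log2(1/p) for success probability p."""
    return math.log2(1.0 / p)

def best_alternative(alternatives):
    """Choose the site with the lowest total information content, i.e. the
    highest joint probability of meeting all functional requirements."""
    totals = {
        name: sum(information_content(p) for p in probs)
        for name, probs in alternatives.items()
    }
    return min(totals, key=totals.get), totals

# Hypothetical sites with success probabilities for three criteria
# (e.g. transport access, land cost, expansion capacity)
sites = {
    "A": [0.9, 0.7, 0.8],
    "B": [0.6, 0.9, 0.9],
    "C": [0.8, 0.8, 0.5],
}
winner, totals = best_alternative(sites)
```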

Keywords: axiomatic design, logistic center, facility location, information systems

Procedia PDF Downloads 338
8521 Crime Victim Support Services in Bangladesh: An Analysis

Authors: Mohammad Shahjahan, Md. Monoarul Haque

Abstract:

In this research, information and data were collected from both direct and indirect sources, and numerical, qualitative, and participatory analysis methods were followed. There were two principal sources of information and data: first, data provided by service recipients (300 women and child victims) at the Victim Support Centre and by the policemen, executives, and staff providing services (60 persons); second, data collected from specialists, criminologists, and sociologists involved in victim support services through consultative interviews, KIIs, case studies, FGDs, etc. The initial data collection was completed with the help of questionnaires, following strategic variations and guidelines. The main objective of this research was to determine whether the services provided to victims (facilities, treatment/medication, and rehabilitation by different government and non-government organizations) were genuinely adequate. At the same time, the socio-economic background and demographic characteristics of the victims were also revealed through this research. The results show that although the number of victims has gradually increased due to socio-economic, political, and cultural realities in Bangladesh, the number of victim support centers has not increased as expected. Awareness among victims of the effectiveness of the eight centers working in this area is also not up to the mark: two-thirds of the victims who came for services were not aware of victim support services at all before receiving them. Most of those who finally came under the services of the Victim Support Center through various means received shelter (15.5%), medical services (13.32%), counseling (13.10%), and legal aid (12.66%); the opportunity to stay in secure custody and psycho-physical services were also notable.
Usually, women and children from relatively poor and marginalized families come to the Victim Support Center for services. Among women, young unmarried women are the most frequent victims of crime, and women and children employed as domestic workers are disproportionately affected. Victimization has a number of serious negative impacts on victims' lives, among them deprivation of employment opportunities (26.62%), psychosomatic disorders (20.27%), and sexually transmitted diseases (13.92%). It appears urgent to enact distinct legislation, increase the number of Victim Support Centers, expand the area and purview of services, and take initiatives to raise public awareness and create a mass movement.

Keywords: crime, victim, support, Bangladesh

Procedia PDF Downloads 76
8520 Climate Changes Impact on Artificial Wetlands

Authors: Carla Idely Palencia-Aguilar

Abstract:

Artificial wetlands play an important role in the Guasca Municipality in Colombia, not only because they are used by agroindustry but also because more than 45 bird species were found there, some of them endemic or migratory. Remote sensing (Aster and Modis images for different time periods) was used to determine changes in the area occupied by water in the artificial wetlands. Evapotranspiration was determined by three methods: the Surface Energy Balance System (SEBS, Su) algorithm, the Surface Energy Balance (SEBAL, Bastiaanssen) algorithm, and FAO potential evapotranspiration. Empirical equations were also developed relating the Normalized Difference Vegetation Index (NDVI) to net radiation, ambient temperature, and rain, with an R2 of 0.83. Groundwater level fluctuations were studied on a daily basis: data from a piezometer placed next to the wetland were fitted to rainfall changes (recorded at two weather stations located near the wetlands) by means of multiple regression and time series analysis, and the R2 between calculated and measured values was higher than 0.98. Information from nearby weather stations supported ordinary kriging, and a Digital Elevation Model (DEM) was developed using PCI software. Standard models describing spatial variation (exponential, spherical, circular, Gaussian, linear) were tested. Ordinary cokriging of the height and rain variables was also tested to determine whether the accuracy of the interpolation would increase. The results showed no significant differences: the spherical function fitted to the rain samples after ordinary kriging gave a mean of 58.06 and a standard deviation of 18.06, while cokriging (a spherical function for rain, a power function for height, and a spherical function for the cross-variable rain-height) gave a mean of 57.58 and a standard deviation of 18.36.
Threats of eutrophication were also studied, given the lack of awareness among neighbours and governmental neglect. Water quality was monitored over the years, with different parameters studied to determine the chemical characteristics of the water; in addition, 600 pesticides were screened by gas and liquid chromatography. The results showed that coliforms, nitrogen, phosphorus, and prochloraz were the most significant contaminants.
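Ordinary kriging with a spherical variogram, as applied to the rain data, can be sketched as follows; the gauge locations, rainfall values, and variogram parameters are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

def spherical(h, nugget, sill, rang):
    """Spherical variogram model gamma(h) with range `rang`."""
    h = np.asarray(h, dtype=float)
    g = np.where(
        h < rang,
        nugget + (sill - nugget) * (1.5 * h / rang - 0.5 * (h / rang) ** 3),
        sill,
    )
    return np.where(h == 0, 0.0, g)   # gamma(0) = 0 by definition

def ordinary_kriging(xy, z, target, nugget=0.0, sill=1.0, rang=5.0):
    """Ordinary kriging estimate at `target` from samples (xy, z); a Lagrange
    multiplier enforces that the weights sum to 1 (unbiasedness)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d, nugget, sill, rang)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(xy - target, axis=1), nugget, sill, rang)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z), w

# Four hypothetical rain gauges on a square around the target point
xy = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
z = np.array([50.0, 58.0, 54.0, 62.0])
est, w = ordinary_kriging(xy, z, target=np.array([2.0, 2.0]))
```

By symmetry the four weights are equal here, so the estimate is the sample mean; with irregular gauge layouts the weights differ and the variogram model matters.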

Keywords: DEM, evapotranspiration, geostatistics, NDVI

Procedia PDF Downloads 109
8519 Community Structure Detection in Networks Based on Bee Colony

Authors: Bilal Saoud

Abstract:

In this paper, we propose a new method to find the community structure in networks, based on a bee colony algorithm and the maximization of modularity. We use a bee colony algorithm to find a first community structure with a good modularity value; to improve this community structure, we then merge communities until we reach a community structure of high modularity. We provide a general framework for implementing our approach. We tested our method on computer-generated and real-world networks, in comparison with well-known community detection methods, and the results obtained show the effectiveness of our proposal.
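A minimal sketch of the modularity-maximizing merge phase described above; the initial singleton partition below stands in for the output of the bee-colony phase, which is not reproduced here:

```python
def modularity(adj, communities):
    """Newman's modularity Q = sum_c (e_c/m - (d_c/2m)^2) for an undirected graph.
    adj: dict node -> set of neighbours; communities: list of sets of nodes."""
    m = sum(len(nbrs) for nbrs in adj.values()) / 2   # number of edges
    q = 0.0
    for comm in communities:
        internal = sum(1 for u in comm for v in adj[u] if v in comm) / 2
        degree = sum(len(adj[u]) for u in comm)
        q += internal / m - (degree / (2 * m)) ** 2
    return q

def greedy_merge(adj, communities):
    """Repeatedly join the pair of communities whose merger most increases Q,
    stopping when no merge improves modularity."""
    communities = [set(c) for c in communities]
    improved = True
    while improved and len(communities) > 1:
        improved = False
        base = modularity(adj, communities)
        best = None
        for i in range(len(communities)):
            for j in range(i + 1, len(communities)):
                trial = [c for k, c in enumerate(communities) if k not in (i, j)]
                trial.append(communities[i] | communities[j])
                gain = modularity(adj, trial) - base
                if gain > 1e-12 and (best is None or gain > best[0]):
                    best = (gain, i, j)
        if best:
            _, i, j = best
            merged = communities[i] | communities[j]
            communities = [c for k, c in enumerate(communities) if k not in (i, j)]
            communities.append(merged)
            improved = True
    return communities

# Two triangles joined by a single bridge edge (2-3)
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
parts = greedy_merge(adj, [{n} for n in adj])  # singletons stand in for the bee-colony output
```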

Keywords: bee colony, networks, modularity, normalized mutual information

Procedia PDF Downloads 387