Search results for: non-real-time data
1387 Deterioration Assessment Models for Water Pipelines
Authors: L. Parvizsedghy, I. Gkountis, A. Senouci, T. Zayed, M. Alsharqawi, H. El Chanati, M. El-Abbasy, F. Mosleh
Abstract:
The aging and deterioration of water pipelines in cities worldwide result in more frequent water main breaks, water service disruptions, and flooding damage. There is, therefore, an urgent need to undertake proper maintenance procedures to avoid breaks and disastrous failures. However, due to budget limitations, maintenance of water pipeline networks needs to be prioritized through efficient deterioration assessment models. Previous studies focused on developing structural or physical deterioration assessment models, which require expensive inspection data. This paper, in contrast, aims to develop deterioration assessment models for water pipelines using statistical techniques. Several deterioration models were developed based on pipeline size, material type, and soil type using linear regression analysis. The categorical nature of some variables affecting pipeline deterioration was accommodated by developing several categorical models. The developed models were validated with an average validity percentage greater than 95%. Moreover, sensitivity analysis carried out across the different classifications showed that pipe age is more influential than the other factors. The developed models will help water municipalities and asset managers assess the condition of their pipes and prioritize them for maintenance and inspection.
Keywords: Water pipelines, deterioration assessment models, regression analysis.
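As a minimal sketch of how such a regression-based deterioration model with categorical predictors could be fitted, the snippet below uses hypothetical inspection records; the column names, condition scores, and coefficients are illustrative only and are not taken from the paper.

```python
# Minimal sketch (not the authors' model): fitting a pipeline deterioration
# regression with categorical predictors, on hypothetical data.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical inspection records: condition score vs. age, size, material, soil.
df = pd.DataFrame({
    "condition": [9.0, 7.5, 6.0, 4.5, 8.0, 5.5, 3.0, 7.0],
    "age":       [5,   15,  30,  50,  10,  35,  60,  20],
    "diameter":  [150, 200, 150, 300, 200, 150, 300, 200],   # mm
    "material":  ["PVC", "PVC", "CI", "CI", "DI", "CI", "CI", "DI"],
    "soil":      ["sand", "clay", "clay", "sand", "sand", "clay", "clay", "sand"],
})

# Categorical variables enter through dummy coding, as in a categorical model.
model = smf.ols("condition ~ age + diameter + C(material) + C(soil)", data=df).fit()
print(model.params)        # fitted coefficients
print(model.rsquared)      # goodness of fit
```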
1386 Study on the Impact of Size and Position of the Shear Field in Determining the Shear Modulus of Glulam Beam Using Photogrammetry Approach
Authors: Niaz Gharavi, Hexin Zhang
Abstract:
The shear modulus of a timber beam can be determined using the torsion test or the shear field test method. The shear field test method is based on measuring the shear distortion of the beam in the zone of constant transverse load in the standardized four-point bending test. The current code of practice advises using two metallic arms acting as an instrument to measure the diagonal displacement of the constructed square. The size and position of this square might influence the determination of the shear modulus. This study aimed to investigate the effects of the size and the position of the square in the shear field test method. A binocular stereo vision system was employed to determine the 3D displacement of a grid of target points. Six glue-laminated beams were produced and tested. Analysis of Variance (ANOVA) was performed on the acquired data to evaluate the significance of the size effect and the position effect of the square. The results show that the size of the square has a noticeable influence on the value of the shear modulus, while the position of the square within the area of constant shear force does not affect the measured mean shear modulus.
Keywords: Shear field test method, structural-sized test, shear modulus of Glulam beam, photogrammetry approach.
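The one-way ANOVA used for the significance test can be sketched as follows; the shear-modulus values and group labels are hypothetical and only illustrate the statistical check described in the abstract.

```python
# Illustrative one-way ANOVA (hypothetical shear-modulus values, MPa) to test
# whether the size of the constructing square affects the measured modulus.
from scipy import stats

g_small  = [610, 625, 598, 640, 615, 630]   # small square
g_medium = [655, 640, 662, 670, 648, 658]   # medium square
g_large  = [700, 690, 712, 705, 695, 708]   # large square

f_stat, p_value = stats.f_oneway(g_small, g_medium, g_large)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (< 0.05) would indicate a significant size effect,
# consistent with the paper's conclusion that square size matters.
```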
1385 Traffic Signal Design and Simulation for Vulnerable Road Users Safety and Bus Preemption
Authors: Shih-Ching Lo, Hsieh-Chu Huang
Abstract:
Most pedestrian-car accidents at signalized intersections occur because pedestrians cannot cross the intersection safely within the green light. From the pedestrian's viewpoint, there may be two reasons. The first is that some pedestrians, such as the elderly, cannot speed up to cross the intersection in time. The other is that pedestrians do not sense that the signal phase is about to change and that they are about to lose their right-of-way. Developing signal logic to protect pedestrians who are crossing an intersection is the first purpose of this study. Another purpose is to improve the reliability and reduce the delay of public transportation service; therefore, bus preemption is also considered in the designed signal logic. In this study, traffic data from the intersection of Chong-Qing North Road and Min-Zu West Road, Taipei, Taiwan, are employed to calibrate and validate the signal logic by simulation. VISSIM 5.20, a microscopic traffic simulation software package, is used to simulate the signal logic. The simulated results show that the signal logic presented in this study successfully protects pedestrians crossing the intersection, and the bus preemption design can reduce the average delay. However, the pedestrian safety and bus preemption signal logic has a large influence on the average delay of cars. Thus, whether to apply this signal logic to an isolated intersection should be evaluated carefully.
Keywords: Vulnerable road user, bus preemption, signal design.
1384 Stochastic Modeling for Parameters of Modified Car-Following Model in Area-Based Traffic Flow
Authors: N. C. Sarkar, A. Bhaskar, Z. Zheng
Abstract:
The driving behavior in area-based (i.e., non-lane-based) traffic is induced by the presence of other individuals in the choice space within the driver's visual perception area. The driving behavior of a subject vehicle is constrained by potential leaders, and the leaders change frequently over time. This paper determines stochastic models for the parameters of a modified intelligent driver model (MIDM) in area-based traffic (as found in developing countries). Parametric and non-parametric distributions are presented to fit the parameters of the MIDM. The goodness of fit for each parameter is measured in two ways: graphically and statistically. The quantile-quantile (Q-Q) plot is used as a graphical check of a theoretical distribution for a parameter, and the Kolmogorov-Smirnov (K-S) test is used as a statistical measure of the fit between a parameter and a theoretical distribution. The distributions are fitted to a set of MIDM parameters estimated from real vehicle trajectory data from India. The fit of each parameter to a stochastic model is well represented. The results support the applicability of the proposed modeling of MIDM parameters in area-based traffic flow simulation.
Keywords: Area-based traffic, car-following model, micro-simulation, stochastic modeling.
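The fitting-and-checking workflow described above (fit a candidate distribution, then verify it graphically with a Q-Q plot and statistically with a K-S test) can be sketched as below; the parameter sample and the lognormal candidate are synthetic assumptions, not the paper's data.

```python
# Sketch: fitting a candidate distribution to an estimated MIDM parameter sample,
# then checking fit with a Q-Q plot and a Kolmogorov-Smirnov test (data are synthetic).
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
desired_gap = rng.lognormal(mean=0.5, sigma=0.3, size=500)   # synthetic parameter sample

# Fit a lognormal distribution to the sample.
shape, loc, scale = stats.lognorm.fit(desired_gap, floc=0)

# Statistical check: one-sample K-S test against the fitted distribution.
ks_stat, p_value = stats.kstest(desired_gap, "lognorm", args=(shape, loc, scale))
print(f"K-S statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")

# Graphical check: Q-Q plot against the fitted lognormal.
stats.probplot(desired_gap, dist=stats.lognorm, sparams=(shape, loc, scale), plot=plt)
plt.title("Q-Q plot of the fitted lognormal distribution")
plt.show()
```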
1383 CPT Pore Water Pressure Correlations with PDA to Identify Pile Drivability Problem
Authors: Fauzi Jarushi, Paul Cosentino, Edward Kalajian, Hadeel Dekhn
Abstract:
At certain depths during large-diameter displacement pile driving, rebound well over 0.25 inches was experienced, followed by a small permanent set during each hammer blow. High pile rebound (HPR) soils may stop the pile driving and result in limited pile capacity. In some cases, rebound leads to pile damage, delays the construction project, and requires redesign of the foundations. HPR was evaluated at seven Florida sites during driving of square precast, prestressed concrete piles into saturated, fine silty to clayey sands and sandy clays. Pile Driving Analyzer (PDA) deflection-versus-time data recorded during installation were used to develop correlations between cone penetrometer (CPT) pore-water pressures, pile displacements, and rebound. At five sites where piles experienced excessive HPR with minimal set, the pore pressure yielded very high positive values of greater than 20 tsf. At the site where the pile rebounded but achieved an acceptable permanent set, the measured pore pressure ranged between 5 and 20 tsf, and it was below 5 tsf at the site where no rebound was noticed. In summary, direct correlations between CPTu pore pressure and rebound were produced, allowing identification of soils that produce HPR.
Keywords: CPTu, pore water pressure, pile rebound.
1382 Rule Based Architecture for Collaborative Multidisciplinary Aircraft Design Optimisation
Authors: Nickolay Jelev, Andy Keane, Carren Holden, András Sóbester
Abstract:
In aircraft design, the jump from the conceptual to preliminary design stage introduces a level of complexity which cannot be realistically handled by a single optimiser, be that a human (chief engineer) or an algorithm. The design process is often partitioned along disciplinary lines, with each discipline given a level of autonomy. This introduces a number of challenges including, but not limited to: coupling of design variables; coordinating disciplinary teams; handling of large amounts of analysis data; reaching an acceptable design within time constraints. A number of classical Multidisciplinary Design Optimisation (MDO) architectures exist in academia specifically designed to address these challenges. Their limited use in the industrial aircraft design process has inspired the authors of this paper to develop an alternative strategy based on well established ideas from Decision Support Systems. The proposed rule based architecture sacrifices possibly elusive guarantees of convergence for an attractive return in simplicity. The method is demonstrated on analytical and aircraft design test cases and its performance is compared to a number of classical distributed MDO architectures.
Keywords: Multidisciplinary design optimisation, rule based architecture, aircraft design, decision support system.
1381 Comparative Study of Evolutionary Model and Clustering Methods in Circuit Partitioning Pertaining to VLSI Design
Authors: K. A. Sumitra Devi, N. P. Banashree, Annamma Abraham
Abstract:
Partitioning is a critical area of VLSI CAD. In order to build complex digital logic circuits, it is often essential to sub-divide multi-million-transistor designs into manageable pieces. This paper looks at various partitioning techniques in VLSI CAD targeted at different applications. We propose an evolutionary time-series model and a statistical glitch prediction system that uses a neural network with global feature selection based on clustering methods for partitioning a circuit. For the evolutionary time-series model, we made use of genetic, memetic, and neuro-memetic techniques. Our work on clustering focused on the K-means and EM methodologies. A comparative study is provided for all techniques to solve the problem of circuit partitioning pertaining to VLSI design. The performance of all approaches is compared using benchmark data from the MCNC standard cell placement benchmark netlists. Analysis of the experimental results showed that the neuro-memetic model achieves greater performance than the other models in recognizing sub-circuits with a minimum number of interconnections between them.
Keywords: VLSI, circuit partitioning, memetic algorithm, genetic algorithm.
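The two clustering routes named above (K-means and EM) can be illustrated as follows; the per-cell feature vectors and cluster counts are hypothetical and are not the authors' exact pipeline.

```python
# Sketch of the two clustering routes mentioned above (K-means and EM), applied to
# hypothetical per-cell feature vectors of a netlist; not the authors' exact pipeline.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Hypothetical features per cell: (x placement, y placement, connectivity degree).
cells = np.vstack([
    rng.normal([0.2, 0.2, 5.0], 0.1, size=(50, 3)),
    rng.normal([0.8, 0.7, 9.0], 0.1, size=(50, 3)),
])

# K-means partitioning into two sub-circuits.
kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cells)

# EM (Gaussian mixture) partitioning into two sub-circuits.
em_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(cells)

print("K-means cluster sizes:", np.bincount(kmeans_labels))
print("EM cluster sizes:     ", np.bincount(em_labels))
```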
1380 The Enhancement of Training of Military Pilots Using Psychophysiological Methods
Authors: G. Kloudova, M. Stehlik
Abstract:
Optimal human performance is a key goal in the professional setting of military pilots, which is a highly challenging environment. The aviation environment requires substantial cognitive effort and is rich in potential stressors. Therefore, it is important to analyze variables such as mental workload to ensure safe conditions. Pilot mental workload can be measured with several tools, but most of them are very subjective. This paper details research conducted with military pilots using psychophysiological methods such as electroencephalography (EEG) and heart rate (HR) monitoring. The data were measured in a simulator as well as under real flight conditions. All of the pilots were exposed to highly demanding flight tasks and showed large individual differences in response. On that basis, an individual pattern was created for each pilot from different EEG features and heart rate variations. It was then possible to distinguish the most difficult flight tasks for each pilot, which should be trained more extensively. For training purposes, an application was developed for the instructors to decide which of the specific tasks to focus on during follow-up training. This complex system can help instructors detect the mentally demanding parts of the flight and enhance the training of military pilots to achieve optimal performance.
Keywords: Cognitive effort, human performance, military pilots, psychophysiological methods.
1379 An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing
Authors: Raksha Sharma, Vishnu Kant Soni, Manoj Kumar Mishra, Prachet Bhuyan, Utpal Chandra Dey
Abstract:
Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage, and network resources operating across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource selection and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which suits large-scale applications, and the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in the grid. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency, and load balancing ability of the grid.
Keywords: Agent, grid computing, job grouping, Max Heap Tree (MHT), resource scheduling.
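A minimal sketch of the two ideas combined above (selecting the most capable resource from a max-heap and grouping FCFS-ordered jobs to fill its capacity) is given below; the resource MIPS values, job lengths, and scheduling granularity are assumed for illustration and are not the paper's implementation.

```python
# Minimal sketch (not the paper's implementation): selecting the most capable
# resource from a max-heap and grouping FCFS-ordered jobs to match its capacity.
import heapq

resources = [("R1", 400), ("R2", 900), ("R3", 650)]        # (name, MIPS)
jobs = [120, 300, 80, 250, 500, 90, 410]                    # job lengths (MI), FCFS order

# Build a max-heap keyed on processing power (heapq is a min-heap, so negate).
heap = [(-mips, name) for name, mips in resources]
heapq.heapify(heap)
mips, best = -heap[0][0], heap[0][1]                         # root = most powerful resource

granularity = 2                                              # scheduling interval (s), assumed
capacity = mips * granularity                                # MI the root can process per interval

# FCFS job grouping: pack jobs in arrival order until the group fills the capacity.
groups, current, used = [], [], 0
for length in jobs:
    if used + length > capacity and current:
        groups.append(current)
        current, used = [], 0
    current.append(length)
    used += length
if current:
    groups.append(current)

print(f"Jobs grouped for {best} ({mips} MIPS):", groups)
```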
1378 Discussion about Frequent Adjustment of Urban Master Planning in China: A Case Study of Changshou District, Chongqing City
Authors: Sun Ailu, Zhao Wanmin
Abstract:
Since the reform and opening up, the urbanization process of China has entered a period of rapid development. In recent years, the authors participated in several urban master planning projects in China and observed that rapidly urbanizing areas of China experience frequent adjustment of their urban master plans. This phenomenon is not a natural part of urbanization; it may be caused by the differing roles played by governments at different levels. Through investigation, data comparison, and case study, this paper explores why rapidly urbanizing areas experience frequent adjustment of master planning and offers some solution strategies. First, taking Changshou District of Chongqing City as an example, the paper introduces the phenomenon of frequent plan adjustment in China. It then discusses the distinct roles of the national, provincial, and local governments of China in this process. Finally, it puts forward preliminary solution strategies for such areas from the aspects of land use, intergovernmental cooperation, and so on.
Keywords: Urban master planning, frequent adjustment, urbanization development, problems and strategies, China.
1377 The Profitability Management Mechanism of Leather Industry-Based on the Activity-Based Benefit Approach
Authors: Mei-Fang Wu, Shu-Li Wang, Tsung-Yueh Lu, Feng-Tsung Cheng
Abstract:
Strengthening core competitiveness is the main goal of enterprises in a fiercely competitive environment. Accurate cost information is a great help to managers in shaping operating strategies. This paper establishes a profitability management mechanism that applies the Activity-Based Benefit Approach (ABBA) to determine the profitability of each customer in the market. ABBA provides financial and non-financial information for operations and also indicates which resources are consumed in the operational process. The customer profit management model shows the level of profitability each customer brings to the company. The empirical data were gathered from a case company operating in the leather industry in Taiwan. The research findings indicate that 30% of customers create little profit for the company because they ask for sales discounts of over 5%; these customers request discounts because of color differences in the leather products. This paper provides a customer profitability evaluation mechanism to help enterprises greatly improve operating effectiveness and promote operational activity efficiency and overall profitability.
Keywords: Activity-based benefit approach, customer profit analysis, leather industry, profitability management mechanism.
1376 Detection of Cyberattacks on the Metaverse Based on First-Order Logic
Authors: Sulaiman Al Amro
Abstract:
There are currently considerable challenges concerning data security and privacy, particularly in relation to modern technologies. This includes the virtual world known as the Metaverse, which consists of a virtual space that integrates various technologies and is therefore susceptible to cyber threats such as malware, phishing, and identity theft. This has led recent studies to propose the development of Metaverse forensic frameworks and the integration of advanced technologies, including machine learning for intrusion detection and security. In this context, the application of first-order logic offers a formal and systematic approach to defining the conditions of cyberattacks, thereby contributing to the development of effective detection mechanisms. In addition, formalizing the rules and patterns of cyber threats has the potential to enhance the overall security posture of the Metaverse and thus the integrity and safety of this virtual environment. The current paper focuses on the primary actions employed by avatars in potential attacks, using Interval Temporal Logic (ITL) and behavior-based detection to detect an avatar's abnormal activities within the Metaverse. The proposed framework attained an accuracy of 92.307%, and the experimental results demonstrate the efficacy of ITL, including its superior performance in addressing the threats posed by avatars within the Metaverse domain.
Keywords: Cyberattacks, detection, first-order logic, Metaverse, privacy, security.
1375 Modeling Oxygen-transfer by Multiple Plunging Jets using Support Vector Machines and Gaussian Process Regression Techniques
Authors: Surinder Deswal
Abstract:
The paper investigates the potential of support vector machines and Gaussian process regression approaches to model the oxygen-transfer capacity of multiple plunging jet oxygenation systems from experimental data. The results suggest the utility of both modeling techniques in predicting the overall volumetric oxygen transfer coefficient (KLa) from the operational parameters of a multiple plunging jets oxygenation system. Correlation coefficient, root mean square error, and coefficient of determination values of 0.971, 0.002, and 0.945, respectively, were achieved by the support vector machine, compared with 0.960, 0.002, and 0.920, respectively, for Gaussian process regression. Further, the performance of both regression approaches in predicting the overall volumetric oxygen transfer coefficient was compared with an empirical relationship for multiple plunging jets. The comparison suggests that the support vector machines approach works better than both the empirical relationship and the Gaussian process approach, and could successfully be employed in modeling oxygen transfer.
Keywords: Oxygen-transfer, multiple plunging jets, support vector machines, Gaussian process.
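A minimal sketch of such a comparison between the two regression approaches is shown below on synthetic operational data (number of jets, jet diameter, flow rate as assumed inputs, KLa as output); the hyper-parameters and data are illustrative, not the experimental values of the paper.

```python
# Sketch comparing the two regression approaches named above on synthetic data
# (operational parameters -> KLa); hyper-parameters and data are illustrative.
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(2)
X = rng.uniform([1, 0.005, 2], [6, 0.012, 10], size=(120, 3))   # jets, diameter (m), flow (L/s)
kla = 0.02 * X[:, 0] + 3.0 * X[:, 1] + 0.004 * X[:, 2] + rng.normal(0, 0.002, 120)

X_tr, X_te, y_tr, y_te = train_test_split(X, kla, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)                 # put the inputs on a common scale
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

svr = SVR(kernel="rbf", C=10.0, epsilon=0.001).fit(X_tr, y_tr)
gpr = GaussianProcessRegressor(kernel=RBF(), alpha=1e-4, normalize_y=True).fit(X_tr, y_tr)

for name, model in [("SVM", svr), ("GPR", gpr)]:
    pred = model.predict(X_te)
    print(name, "R2 =", round(r2_score(y_te, pred), 3),
          "RMSE =", round(mean_squared_error(y_te, pred) ** 0.5, 4))
```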
1374 CFD Simulation and Validation of Flow Pattern Transition Boundaries during Moderately Viscous Oil-Water Two-Phase Flow through Horizontal Pipeline
Authors: Anand B. Desamala, Anjali Dasari, Vinayak Vijayan, Bharath K. Goshika, Ashok K. Dasmahapatra, Tapas K. Mandal
Abstract:
In the present study, computational fluid dynamics (CFD) simulation has been carried out to investigate the transition boundaries of different flow patterns for moderately viscous oil-water two-phase flow (viscosity ratio 107, density ratio 0.89, and interfacial tension 0.032 N/m) through a horizontal pipeline with an internal diameter of 0.025 m and a length of 7.16 m. The Volume of Fluid (VOF) approach, including the effect of surface tension, has been employed to predict the flow patterns. The geometry and meshing of the present problem were created using GAMBIT, and ANSYS FLUENT was used for the simulation. A total of 47037 quadrilateral elements were chosen for the horizontal pipeline geometry. The computation was performed assuming unsteady flow, an immiscible liquid pair, constant liquid properties, co-axial flow, and a T-junction as the entry section. The simulation correctly predicts the transition boundary from wavy stratified to stratified mixed flow; the other transition boundaries are yet to be simulated. The simulated data have been validated against our own experimental results.
Keywords: CFD simulation, flow pattern transition, moderately viscous oil-water flow, prediction of flow transition boundary, VOF technique.
1373 Improved Blood Glucose-Insulin Monitoring with Dual-Layer Predictive Control Design
Authors: Vahid Nademi
Abstract:
With the widespread use of wearable medical devices equipped with a continuous glucose monitor (CGM) and an insulin pump, advanced control methods are still needed to obtain the full benefit of these devices. Unlike costly clinical trials, implementing effective insulin-glucose control strategies can provide significant contributions for patients suffering from chronic diseases such as diabetes. This study addresses the key role of a two-layer insulin-glucose regulator based on a model predictive control (MPC) scheme, so that the patient's predicted glucose profile complies with the insulin level injected automatically through the insulin pump. This is achieved by an iterative optimization algorithm, an integrated perturbation analysis and sequential quadratic programming (IPA-SQP) solver, which handles uncertainties due to unexpected variations in glucose-insulin values and the body's characteristics. The feasibility of the proposed control approach is also studied by means of numerical simulations of two case scenarios using measured data. The obtained results verify the superior and reliable performance of the proposed control scheme with no negative impact on patient safety.
Keywords: Blood glucose monitoring, insulin pump, optimization, predictive control, diabetes disease.
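To make the receding-horizon idea behind MPC concrete, the toy sketch below optimises insulin doses over a short horizon against a crude linear glucose model; the model coefficients, constraints, and weights are assumptions for illustration and this is not the IPA-SQP solver described in the paper.

```python
# Toy receding-horizon MPC sketch (not the paper's IPA-SQP solver): a linear
# glucose-insulin model is assumed and the insulin doses over a short horizon are
# optimised so the predicted glucose tracks a target.
import numpy as np
from scipy.optimize import minimize

A, B = 0.95, -2.0          # assumed discrete dynamics: g[k+1] = A*g[k] + B*u[k] + d
d = 8.0                    # assumed meal/endogenous glucose rise per step (mg/dL)
g_target, horizon = 110.0, 6

def cost(u, g0):
    """Quadratic tracking cost over the prediction horizon."""
    g, total = g0, 0.0
    for uk in u:
        g = A * g + B * uk + d
        total += (g - g_target) ** 2 + 0.1 * uk ** 2   # penalise dose as well
    return total

g = 180.0                  # current (hyperglycaemic) reading from the CGM
for step in range(5):      # receding horizon: re-optimise at every sample
    res = minimize(cost, x0=np.zeros(horizon), args=(g,),
                   bounds=[(0.0, 5.0)] * horizon)      # insulin pump limits
    u_now = res.x[0]                                   # apply only the first move
    g = A * g + B * u_now + d
    print(f"step {step}: dose = {u_now:.2f} U, glucose = {g:.1f} mg/dL")
```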
1372 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation
Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan
Abstract:
Nowadays, demand for devices capable of real-time video transmission is ever-increasing, and high-resolution video has made efficient video compression techniques an essential component of capturing and transmitting video data. Motion estimation has a critical role in encoding raw video, and various motion estimation methods have been introduced to compress video efficiently. Motion estimation methods based on low bit-depth representations simplify the computation of the matching criteria and thus provide a small hardware footprint. In this paper, a hardware implementation of a two-bit transform based low-complexity motion estimation method using a local binary pattern approach is proposed. Image frames are represented at two-bit depth instead of full depth by using the local binary pattern as the binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the difference between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
Keywords: Binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform.
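As an illustrative software model of the general idea (reducing a block to two bit-planes and matching candidates by counting non-matching bits instead of a full-depth SAD), consider the sketch below; the thresholding rule and block sizes are assumptions and do not reproduce the proposed hardware design.

```python
# Illustrative software model (not the hardware design above): a block is reduced
# to two bit-planes by comparing each pixel with local statistics, and candidate
# blocks are matched by counting non-matching bits instead of full-depth SAD.
import numpy as np

def two_bit_transform(block):
    """Return two binary planes for an image block (simplified 2BT-style binarization)."""
    mean, std = block.mean(), block.std()
    plane1 = block > mean                     # above/below the local mean
    plane2 = np.abs(block - mean) > std       # far from / close to the mean
    return plane1, plane2

def matching_error(block_a, block_b):
    """Number of non-matching points between the two-bit representations."""
    a1, a2 = two_bit_transform(block_a)
    b1, b2 = two_bit_transform(block_b)
    return int(np.sum(a1 ^ b1) + np.sum(a2 ^ b2))

rng = np.random.default_rng(3)
current = rng.integers(0, 256, size=(16, 16))
shifted = np.roll(current, shift=1, axis=1)    # candidate block from the reference frame

print("error vs. shifted copy:", matching_error(current, shifted))
print("error vs. itself:      ", matching_error(current, current))
```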
1371 Experimental Investigation on Residual Stresses in Welded Medium-Walled I-shaped Sections Fabricated from Q460GJ Structural Steel Plates
Authors: Qian Zhu, Shidong Nie, Bo Yang, Gang Xiong, Guoxin Dai
Abstract:
GJ steel is a new type of high-performance structural steel which has been increasingly adopted in practical engineering. Q460GJ structural steel has a nominal yield strength of 460 MPa, which does not decrease significantly with increasing plate thickness as it does in normal structural steel. Thus, Q460GJ structural steel is normally used in medium-walled welded sections. However, research on the residual stresses in GJ steel members is scarce, even though residual stress is one of the vital factors that can affect member and structural behavior. This article investigates the residual stresses in welded I-shaped sections fabricated from Q460GJ structural steel plates by experimental tests. A total of four full-scale welded medium-walled I-shaped sections were tested by the sectioning method. Both the circular curve correction method and the straightening measurement method were adopted in this study to obtain the final magnitude and distribution of the longitudinal residual stresses. In addition, this paper also explores the interaction between flanges and webs. Based on a statistical evaluation of the experimental data, a multilayer residual stress model is proposed.
Keywords: Q460GJ structural steel, residual stresses, sectioning method, welded medium-walled I-shaped sections.
1370 Cyber Security Situational Awareness among Students: A Case Study in Malaysia
Authors: Yunos Zahri, Ab Hamid R. Susanty, Ahmad Mustaffa
Abstract:
This paper explores the need for a national baseline study on understanding the level of cyber security situational awareness among primary and secondary school students in Malaysia. The online survey method was deployed to administer the data collection exercise. The target groups were divided into three categories: Group 1 (primary school aged 7-9 years old), Group 2 (primary school aged 10-12 years old), and Group 3 (secondary school aged 13-17 years old). A different questionnaire set was designed for each group. The survey topics/areas included Internet and digital citizenship knowledge. Respondents were randomly selected from rural and urban areas throughout all 14 states in Malaysia. A total of 9,158 respondents participated in the survey, with most states meeting the minimum sample size requirement to represent the country’s demographics. The findings and recommendations from this baseline study are fundamental to developing the teaching modules required for children to understand the security risks and threats associated with the Internet throughout their years in school. Early exposure and education will help ensure healthy cyber habits among millennials in Malaysia.
Keywords: Cyber security awareness, cyber security education, cyber security, students.
1369 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks
Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar
Abstract:
DNA barcodes provide a good source of the information needed to classify living species. The classification problem has to be supported by reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete; however, the methods in use are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable. In fact, our method avoids the complex problem of form and structure in different classes of organisms. The empirical data and their classification performances are compared with other methods. In this study, we present our system, which consists of three phases. The first, called transformation, is composed of three sub-steps: Electron-Ion Interaction Pseudopotential (EIIP) for the codification of DNA barcodes, Fourier transform, and power spectrum signal processing. The second phase is an approximation, which is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). Finally, the third phase, the classification of DNA barcodes, is realized by applying a hierarchical classification algorithm.
Keywords: DNA Barcode, Electron-Ion Interaction Pseudopotential, Multi Library Wavelet Neural Networks.
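The transformation phase described above can be sketched as follows; the barcode fragment is a toy sequence and the EIIP constants used are the commonly cited values, assumed here rather than taken from the paper.

```python
# Sketch of the transformation phase described above: EIIP coding of a DNA barcode
# followed by a Fourier power spectrum (values are the commonly cited EIIP constants).
import numpy as np

EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def power_spectrum(sequence):
    """Map a DNA barcode to its EIIP numerical signal and return the FFT power spectrum."""
    signal = np.array([EIIP[base] for base in sequence.upper()])
    signal = signal - signal.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum

barcode = "ACGTACGTTAGCCGATACGGTTAACGTA"            # toy barcode fragment
ps = power_spectrum(barcode)
print("spectrum length:", len(ps))
print("dominant frequency bin:", int(np.argmax(ps[1:])) + 1)
# The resulting spectral features would then feed the MLWNN approximation stage.
```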
1368 A Follow-Up Study of Bachelor of Science Graduates in Applied Statistics from Suan Sunandha Rajabhat University during the 1999-2012 Academic Years
Authors: Somruedee Pongsena
Abstract:
The purpose of this study is to follow up on the graduates of the Bachelor of Science in Applied Statistics program at Suan Sunandha Rajabhat University (SSRU) during the 1999-2012 academic years and to provide a fundamental guideline for developing the current curriculum according to the Thai Qualifications Framework for Higher Education (TQF: HEd). The sample was collected from 75 graduates by interview and online questionnaire. The content covered five domains: Ethics and Morals; Knowledge; Cognitive Skills; Interpersonal Skills and Responsibility; and Numerical Analysis, Communication and Information Technology Skills. Data were analyzed using statistical methods such as percentiles, means, standard deviations, t-tests, and F-tests. The findings showed that the respondents were mostly female and under 26 years old. The majority of graduates had an income in the range of 10,001-20,000 Baht and 2-5 years of work experience. In addition, overall opinions on applying the knowledge received to their work were at the 'agree' level, with a mean score of 3.97 and a standard deviation of 0.40. Hypothesis testing indicated that only gender produced a difference of opinion at the 0.05 significance level.
Keywords: Follow-up, graduates, knowledge, opinion, work performance.
1367 Motivating Factors of Couple Involvement in Copreneurship Businesses in Malaysia
Authors: Norasmah Othman, Suzana Mohamed, Salpiah Suradi
Abstract:
Copreneurship is a term used to describe businesses operated by married couples who share commitment, goals, and responsibilities in handling the business. Research conducted overseas has shown that copreneurship business activities grow quickly and play a role in elevating the socio-economic standards of families and nations. In Malaysia, copreneurship has long been practised by spouses. This study therefore aimed to explore the factors that motivate married partners to start a copreneurship business and to identify which partner is dominant in managing the business. The study participants are four entrepreneurial couples operating SME businesses, selected through purposive sampling. In-depth interviews and direct observation were used as methods of measurement for the triangulation of qualitative data in this study. The interview findings were analyzed using NVivo 8.0 software. The results show that freedom is a key factor that drives entrepreneurs to set up copreneurship businesses, and that the husband dominates the management aspects of the business. The study gives an overview of the parties involved in entrepreneurship to provide understanding of the copreneurship concept as it is practised. This study provides academic value by creating understanding of the importance of a harmonious family institution, specifically for forming entrepreneurs in the familial environment in Malaysia.
Keywords: Copreneurs, copreneurship, business management, enterprise.
1366 A Bayesian Kernel for the Prediction of Protein-Protein Interactions
Authors: Hany Alashwal, Safaai Deris, Razib M. Othman
Abstract:
Understanding protein functions is a major goal in the post-genomic era. Proteins usually work in the context of other proteins and rarely function alone; therefore, it is highly relevant to study the interaction partners of a protein in order to understand its function. Machine learning techniques have been widely applied to predict protein-protein interactions, and kernel functions play an important role in a successful machine learning technique. Choosing an appropriate kernel function can lead to better accuracy in a binary classifier such as the support vector machine. In this paper, we describe a Bayesian kernel for the support vector machine to predict protein-protein interactions. The use of the Bayesian kernel can improve classifier performance by incorporating the probability characteristics of the available experimental protein-protein interaction data, which were compiled from different sources. In addition, the probabilistic output from the Bayesian kernel can assist biologists in conducting further research on the highly ranked predicted interactions. The results show that the accuracy of the classifier is improved using the Bayesian kernel compared to the standard SVM kernels. These results imply that protein-protein interactions can be predicted with better accuracy using the Bayesian kernel than with the standard SVM kernels.
Keywords: Bioinformatics, protein-protein interactions, Bayesian kernel, support vector machines.
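The exact Bayesian kernel of the paper is not reproduced here; as a generic sketch of the mechanism (a custom Gram matrix that blends a standard similarity with per-record reliability probabilities, supplied to an SVM), the snippet below uses scikit-learn's precomputed-kernel interface with entirely hypothetical features, labels, and reliabilities.

```python
# Generic sketch only: a custom Gram matrix that scales an RBF similarity by the
# reliability of each interaction record, passed to an SVM as a precomputed kernel.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 10))                       # hypothetical protein-pair feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)             # hypothetical interact / not-interact labels
reliability = rng.uniform(0.5, 1.0, size=60)        # assumed confidence of each record

def weighted_gram(Xa, ra, Xb, rb, gamma=0.1):
    """RBF similarity scaled by the reliability of both examples (illustrative)."""
    return rbf_kernel(Xa, Xb, gamma=gamma) * np.outer(ra, rb)

K_train = weighted_gram(X, reliability, X, reliability)
clf = SVC(kernel="precomputed", C=1.0).fit(K_train, y)

X_new = rng.normal(size=(5, 10))
r_new = np.full(5, 0.9)
K_new = weighted_gram(X_new, r_new, X, reliability)  # rows: new pairs, cols: training pairs
print(clf.predict(K_new))
```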
1365 An Angioplasty Intervention Simulator with a Specific Virtual Environment
Authors: G. Aloisio, L. T. De Paolis, A. De Mauro, A. Mongelli
Abstract:
One of the essential requirements of a realistic surgical simulator is to reproduce the haptic sensations due to interactions in the virtual environment. However, the interaction needs to be performed in real time, since a delay between the user action and the system reaction reduces the sensation of immersion. In this paper, a prototype of a coronary stent implant simulator is presented; this system allows real-time interactions with an artery by means of a specific haptic device. To improve the realism of the simulation, the virtual environment is built from real patients' images, and a Web Portal is used to search geographically remote medical centres for a virtual environment with specific features in terms of pathology or anatomy. The functional architecture of the system defines several Medical Centres in which virtual environments built from real patients' images, together with related metadata on specific pathological or anatomical features, are stored. The selected data are downloaded from the Medical Centre to the Training Centre, which is provided with a specific haptic device and with the software necessary to manage the interaction in the virtual environment. After the integration of the virtual environment into the simulation system, it is possible to perform training on the specific surgical procedure.
Keywords: Medical simulation, Web Portal, virtual reality.
1364 Nonlinear Fuzzy Tracking Real-time-based Control of Drying Parameters
Authors: Marco Soares dos Santos, Camila Nicola Boeri, Jorge Augusto Ferreira, Fernando Neto da Silva
Abstract:
The highly nonlinear characteristics of drying processes have prompted researchers to seek new nonlinear control solutions. However, the relation between implementation complexity, on-line processing complexity, the reliability of the control structure, and controller performance is not well established. The present paper proposes high-performance nonlinear fuzzy controllers for the real-time operation of a drying machine, developed with a consistent balance between those issues. A PCI-6025E data acquisition device from National Instruments® was used, and the control system was fully designed in the MATLAB®/SIMULINK language. The drying parameters, namely relative humidity and temperature, were controlled through MIMO hybrid bang-bang+PI (BPI) and four-dimensional fuzzy logic (FLC) real-time controllers to perform drying tests on biological materials. The performance of the drying strategies was compared through several criteria, which are reported without controller retuning. The performance analysis showed much better performance for the FLC than for the BPI controller: the absolute errors were lower than 8.85% for the fuzzy logic controller, about three times lower than the experimental results with BPI control.
Keywords: Drying control, fuzzy logic control, intelligent temperature-humidity control.
1363 The Long-Term Leaching Behaviour of 137Cs, 60Co and 152Eu Radionuclides Incorporated in Mortar Matrices Made from Natural Aggregates and Recycled Aggregates
Authors: R. Deju, M. Mincu, D. Gurau
Abstract:
During the interim storage or final disposal of low-level waste, migration/diffusion of radionuclides can occur when the waste comes in contact with water. The long-term leaching behaviour into a surrounding fluid (demineralized water) of 137Cs, 60Co and 152Eu radionuclides, artificially incorporated in mortar matrices made from natural aggregates (river sand) and recycled radioactive concrete, was studied. The results presented in this work were obtained over two years of mortar testing and will be used to increase safety in the storage of low-level radioactive waste. The study considered the influence of curing time and of the type and size distribution of the aggregates on the leaching behaviour. The mortar samples were immersed in distilled water for 30 days. The leached activity of the mortar samples was measured on samples of the immersing water and analyzed by gamma-ray spectrometry using an HPGe detector, with the GESPECOR code used for efficiency evaluation. The long-term leaching behaviour of the radionuclides was evaluated from the leaching data by calculating the apparent diffusion coefficient.
Keywords: Leaching behaviour, recycling of radioactive concrete, waste management, gamma-ray spectrometry.
1362 Information Technology for Business Process Management in Insurance Companies
Authors: Vesna Bosilj Vukšić, Darija Ivandić Vidović, Ljubica Milanović Glavan
Abstract:
Information technology plays an irreplaceable role in introducing and improving business process orientation in a company. It enables implementation of the theoretical concept, measurement of the results achieved, and the undertaking of corrective measures aimed at improvements. Information technology is a key concept in the development and implementation of business process management systems, as it establishes a connection to business operations. Both in the literature and in practice, insurance companies are often seen as highly process oriented due to the nature of their business and their focus on customers. They are also considered leaders in using information technology for business process management. The research conducted aimed to investigate whether the perceived leadership status of insurance companies is well deserved, i.e., to establish the level of process orientation and explore the practice of information technology use in insurance companies in the region. The main instrument for primary data collection within this research was an electronic survey questionnaire sent to the management of insurance companies in the Republic of Croatia, Bosnia and Herzegovina, Slovenia, Serbia and Macedonia. The research has shown that insurance companies have a satisfactory level of process orientation, but that there is also huge potential for improvement, especially in the segment of information technology and its connection to business processes.
Keywords: Business process management, process orientation, information technology, insurance companies.
1361 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment
Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg
Abstract:
Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter the choices made early on in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, this framework improves the collaboration between the architectural stage and the structural stage. It will be shown that this SDO framework can make this achievable by generating the structural model based on the extracted data from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.
Keywords: Building Information Modelling, BIM, Genetic Algorithm, GA, architecture-engineering-construction, AEC, Optimisation, structure, design, population, generation, selection, mutation, crossover, offspring.
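The selection / crossover / mutation loop named in the keywords can be illustrated with the toy genetic algorithm below; the design variables (beam depth and width), the weight surrogate, and the capacity constraint are assumptions for illustration and do not represent the proposed BIM-integrated framework.

```python
# Toy genetic algorithm sketch (not the proposed BIM-integrated framework): evolving
# two member sizes to minimise a weight-like objective under a capacity constraint.
import random

def fitness(ind):
    depth, width = ind                               # hypothetical beam dimensions (mm)
    weight = depth * width * 0.01                    # surrogate for structural weight
    capacity = depth * width ** 0.5                  # surrogate for load capacity
    penalty = 1e4 if capacity < 500 else 0.0         # infeasible designs are penalised
    return weight + penalty

def make_individual():
    return [random.uniform(100, 600), random.uniform(50, 300)]

population = [make_individual() for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness)
    parents = population[:10]                        # selection: keep the fittest designs
    offspring = []
    while len(offspring) < 20:
        a, b = random.sample(parents, 2)
        child = [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2]      # crossover: blend parents
        if random.random() < 0.2:                           # mutation: random perturbation
            child[random.randrange(2)] *= random.uniform(0.8, 1.2)
        offspring.append(child)
    population = parents + offspring

best = min(population, key=fitness)
print("best design (depth, width):", [round(v, 1) for v in best],
      "fitness:", round(fitness(best), 1))
```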
1360 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is one of the most important statistical methods, and it makes a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we apply approximating statistical methods to an economic problem. We created a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used the quantile regression model instead. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model of our study.
Keywords: Bootstrap, Edgeworth approximation, independent and identically distributed, quantile.
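The quantile-regression-plus-bootstrap core of the approach can be sketched as follows; the sales and profit figures are synthetic, not the business data of the paper, and the Edgeworth correction itself is not reproduced.

```python
# Sketch of a median (quantile) regression of profit on sales with bootstrap
# resampling of the slope; the data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
sales = rng.uniform(10, 100, size=200)
profit = 0.3 * sales + rng.standard_t(df=3, size=200)      # heavy-tailed errors

X = sm.add_constant(sales)
median_fit = sm.QuantReg(profit, X).fit(q=0.5)
print("median-regression slope:", round(median_fit.params[1], 3))

# Bootstrap the slope to gauge its standard error.
slopes = []
for _ in range(500):
    idx = rng.integers(0, len(sales), len(sales))
    res = sm.QuantReg(profit[idx], X[idx]).fit(q=0.5)
    slopes.append(res.params[1])
print("bootstrap standard error of the slope:", round(np.std(slopes), 4))
```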
1359 Scatter Analysis of Fatigue Life and Pore Size Data of Die-Cast AM60B Magnesium Alloy
Authors: S. Mohd, Y. Mutoh, Y. Otsuka, Y. Miyashita, T. Koike, T. Suzuki
Abstract:
The scatter behavior of fatigue life in die-cast AM60B alloy was investigated. For comparison, the scatter in rolled AM60B alloy and in die-cast A365-T5 aluminum alloy was also studied. The scatter behavior of pore size was investigated as well, to discuss the dominant factors behind the fatigue life scatter in die-cast materials. A three-parameter Weibull function was suitable for describing the scatter behavior of both fatigue life and pore size. The scatter of fatigue life in the die-cast AM60B alloy was comparable to that in the die-cast A365-T5 alloy, while it was significantly larger than that in the rolled AM60B alloy. The scatter behavior of the pore size observed at the fracture nucleation site on the fracture surface was comparable to that observed on the specimen cross-section and also to the scatter of fatigue life. Therefore, the dominant factor behind the large scatter of fatigue life in die-cast alloys would be the large scatter of pore size. This speculation was confirmed by a fracture mechanics fatigue life prediction in which the pore observed at the fatigue crack nucleation site was assumed to be the pre-existing crack.
Keywords: Fatigue life, pore size, scatter, Weibull distribution, die-cast magnesium alloy.
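Fitting a three-parameter Weibull function to a fatigue-life sample can be sketched as below; the sample values are synthetic and the goodness-of-fit check is a generic K-S test rather than the paper's procedure.

```python
# Sketch: fitting a three-parameter Weibull function to a sample of fatigue lives
# (synthetic numbers); scipy's weibull_min returns shape, location (threshold), scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
fatigue_life = stats.weibull_min.rvs(c=1.8, loc=2e4, scale=1.5e5, size=40, random_state=rng)

shape, loc, scale = stats.weibull_min.fit(fatigue_life)
print(f"shape = {shape:.2f}, threshold = {loc:.0f} cycles, scale = {scale:.0f} cycles")

# Kolmogorov-Smirnov check of the fitted distribution, as a simple goodness-of-fit measure.
ks_stat, p_value = stats.kstest(fatigue_life, "weibull_min", args=(shape, loc, scale))
print(f"K-S statistic = {ks_stat:.3f}, p = {p_value:.3f}")
```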
1358 Fuzzy Inference System for Determining Collision Risk of Ship in Madura Strait Using Automatic Identification System
Authors: Emmy Pratiwi, Ketut B. Artana, A. A. B. Dinariyana
Abstract:
Madura Strait is considered one of the busiest shipping channels in Indonesia. The high vessel traffic density in Madura Strait poses a serious threat to navigational safety in this area, namely the risk of ship collision. This study is therefore an attempt to enhance the safety of marine traffic. A fuzzy inference system (FIS) is proposed to calculate the collision risk of ships. Collision risk is evaluated based on the ship domain, the Distance to Closest Point of Approach (DCPA), and the Time to Closest Point of Approach (TCPA). Data were collected using the Automatic Identification System (AIS). This study considers several ship domain models to characterize the marine traffic in the waterway. Each encounter within the ship domain is analyzed to obtain the level of collision risk. The resulting risk levels can be used as guidance to avoid accidents, providing a brief description of traffic safety in Madura Strait and improving navigational safety in the area.
Keywords: Automatic identification system, collision risk, DCPA, fuzzy inference system, TCPA.
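A hand-rolled fuzzy-inference sketch for mapping DCPA and TCPA to a collision risk level is shown below; the membership breakpoints, rule base, and defuzzification are illustrative assumptions, not the paper's calibrated FIS.

```python
# Hand-rolled fuzzy-inference sketch for collision risk from DCPA and TCPA; the
# membership breakpoints and rules are illustrative, not the paper's calibrated FIS.
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def collision_risk(dcpa_nm, tcpa_min):
    # Fuzzify the inputs.
    dcpa_small = tri(dcpa_nm, -0.1, 0.0, 1.0)
    dcpa_large = tri(dcpa_nm, 0.5, 2.0, 3.5)
    tcpa_short = tri(tcpa_min, -1.0, 0.0, 10.0)
    tcpa_long  = tri(tcpa_min, 5.0, 20.0, 35.0)

    # Rule base (min for AND), each rule mapped to a crisp risk level.
    rules = [
        (min(dcpa_small, tcpa_short), 1.0),   # close and soon   -> high risk
        (min(dcpa_small, tcpa_long),  0.6),   # close but later  -> medium risk
        (min(dcpa_large, tcpa_short), 0.4),   # far but soon     -> moderate risk
        (min(dcpa_large, tcpa_long),  0.1),   # far and later    -> low risk
    ]
    num = sum(w * r for w, r in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0          # weighted-average defuzzification

print(round(collision_risk(dcpa_nm=0.3, tcpa_min=4.0), 2))   # close encounter, soon
print(round(collision_risk(dcpa_nm=2.5, tcpa_min=25.0), 2))  # distant, unhurried
```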