Search results for: collaboration success factors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3255

1935 A Challenge to Acquire Serious Victims’ Locations during Acute Period of Giant Disasters

Authors: Keiko Shimazu, Yasuhiro Maida, Tetsuya Sugata, Daisuke Tamakoshi, Kenji Makabe, Haruki Suzuki

Abstract:

In this paper, we report how to acquire seriously injured victims’ locations during the acute stage of large-scale disasters using an emergency information network system we designed. The background of our concept is the Great East Japan Earthquake that occurred on March 11th, 2011. Through many experiences of national crises caused by earthquakes and tsunamis, Japan has established advanced communication systems and advanced disaster medical response systems. Nevertheless, the country was devastated by huge tsunamis that swept a vast area of Tohoku, causing a complete breakdown of all infrastructure, including telecommunications. We therefore recognized the need for interdisciplinary collaboration among experts in disaster medicine, regional administrative sociology, satellite communication technology and systems engineering. Communication of emergency information was limited, causing a serious delay in the initial rescue and medical operations. For emergency rescue and medical operations, the most important task is to identify the number of casualties, their locations and status, and to dispatch doctors and rescue workers from multiple organizations. In the case of the Tohoku earthquake, no dispatching mechanism or decision support system existed to allocate the appropriate number of doctors and to locate disaster victims. Even though doctors and rescue workers from multiple government organizations have their own dedicated communication systems, these systems are not interoperable.

Keywords: Crisis management, disaster mitigation, messing, MGRS, Satellite communication system.

1934 Empowering Student Success: Innovative Modelling Techniques for Enhancing Self-Efficacy in Education

Authors: Aldrin R. Logdat, Marianne Christine Jane B. Capio

Abstract:

The study aimed to investigate the impact of modelling techniques on the self-efficacy of first-year Bachelor of Science Major in Hospitality Management (BSHM) college students at City College of Calapan, Oriental Mindoro. The research utilized a ten-point general self-efficacy scale and collected responses from a sample of 107 students across five BSHM sections. The study found that the majority of students had a moderate level of self-efficacy, with 49.53% of all respondents falling within this category, while 35.51% of students had high self-efficacy and 14.95% had low self-efficacy. The two-tailed t-test for independent samples indicated a significant difference between the mean post-test scores of the experimental and control groups. Furthermore, the Wilcoxon test showed significant differences in the experimental group's self-efficacy before and after treatment, while no such difference was observed in the control group. Thus, the modelling technique proved effective in improving the self-efficacy levels of first-year BSHM college students. Ultimately, the use of modelling techniques helped to elevate students’ self-efficacy levels into higher categories.

Keywords: Self-efficacy, counselling, modelling techniques, hospitality management.

1933 Adaptive WiFi Fingerprinting for Location Approximation

Authors: Mohd Fikri Azli bin Abdullah, Khairul Anwar bin Kamarul Hatta, Esther Jeganathan

Abstract:

WiFi has become an essential and widely used technology, largely because of its convenience for mobile devices; this is especially true for Internet users worldwide who rely on WiFi connections. Many location-based services available today use Wireless Fidelity (WiFi) signal fingerprinting; a common example gaining popularity is Foursquare. In this work, the WiFi signal is used to estimate the user or client’s location. As with GPS, the fingerprinting method needs a floor plan to increase the accuracy of location estimation. Still, the inconsistency of WiFi signals makes the estimate differ at different time intervals, so an adaptive method is needed to obtain the most accurate signal at all times. WiFi signals are heavily distorted by external factors such as physical objects, radio frequency interference, electrical interference and environmental conditions, to name a few. Because of these factors, this work reduces the signal noise and performs estimation using the Nearest Neighbour method based on past signal activity, increasing the signal accuracy to more than 80%. The repository further increases the accuracy by using Artificial Neural Network (ANN) pattern matching; it acts as the server and supports the decisions of the client-side application. Numerous previous works have adopted methods of collecting signal strengths in a repository over the years, but these were mostly static. In this work, we highlight proposed solutions for how the adaptive method matches the received signal to the data in the repository so that location estimation can be done more accurately. Adaptive updating allows the latest location fingerprint to be stored in the repository; redundant location fingerprints are removed and only the updated version of each fingerprint is kept. How the user's location can be predicted is detailed in the proposed solution section. After studying previous works, the Artificial Neural Network was found to be the most feasible method for updating the repository and making it adaptive; its role is to perform pattern matching of the WiFi signal against the data already available in the repository.
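
As a rough illustration of the Nearest Neighbour matching step described above, the minimal Python sketch below compares an observed RSSI vector against a small hypothetical fingerprint repository using Euclidean distance; the access points, locations and signal values are invented for illustration, and the adaptive ANN-based repository update is omitted.

```python
import numpy as np

# Hypothetical fingerprint repository: location label -> mean RSSI vector (dBm)
# over the same ordered set of access points. Values are illustrative only.
repository = {
    "room_A":  np.array([-45.0, -60.0, -72.0]),
    "room_B":  np.array([-58.0, -48.0, -66.0]),
    "hallway": np.array([-70.0, -65.0, -50.0]),
}

def estimate_location(observed_rssi, repo):
    """Return the repository location whose stored fingerprint is closest
    (Euclidean distance in RSSI space) to the observed RSSI vector."""
    observed = np.asarray(observed_rssi, dtype=float)
    best_label, best_dist = None, float("inf")
    for label, fingerprint in repo.items():
        dist = np.linalg.norm(observed - fingerprint)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

print(estimate_location([-47.0, -62.0, -70.0], repository))  # -> ('room_A', ...)
```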

Keywords: Adaptive Repository, Artificial Neural Network, Location Estimation, Nearest Neighbour Euclidean Distance, WiFi RSSI Fingerprinting.

1932 Problems and Needs of Frozen Shrimp Industry Small and Medium Enterprises in the Central Region of the Lower Three Provinces

Authors: P. Thepnarintra

Abstract:

The frozen shrimp industry plays an important role in the development of the country's production industry. There has been continuing development in response to increasing demand; however, the enterprises face several operational problems. The purposes of this study were to: 1) investigate problems related to the basic factors of operating a frozen shrimp enterprise from the entrepreneurs’ point of view; the enterprises involved were small and medium industries registered with the Thai Frozen Foods Association; and 2) compare the problems of the frozen shrimp industry according to size of operation in three provinces of the central region of Thailand. The population consisted of 148 managers from 148 frozen shrimp enterprises in the Thai Frozen Foods Association, of which 77 were small and 71 were medium sized. The data were analyzed using percentage, arithmetic mean, standard deviation and an independent-samples t-test at a significance level of .05. The results revealed that the problems of the frozen shrimp industries of both sizes were at a high level, as was the need for government support. The comparison of the problems and basic factors between small and medium enterprises showed no statistically significant difference. The problems mentioned included raw materials, labor, production, marketing and the need for academic support from the government sector.

Keywords: Frozen shrimp industry, problems related to the enterprise, operation.

1931 Understanding Innovation by Analyzing the Pillars of the Global Competitiveness Index

Authors: Ujjwala Bhand, Mridula Goel

Abstract:

The Global Competitiveness Index (GCI) prepared by the World Economic Forum has become a benchmark for studying the competitiveness of countries and for understanding the factors that enable competitiveness. Innovation is a key pillar of competitiveness and has the unique property of enabling exponential economic growth. This paper attempts to analyze how the pillars comprising the Global Competitiveness Index affect innovation and whether GDP growth can directly affect innovation outcomes for a country. The key objective of the study is to identify areas on which governments of developing countries can focus policies and programs to improve their country’s innovativeness. We have compiled a panel data set for the top innovating countries and the large emerging economies known as BRICS from 2007-08 to 2014-15 in order to find the significant factors that affect innovation. The results of the regression analysis suggest that governments should make policies to improve labor market efficiency, establish sophisticated business networks, provide basic health and primary education to their people, and strengthen the quality of higher education and training services in the economy. The achievements of smaller economies in innovation suggest that concerted efforts by governments can counter any size-related disadvantage and can in fact provide greater flexibility and speed in encouraging innovation.

Keywords: Innovation, Global Competitiveness Index, BRICS, economic growth.

1930 Predicting Bankruptcy using Tabu Search in the Mauritian Context

Authors: J. Cheeneebash, K. B. Lallmamode, A. Gopaul

Abstract:

Throughout this paper, a relatively new technique, the Tabu search variable selection model, is elaborated, showing how it can be efficiently applied within the financial world whenever researchers face the selection of a subset of variables from a whole set of descriptive variables under analysis. In the field of financial prediction, researchers often have to select a subset of variables from a larger set to solve different types of problems such as corporate bankruptcy prediction, personal bankruptcy prediction, mortgage and credit scoring, and the Arbitrage Pricing Model (APM). Consequently, to demonstrate how the method operates and to illustrate its usefulness as well as its superiority over other commonly used methods, the Tabu search algorithm for variable selection is compared to two main alternative search procedures, namely stepwise regression and the maximum R² improvement method. The Tabu search is then implemented in finance, where it attempts to predict corporate bankruptcy by selecting the most appropriate financial ratios and thus creating its own prediction score equation. In comparison to other methods, most notably the Altman Z-Score model, the Tabu search model produces a higher success rate in correctly predicting the failure of firms or the continued operation of existing entities.
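
To make the search procedure concrete, here is a minimal sketch of tabu search over k-variable subsets, using an ordinary-least-squares R² as a stand-in objective; the financial ratios, data and tabu parameters are hypothetical, and this is not the authors' implementation.

```python
import random
import numpy as np

def fit_score(X, y, subset):
    """Stand-in objective: R^2 of an ordinary least-squares fit restricted
    to the chosen columns (higher is better)."""
    A = np.column_stack([np.ones(len(y)), X[:, sorted(subset)]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

def tabu_select(X, y, k, iters=50, tabu_len=5, seed=0):
    """Tabu search over k-variable subsets: swap one variable per move,
    forbid recently used swaps unless they beat the best score (aspiration)."""
    rng = random.Random(seed)
    n = X.shape[1]
    current = set(rng.sample(range(n), k))
    best, best_score = set(current), fit_score(X, y, current)
    tabu = []
    for _ in range(iters):
        moves = [(i, j) for i in current for j in range(n) if j not in current]
        scored = sorted(((fit_score(X, y, (current - {i}) | {j}), (i, j))
                         for i, j in moves), key=lambda t: t[0], reverse=True)
        for score, (i, j) in scored:
            if (i, j) not in tabu or score > best_score:
                current = (current - {i}) | {j}
                if score > best_score:
                    best, best_score = set(current), score
                tabu = (tabu + [(i, j)])[-tabu_len:]
                break
    return sorted(best), best_score

# Toy data: 60 firms, 12 candidate ratios, only ratios 0 and 3 actually matter.
rng_np = np.random.default_rng(0)
X = rng_np.normal(size=(60, 12))
y = 2 * X[:, 0] - X[:, 3] + 0.1 * rng_np.normal(size=60)
print(tabu_select(X, y, k=2))
```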

Keywords: Predicting Bankruptcy, Tabu Search

1929 Key Frame Based Video Summarization via Dependency Optimization

Authors: Janya Sainui

Abstract:

With the rapid growth of digital video and data communications, video summarization, which provides a shorter version of a video for fast browsing and retrieval, has become necessary. Key frame extraction is one mechanism for generating a video summary. In general, the extracted key frames should both represent the entire video content and contain minimum redundancy. However, most existing approaches select key frames heuristically; hence, the selected key frames may not be the most distinct frames and/or may not cover the entire content of the video. In this paper, we propose a video summarization method that provides reasonable objective functions for selecting key frames. In particular, we apply a statistical dependency measure called quadratic mutual information as our objective function, maximizing the coverage of the entire video content while minimizing the redundancy among the selected key frames. The proposed key frame extraction algorithm finds key frames by solving an optimization problem. Through experiments, we demonstrate that the proposed approach produces video summaries with better coverage of the entire video content and less redundancy among key frames compared to state-of-the-art approaches.
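
For intuition, the following sketch selects key frames greedily with a coverage-minus-redundancy objective; cosine similarity over random frame descriptors stands in for the quadratic mutual information measure used in the paper, so the numbers are purely illustrative.

```python
import numpy as np

def frame_similarity(a, b):
    """Stand-in dependency measure between two frame feature vectors
    (cosine similarity); the paper uses quadratic mutual information."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def select_key_frames(features, k):
    """Greedy selection: at each step pick the frame that best covers the
    whole video while staying dissimilar to frames already chosen."""
    n = len(features)
    selected = []
    for _ in range(k):
        best_idx, best_score = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            coverage = np.mean([frame_similarity(features[i], features[j])
                                for j in range(n)])
            redundancy = max((frame_similarity(features[i], features[j])
                              for j in selected), default=0.0)
            score = coverage - redundancy
            if score > best_score:
                best_idx, best_score = i, score
        selected.append(best_idx)
    return sorted(selected)

# Toy example: 100 random "frame descriptors", pick 5 key frames.
rng = np.random.default_rng(0)
frames = rng.normal(size=(100, 64))
print(select_key_frames(frames, 5))
```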

Keywords: Video summarization, key frame extraction, dependency measure, quadratic mutual information, optimization.

1928 Probabilistic Method of Wind Generation Placement for Congestion Management

Authors: S. Z. Moussavi, A. Badri, F. Rastegar Kashkooli

Abstract:

Wind farms (WFs) with high levels of penetration are being established in power systems worldwide more rapidly than other renewable resources. The Independent System Operator (ISO), as a policy maker, should propose appropriate places for WF installation in order to maximize the benefits for investors. New WF installations can also relieve congestion, which the ISO should take into account when proposing locations for WF installation. In this context, an efficient wind farm placement method is proposed in order to reduce the burden on congested lines. Since wind speed is a random variable and load forecasts also contain uncertainties, probabilistic approaches are used for this type of study. An AC probabilistic optimal power flow (P-OPF) is formulated and solved using Monte Carlo Simulation (MCS). In order to reduce computation time, point estimate methods (PEM) are introduced as an efficient alternative to time-consuming MCS. Subsequently, optimal WF placement is determined using generation shift distribution factors (GSDF) together with a new parameter termed the wind availability factor (WAF). In order to obtain more realistic results, N-1 contingency analysis is employed to find the optimal WF size by means of line outage distribution factors (LODF). The IEEE 30-bus test system is used to demonstrate and compare the accuracy of the proposed methodology.
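
As a rough illustration of the Monte Carlo ingredient of such a study, the sketch below samples Weibull-distributed wind speeds, maps them through a simple turbine power curve and reports the expected output together with an availability-style ratio; the turbine parameters, Weibull shape/scale and the exact definition of the WAF are assumptions rather than values from the paper, and the power-flow and GSDF/LODF steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative turbine parameters (assumptions, not from the paper).
CUT_IN, RATED, CUT_OUT = 3.0, 12.0, 25.0   # wind speeds, m/s
P_RATED = 2.0                              # MW

def turbine_power(v):
    """Piecewise wind-turbine power curve."""
    if v < CUT_IN or v >= CUT_OUT:
        return 0.0
    if v >= RATED:
        return P_RATED
    return P_RATED * (v - CUT_IN) / (RATED - CUT_IN)

# Monte Carlo simulation of wind speeds (Weibull, shape k=2, scale c=8 m/s)
# to estimate expected output and a simple availability-style factor.
speeds = rng.weibull(2.0, size=100_000) * 8.0
powers = np.array([turbine_power(v) for v in speeds])
waf = powers.mean() / P_RATED
print(f"Expected output: {powers.mean():.3f} MW, availability factor: {waf:.3f}")
```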

Keywords: Probabilistic optimal power flow, Wind power, Point estimate methods, Congestion management.

1927 Convergence Analysis of Training Two-Hidden-Layer Partially Over-Parameterized ReLU Networks via Gradient Descent

Authors: Zhifeng Kong

Abstract:

Over-parameterized neural networks have attracted a great deal of attention in recent deep learning theory research, as they challenge the classic perspective on over-fitting when a model has excessive parameters and have achieved empirical success in various settings. While a number of theoretical works have been presented to demystify the properties of such models, their convergence properties are still far from thoroughly understood. In this work, we study the convergence properties of training two-hidden-layer, partially over-parameterized, fully connected networks with the Rectified Linear Unit activation via gradient descent. To our knowledge, this is the first theoretical work to understand the convergence properties of deep over-parameterized networks without the equally-wide-hidden-layer assumption and other unrealistic assumptions. We provide a probabilistic lower bound on the widths of the hidden layers and prove a linear convergence rate for gradient descent. We also conduct experiments on synthetic and real-world datasets to validate our theory.
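
The training setup analysed above can be mimicked numerically; the NumPy sketch below runs full-batch gradient descent on a two-hidden-layer ReLU network for a synthetic regression task. The widths, learning rate and data are arbitrary choices for illustration and carry none of the paper's theoretical guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (illustrative only); wide hidden layers mimic
# the over-parameterized regime.
n, d, h1, h2 = 200, 10, 256, 64
X = rng.normal(size=(n, d))
y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(n, 1))

W1 = rng.normal(size=(d, h1)) / np.sqrt(d)
W2 = rng.normal(size=(h1, h2)) / np.sqrt(h1)
W3 = rng.normal(size=(h2, 1)) / np.sqrt(h2)

relu = lambda z: np.maximum(z, 0.0)
lr = 1e-2

for step in range(2000):
    # Forward pass through the two ReLU hidden layers.
    Z1 = X @ W1; A1 = relu(Z1)
    Z2 = A1 @ W2; A2 = relu(Z2)
    pred = A2 @ W3
    loss = 0.5 * np.mean((pred - y) ** 2)

    # Backward pass and full-batch gradient descent update.
    g_pred = (pred - y) / n
    gW3 = A2.T @ g_pred
    gZ2 = (g_pred @ W3.T) * (Z2 > 0)
    gW2 = A1.T @ gZ2
    gZ1 = (gZ2 @ W2.T) * (Z1 > 0)
    gW1 = X.T @ gZ1
    W1 -= lr * gW1; W2 -= lr * gW2; W3 -= lr * gW3

    if step % 500 == 0:
        print(step, round(loss, 5))
```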

Keywords: Over-parameterization, Rectified Linear Units (ReLU), convergence, gradient descent, neural networks.

1926 The Effect of Temperature and Salinity on the Growth and Carotenogenesis of Three Dunaliella Species (Dunaliella sp. Lake Isolate, D. salina CCAP 19/18, and D. bardawil LB 2538) Cultivated under Laboratory Conditions

Authors: Imen Hamed, Burcu Ak, Oya Işık, Leyla Uslu, Kubilay Kazım Vursavuş

Abstract:

In this study, three species of Dunaliella (Dunaliella sp. Salt Lake isolate (Tuz Gölü), Dunaliella salina CCAP 19/18, and Dunaliella bardawil LB 2538) were investigated in a batch system with respect to optical density, dry matter, chlorophyll a, total carotenoids, and β-carotene production. The aim of this research was to compare carotenoid and β-carotene production among these three species. Two stress factors were therefore applied: two temperatures (20°C and 30°C) and two salinities (30‰ and 60‰) were tested over a 17-day study. The highest growth and chlorophyll a content were recorded for Dunaliella sp. under the 20°C/30‰ and 20°C/60‰ conditions, respectively, followed by D. bardawil and D. salina. Significant differences (p<0.05) were observed among the three species. Growth decreased as temperature and salinity increased, with the lowest growth recorded in the 30°C/60‰ group. The chlorophyll a content also decreased as temperature increased, whereas it increased with higher NaCl concentration. On the 17th day of the experiment, the highest carotenoid concentration was recorded for D. bardawil at 20°C/30‰ (65.639±0.400 μg mL⁻¹), and the highest β-carotene concentration was recorded for D. salina at 20°C/60‰ (8.98E-07±0.013 mol/L).

Keywords: Dunaliella sp., Dunaliella salina, Dunaliella bardawil, stress factors, pigments, growth.

1925 An Efficient Algorithm for Motion Detection Based Facial Expression Recognition using Optical Flow

Authors: Ahmad R. Naghsh-Nilchi, Mohammad Roshanzamir

Abstract:

One of the popular methods for recognizing facial expressions such as happiness, sadness and surprise is based on the deformation of facial features. Motion vectors that capture these deformations can be obtained from the optical flow. In this method, emotions are detected by comparing the resulting set of motion vectors with standard deformation templates caused by facial expressions. In this paper, a new method is introduced to compute a likeness measure in order to make decisions based on the importance of the vectors obtained from an optical flow approach. To find the vectors, the efficient optical flow method developed by Gautama and Van Hulle [17] is used. The suggested method has been evaluated on the Cohn-Kanade AU-Coded Facial Expression Database, one of the most comprehensive collections of test images available. The experimental results show that our method correctly recognized the facial expressions in 94% of the case studies. The results also show that only a few image frames (three frames) are sufficient to detect facial expressions with a success rate of about 83.3%. This is a significant improvement over the available methods.
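
To illustrate the template-comparison idea, the sketch below scores an observed motion-vector field against hand-made deformation templates with a cosine-style likeness measure; the tracked points and template values are invented, and the optical flow computation itself is not shown.

```python
import numpy as np

def flow_similarity(flow, template):
    """Cosine-style likeness between an observed motion-vector field and an
    expression deformation template (both arrays of shape (n_points, 2))."""
    num = np.sum(flow * template)
    den = np.linalg.norm(flow) * np.linalg.norm(template) + 1e-12
    return num / den

def classify_expression(flow, templates):
    """Pick the expression whose deformation template best matches the flow."""
    scores = {name: flow_similarity(flow, tpl) for name, tpl in templates.items()}
    return max(scores, key=scores.get), scores

# Toy templates over 4 tracked facial points (illustrative values only).
templates = {
    "happiness": np.array([[0, 1], [0, 1], [1, 0], [-1, 0]], dtype=float),
    "surprise":  np.array([[0, 2], [0, -2], [0, 1], [0, 1]], dtype=float),
}
observed = np.array([[0.1, 0.9], [0.0, 1.1], [0.8, 0.1], [-0.9, 0.2]])
print(classify_expression(observed, templates)[0])   # -> 'happiness'
```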

Keywords: Facial expression, Facial features, Optical flow, Motion vectors.

1924 The Significant Effect of Wudu’ and Zikr in the Controlling of Emotional Pressure Using Biofeedback Emwave Technique

Authors: Mohd Anuar Awang Idris, Muhammad Nubli Abdul Wahab, Nora Yusma Mohamed Yusoff

Abstract:

Wudu’ (ablution) and Zikr are among the spiritual tools that may help an individual control his mind, emotions and attitude. These tools are deemed able to deliver a positive impact on an individual’s psychophysiology. The main objective of this research is to determine the effects of Wudu’ and Zikr therapy using the biofeedback emWave application and technology. For this research, 13 students were selected as samples from the students’ representative body at the University Tenaga National, Malaysia. The DASS (Depression Anxiety Stress Scale) questionnaire was used to help assess and measure each student’s ability to control his or her emotions before and after the therapies. The biofeedback emWave technology was utilized to monitor each student’s psychophysiological level. In addition, the data obtained from the heart rate variability (HRV) test affirmed that Wudu’ and Zikr had significant impacts on the students’ success in controlling their emotional pressure.

Keywords: Biofeedback emWave, emotion, psychophysiology, wudu’, zikr.

1923 Design of an Intelligent Location Identification Scheme Based On LANDMARC and BPNs

Authors: S. Chaisit, H.Y. Kung, N.T. Phuong

Abstract:

Radio frequency identification (RFID) applications have grown rapidly in many industries, especially in indoor location identification. The advantage of using received signal strength indicator (RSSI) values as an indoor location measurement method is that it is cost-effective and requires no extra hardware. Because the accuracy of many positioning schemes using RSSI values is limited by interference factors and the environment, it is challenging to design RFID location techniques based on an integrated positioning algorithm. This study proposes a location estimation approach and analyzes a scheme relying on RSSI values to minimize location errors. In addition, this paper examines different factors that affect location accuracy by integrating the backpropagation neural network (BPN) with the LANDMARC algorithm in a training phase and an online phase. First, the training phase computes coordinates obtained from the LANDMARC algorithm, which uses RSSI values and the real coordinates of reference tags as training data for constructing an appropriate BPN architecture and training length. Second, in the online phase, the LANDMARC algorithm calculates the coordinates of tracking tags, which are then used as BPN inputs to obtain location estimates. The results show that the proposed scheme can estimate locations more accurately than LANDMARC alone, without extra devices.
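
For orientation, the sketch below shows the LANDMARC step that produces the coordinates later fed to the BPN: the k reference tags nearest to the tracking tag in RSSI space are weighted by the inverse square of their signal distance. The reader layout and RSSI values are hypothetical, and the BPN refinement stage is not included.

```python
import numpy as np

def landmarc_estimate(tracking_rssi, ref_rssi, ref_coords, k=4):
    """LANDMARC-style estimate: weight the k nearest reference tags
    (in RSSI space) by 1/E^2 and average their known coordinates.

    tracking_rssi: (n_readers,) RSSI of the tracking tag at each reader
    ref_rssi:      (n_refs, n_readers) RSSI of each reference tag
    ref_coords:    (n_refs, 2) known (x, y) positions of reference tags
    """
    E = np.linalg.norm(ref_rssi - tracking_rssi, axis=1)   # signal distances
    nearest = np.argsort(E)[:k]
    w = 1.0 / (E[nearest] ** 2 + 1e-9)
    w /= w.sum()
    return w @ ref_coords[nearest]

# Toy layout: 3 readers, 4 reference tags on a unit grid (illustrative values).
ref_coords = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
ref_rssi = np.array([[-40, -55, -60],
                     [-55, -40, -62],
                     [-58, -62, -42],
                     [-63, -58, -45]], dtype=float)
tag_rssi = np.array([-50, -50, -55], dtype=float)
print(landmarc_estimate(tag_rssi, ref_rssi, ref_coords, k=3))
```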

Keywords: BPNs, indoor location, location estimation, intelligent location identification.

1922 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a manufactured part produced by a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be reduced to eliminate defects and to improve the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the surface roughness targeted by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct and also improved the process capability index. The purpose of this study is to show that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
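
To make the Taguchi analysis step concrete, the sketch below computes the smaller-the-better signal-to-noise ratio for hypothetical surface-roughness replicates from the nine L9 runs, plus a one-sided Cpk for the best setting; every number, including the specification limit, is invented for illustration.

```python
import numpy as np

# Hypothetical surface-roughness replicates (micrometres) for each of the
# nine L9 runs; values are illustrative, not the paper's data.
l9_results = np.array([
    [1.82, 1.79], [1.65, 1.70], [1.91, 1.88],
    [1.55, 1.60], [1.48, 1.52], [1.73, 1.69],
    [1.60, 1.58], [1.42, 1.45], [1.66, 1.63],
])

# Smaller-the-better signal-to-noise ratio: S/N = -10 * log10(mean(y^2)).
sn = -10.0 * np.log10(np.mean(l9_results ** 2, axis=1))
best_run = int(np.argmax(sn))          # highest S/N = least sensitive to noise
print(np.round(sn, 2), "best L9 run:", best_run + 1)

# One-sided process capability for the chosen setting (upper spec limit only).
usl, samples = 1.6, l9_results[best_run]
cpk = (usl - samples.mean()) / (3 * samples.std(ddof=1))
print("Cpk:", round(cpk, 2))
```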

Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.

1921 Knowledge and Organisational Success: Developing a Scale of Knowledge Framework

Authors: Mohammed Almohammedali, Peter Duncan, David Edgar

Abstract:

The aim of this exploratory research is to further understand how organisations can evaluate the activities that generate knowledge creation in order to meet changing stakeholder expectations. A Scale of Knowledge (SoK) Framework is proposed which links knowledge management and organisational activities to changing stakeholder expectations. The framework was informed by the knowledge management literature, as well as by empirical work conducted via a single case study of a multi-site hospital organisation in Saudi Arabia. Eight in-depth semi-structured interviews were conducted with managers from across the organisation regarding current and future stakeholder expectations, organisational strategy/activities and knowledge management. Data were analysed using thematic analysis and a hierarchical value map technique to identify activities that can produce further knowledge and consequently affect how stakeholder expectations are met. The SoK Framework may be useful to practitioners as an analytical aid for determining whether current organisational activities produce organisational knowledge that helps them meet (increasingly higher levels of) stakeholder expectations. The limitations of the research and avenues for future development of the proposed framework are discussed.

Keywords: Knowledge creation, knowledge management, organisational knowledge, scale of knowledge, knowledge impact.

1920 Response Delay Model: Bridging the Gap in Urban Fire Disaster Response System

Authors: Sulaiman Yunus

Abstract:

The need for modeling the response to urban fire disasters cannot be overemphasized, as recurrent fire outbreaks have gutted most cities of the world. This necessitates a prompt and efficient response system in order to mitigate the impact of the disaster. Promptness, as a function of time, is seen to be the fundamental determinant of the efficiency of a response system and of the magnitude of a fire disaster. Delay, resulting from several factors, is one of the major determinants of the promptness of a response system and also of the magnitude of a fire disaster. The Response Delay Model (RDM) intends to bridge the gap in urban fire disaster response systems by incorporating and synchronizing the delay moments in measuring the overall efficiency of a response system and determining the magnitude of a fire disaster. The model identifies two delay moments (pre-notification and intra-reflex-sequence delay) that can be elastic and that collectively play a significant role in influencing the efficiency of a response system. Due to variation in the elasticity of the delay moments, the model provides for measuring the length of delays in order to arrive at a standard average delay moment for different parts of the world, taking into consideration geographic location, level of preparedness and awareness, technological advancement, and socio-economic and environmental factors. It is recommended that participatory research be embarked on locally and globally to determine standard average delay moments within each phase of the system, so as to enable determining the efficiency of response systems and predicting fire disaster magnitudes.

Keywords: Delay moment, fire disaster, reflex sequence, response, response delay moment.

1919 The Conceptual Design Model of an Automated Supermarket

Authors: Sathya Narayanan V., Sidharth P., Sanal Kumar. V. R.

Abstract:

The success of any retail business is predisposed by its swift response and its knack for understanding the constraints and requirements of customers. In this paper, a conceptual design model of an automated, customer-friendly supermarket is proposed. In this model, 10-sided, space-benefited, regular-polygon-shaped gravity shelves have been designed for goods storage, and effective customer-specific algorithms have been built in for quick automatic delivery of the randomly listed goods. The algorithm is developed with two main objectives, viz., delivery time and priority. To meet these objectives, the randomly listed items are reorganized according to the critical path of the robotic arm specific to the identified shop and its layout, and the items are categorized according to the demand, shape, size, similarity and nature of the product for an efficient pick-up, packing and delivery process. We conjecture that the proposed automated supermarket model reduces business operating costs with much customer satisfaction, warranting a win-win situation.

Keywords: Automated Supermarket, Electronic Shopping, Polygon-shaped Rack, Shortest Path Algorithm for Shopping.

1918 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand

Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo

Abstract:

An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project’s likely construction costs (initial and final), and subsequent cost control activities should prevent the unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, the observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and the FTS of construction projects, with a view to identifying the risk factors responsible for the observed variance. Data were sourced through interviews, and risk factors were identified using thematic analysis. Access was obtained to project files from the records of the study participants (consultant quantity surveyors), and document analysis was employed to complement the interview responses. The study findings revealed discrepancies between ECPs and FTS in the region of -14% to +16%. It is opined in this study that the identified risk factors were responsible for the variability observed. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by quantity surveyors. Further, whilst inherent risks in construction project developments are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and the successful delivery of construction projects. The findings contribute significantly by providing quantitative confirmation of the theoretical conclusions generated in the literature from around the world, thereby adding to and consolidating existing knowledge.

Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.

1917 Differences in the Perception of Behavior Problems in Pre-school Children among the Teachers and Parents

Authors: Jana Kožárová

Abstract:

Even though behavior problems in pre-school children might be considered transitional problems that may disappear on the transition into elementary school, the issue needs considerable attention because behavioral patterns are adopted especially at this age. A common issue in the process of eliminating behavior problems in pre-school children is the difference in the perception of the importance and gravity of the symptoms. Underestimation of children's problems by parents often results in conflicts with kindergarten teachers. Thus, the child does not get the support that his/her problems require, which might result in school failure and negatively influence future school performance and success. The research sample consisted of 4 children with behavior problems, their teachers and parents. To determine the most problematic areas in each child's behavior, the Child Behavior Checklist (CBCL), filled in by parents, and the Caregiver-Teacher Report Form (C-TRF), filled in by teachers, were used. Scores from the CBCL and the C-TRF were compared with the Pearson correlation coefficient in order to find differences in the perception of behavior problems in pre-school children.
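
The comparison of parent and teacher ratings reduces to a Pearson correlation between paired scores, as in the minimal sketch below; the four score pairs are hypothetical and are not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical problem-scale scores for the same four children as rated by
# parents (CBCL) and teachers (C-TRF); values are illustrative only.
parent_scores = np.array([62, 70, 55, 66], dtype=float)
teacher_scores = np.array([58, 75, 60, 61], dtype=float)

r, p_value = pearsonr(parent_scores, teacher_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A low or non-significant r would indicate that parents and teachers
# perceive the severity of the same children's problems differently.
```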

Keywords: Behavior problems, child behavior checklist, caregiver/teacher form, Pearson correlation coefficient, pre-school age.

1916 Air Quality Forecast Based on Principal Component Analysis-Genetic Algorithm and Back Propagation Model

Authors: Bin Mu, Site Li, Shijin Yuan

Abstract:

Under the circumstances of environmental deterioration, people are increasingly concerned about the quality of the environment, especially air quality. As a result, it is of great value to give an accurate and timely forecast of the AQI (air quality index). In order to simplify the influencing factors of air quality in a city and forecast the city’s AQI for the next day, this study used MATLAB and constructed a PCA-GABP mathematical model. Specifically, the study first applied principal component analysis (PCA) to the factors influencing tomorrow's AQI, including today's weather, industrial waste gas and IAQI data. Then, a back propagation neural network (BP), optimized by a genetic algorithm (GA), was used to forecast tomorrow's AQI. To verify the validity and accuracy of the PCA-GABP model's forecasting capability, two statistical indices were used to evaluate the AQI forecast results (normalized mean square error and fractional bias). Finally, the mean square error was reduced by optimizing the individual gene structure in the genetic algorithm and adjusting the parameters of the back propagation model. In conclusion, the model's AQI forecasting performance is comparatively convincing, and the model is expected to have a positive effect on AQI forecasting in the future.
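
Although the study was implemented in MATLAB, the PCA-then-BP pipeline can be outlined in Python as below; the synthetic features, network size and train/test split are assumptions, and the genetic-algorithm tuning step is omitted for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical daily records: weather, waste-gas and IAQI features today (X)
# and the observed AQI tomorrow (y); synthetic stand-in data.
X = rng.normal(size=(365, 12))
y = 80 + 15 * X[:, 0] - 10 * X[:, 3] + 5 * rng.normal(size=365)

# PCA reduces the correlated inputs, then a back-propagation network (MLP)
# learns the mapping; the paper additionally tunes the network with a GA.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=5),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X[:300], y[:300])
pred = model.predict(X[300:])

# Normalized mean square error, one of the evaluation indices mentioned above.
nmse = np.mean((pred - y[300:]) ** 2) / np.var(y[300:])
print(f"NMSE on held-out days: {nmse:.3f}")
```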

Keywords: AQI forecast, principal component analysis, genetic algorithm, back propagation neural network model.

1915 Effectual Role of Local Level Partnership Schemes in Affordable Housing Delivery

Authors: Hala S. Mekawy

Abstract:

Affordable housing delivery for low and lower-middle income families is a prominent problem in many developing countries; governments alone are unable to address this challenge due to diverse financial and regulatory constraints, and the private sector's contribution is rare and assists only middle-income households, even when institutional and legal reforms are conducted to persuade it to go down-market. Moreover, the market-enabling policy measures advocated by the World Bank since the early nineties have been strongly criticized and proven inappropriate for developing-country contexts, where it is highly unlikely that the formal private sector can reach the low-income population. In addition to governments and private developers, affordable housing delivery systems involve an intricate network of relationships between a diverse range of actors. Collaboration between them has proven to be vital, and hence an approach based on partnership schemes for affordable housing delivery has emerged. The basic premise of this paper is that addressing housing affordability challenges in Egypt demands direct public support, as markets and market actors alone would never succeed in delivering decent affordable housing to low and lower-middle income groups. It argues that this support would ideally come through local-level partnership schemes, with a leading role for decentralized local government and with partners identified according to specific local conditions. It attempts to identify the major attributes that would ensure the fulfillment of the goals of such schemes in the Egyptian context. This is based upon evidence from diverse worldwide experiences, in addition to the main outcomes of a questionnaire administered to specialists and key actors in the field.

Keywords: Affordable housing, partnership schemes.

1914 Improving Production Capacity through Efficient PPC System: Lesson from Leather Manufacturing

Authors: Mengist Hailemariam, Silma Yoseph

Abstract:

A well-designed and well-executed Production Planning and Control (PPC) system is one of the key levers for superior performance in the current manufacturing set-up. Hence, measuring PPC system performance has become a necessity for long-term success. The present study examined PPC-related issues that impact the production capacity and productivity of leather companies, with special focus on Kombolcha Tannery Share Company (KTSC), Ethiopia. Physical observation, interviews and questionnaires were used to generate the necessary information from the respondents and reach valid conclusions. Company annual reports were consulted and analyzed to triangulate the primary data. Consequently, the study revealed that KTSC runs below its capacity due to the inefficient PPC system in use, for which the root causes were identified. The study thereby conceptualizes a PPC system improvement framework comprising three pillars, viz., management culture, internal capability and performance measurement, together with key considerations for each. The study findings enable the company to recognize the importance of an efficient PPC system as a source of competitive advantage. They also aid managers in evaluating various PPC execution schemes to enhance productivity.

Keywords: Ethiopia, Leather manufacturing, Production planning and control, PPC improvement framework.

1913 A Combinatorial Approach to Planning Manufacturing Safety Programme

Authors: Kazeem A. Adebiyi

Abstract:

Despite many success stories of manufacturing safety, many organizations remain reluctant, perceiving safety programmes as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading indicator measures. The study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of preventions and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were employed to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives A, B and C were obtained, resulting in 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, with savings experienced in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
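
The combinatorial cost-benefit idea can be illustrated with a few lines of Python that enumerate every combination of prevention activities and rank them by net monetary saving; the activities, costs and prevented-loss values are invented, and prevented losses are assumed to add up, which is a simplification.

```python
from itertools import combinations

# Hypothetical prevention activities with annual cost and expected value of
# losses prevented (monetary units); figures are illustrative only.
activities = {
    "PPE":            {"cost": 40_000, "prevented_loss": 110_000},
    "training":       {"cost": 25_000, "prevented_loss": 70_000},
    "machine_guards": {"cost": 60_000, "prevented_loss": 95_000},
    "housekeeping":   {"cost": 10_000, "prevented_loss": 20_000},
}

def net_saving(strategy):
    """Monetary saving/loss = value of preventions - cost of the strategy."""
    cost = sum(activities[a]["cost"] for a in strategy)
    value = sum(activities[a]["prevented_loss"] for a in strategy)
    return value - cost

# Enumerate every non-empty combination of activities and rank by net saving.
all_strategies = [c for r in range(1, len(activities) + 1)
                  for c in combinations(activities, r)]
best = max(all_strategies, key=net_saving)
print("Best strategy:", best, "net saving:", net_saving(best))
```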

Keywords: Combination, Manufacturing Safety, Monetary Savings, Prevention Strategies.

1912 Dynamic Features Selection for Heart Disease Classification

Authors: Walid MOUDANI

Abstract:

The healthcare environment is generally perceived as being information rich yet knowledge poor, since there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered by applying data mining techniques in healthcare systems. In this study, a proficient methodology is presented for extracting significant patterns from coronary heart disease data warehouses for heart attack prediction, which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of reduced features of high interest by using the rough sets technique associated with dynamic programming, and to validate the classification using a Random Forest (RF) decision tree to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions according to the medical profiles of patients. Moreover, expert knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
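
As an illustration of the classification stage, the sketch below trains a Random Forest on a synthetic stand-in for a reduced feature set and reports cross-validated accuracy; the data are simulated, and the rough-sets/dynamic-programming feature reduction is assumed to have already produced the input columns.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Synthetic stand-in for the clinical data: rows are patients, columns are a
# reduced subset of risk-factor features (e.g. age, blood pressure, cholesterol).
X = rng.normal(size=(525, 6))
risk = X[:, 0] + 0.8 * X[:, 2] + 0.5 * rng.normal(size=525)
y = (risk > 0.5).astype(int)          # 1 = risky heart disease case

# Random Forest classifier evaluated with 5-fold cross-validation.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("Mean CV accuracy:", scores.mean().round(3))
```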

Keywords: Multi-Classifier Decisions Tree, Features Reduction, Dynamic Programming, Rough Sets.

1911 Temperature Susceptibility of Multigrade Bitumen Asphalt and an Approach to Account for Temperature Variation through Deep Pavements

Authors: Brody R. Clark, Chaminda Gallage, John Yeaman

Abstract:

Multigrade bitumen asphalt is a quality asphalt product that is not utilised in many places globally. Multigrade bitumen is believed to be less sensitive to temperature, which gives it an advantage over conventional binders. Previous testing has shown that asphalt temperature changes greatly with depth, yet the current industry standard is to nominate a single temperature for design. For the detailed design of asphalt roads, asphalt layers should perhaps be divided into nominal layer depths, and different modulus and fatigue equations/values should be used to reflect the temperature of each respective layer. A compilation of previous laboratory testing conducted on multigrade bitumen asphalt beams under a range of temperatures and loading conditions was analysed. The samples tested included 0% or 15% recycled asphalt pavement (RAP) to determine what impact the recycled material has on the fatigue life and stiffness of the pavement. This paper investigated the temperature susceptibility of multigrade bitumen asphalt pavements compared to conventional binders by combining previous testing that included a sweep of fatigue tests, the development of complex modulus master curves for each mix, and a study of how pavement temperature changes through pavement depth. The investigation found that the final design of the pavement is greatly affected by the nominated pavement temperature and the respective material properties. This paper outlines a potential revision to the current design approach for asphalt pavements and proposes that further investigation is needed into pavement temperature and its incorporation into design.

Keywords: Asphalt, complex modulus, fatigue life, flexural stiffness, four-point bending, master curves, multigrade bitumen, thermal gradient.

1910 Effect of Environmental Factors on Photoreactivation of Microorganisms under Indoor Conditions

Authors: Shirin Shafaei, James R. Bolton, Mohamed Gamal El Din

Abstract:

Ultraviolet (UV) disinfection causes damage to the DNA or RNA of microorganisms, but many microorganisms can repair this damage after exposure to near-UV or visible wavelengths (310–480 nm) by a mechanism called photoreactivation. Photoreactivation is gaining more attention because it can reduce the efficiency of UV disinfection of wastewater several hours after treatment. The focus of much photoreactivation research on single species has left a considerable gap in knowledge about complex natural communities of microorganisms and their response to UV treatment. In this research, photoreactivation experiments were carried out on the influent of the UV disinfection unit at a municipal wastewater treatment plant (WWTP) in Edmonton, Alberta, after exposure to a medium-pressure (MP) UV lamp system, to evaluate the effect of environmental factors on the photoreactivation of microorganisms in actual municipal wastewater. The effects of reactivation fluence, temperature, and river water on the photoreactivation of total coliforms were examined under indoor conditions. The results showed that higher effective reactivation fluence values (up to 20 J/cm2) and higher temperatures (up to 25 °C) increased the photoreactivation of total coliforms. However, increasing the percentage of river water in mixtures of the effluent and river water decreased the photoreactivation of the mixtures. The results of this research can help the municipal wastewater treatment industry to examine the environmental effects of discharging their effluents into receiving waters.

Keywords: Photoreactivation, reactivation fluence, river water, temperature, ultraviolet disinfection, wastewater effluent.

1909 An Overview of Georgia’s Economic Growth Since 2012: Current Status, Challenges, and Opportunities for Future Development

Authors: V. Benidze

Abstract:

After the Rose Revolution of 2003, Georgia achieved unparalleled socioeconomic success. However, economic growth since 2012 has been sluggish and certainly not enough to rapidly improve the country’s standard of living, which remains substantially low compared to that of developed nations. Recent poor economic performance has shown that some key challenges need to be addressed if Georgia is to achieve the high future economic growth that will decrease the poverty rate and create a middle class in the country. This paper offers a detailed analysis of Georgia's economic performance since 2012 and identifies the key challenges facing the country’s economy. The main challenge going forward will be transforming Georgia from a consumption-driven to a production-oriented economy. Mobilizing domestic investment through savings, attracting foreign investment into tradable sectors and expanding the country’s export base will be crucial in facilitating this structural transformation. As the outcome of the research, the paper suggests a strategy for accelerating Georgia’s future economic growth and offers recommendations based on the relevant conclusions.

Keywords: Challenges, development, economic growth, economic policy, Georgia.

1908 Intertidal Fixed Stake Net Trap (Hadrah) Fishery in Kuwait: Distribution, Catch Rate and Species Composition

Authors: Ali F. Al-Baz, Mohsen M. Al-Husaini, James M. Bishop

Abstract:

The intertidal fixed stake net trap (Hadrah) is one of the oldest fishing gears used throughout the Arabian Gulf countries since the 1800s and also one of the most efficient methods of capturing fish from the intertidal area. This study describes the hadrah fishery in Kuwait.

From October 2001 to December 2002, more than 37,372 specimens representing 95 species (89 fish, 2 mollusks and 4 crustaceans) were measured from hadrah, located in three different areas along Kuwait's coast. In Kuwait Bay, catch rates averaged 62 kg/sir-day (from 14 kg/sir-day in February to 160 kg/sir-day in October 2002). Commercial species accounted for 41% of the catches. Catches from Failakah Island averaged 96 kg/sir-day from June to September, with 61% of the catch being commercial species. In the southern area, catches averaged only 32 kg/sir-day and only 34% were commercially important.

Forty percent of the hadrah catches were juveniles, which shows that Kuwait’s shallow intertidal waters, particularly in Kuwait Bay, serve as prime nursery habitat. To maintain ecosystem biodiversity and the recruitment success of the fishes, we recommend that all hadrah be removed from Kuwait Bay. In the future, removal of hadrah from other locations should also be considered.

Keywords: Catch and effort, Hadrah, Intertidal Fixed stake net, Kuwait, Species composition.

1907 Empirical Process Monitoring Via Chemometric Analysis of Partially Unbalanced Data

Authors: Hyun-Woo Cho

Abstract:

Real-time or in-line process monitoring frameworks are designed to give early warnings of a fault along with meaningful identification of its assignable causes. In the pattern recognition areas of artificial intelligence and machine learning, various promising approaches have been proposed, such as kernel-based nonlinear machine learning techniques. This work presents a kernel-based empirical monitoring scheme for batch-type production processes with the small-sample-size problem of partially unbalanced data: measurement data from normal operations are easy to collect, whilst data on special events or faults are difficult to collect. In such situations, noise filtering techniques can help enhance process monitoring performance. Furthermore, preprocessing of the raw process data is used to remove unwanted variation. The performance of the monitoring scheme was demonstrated using three-dimensional batch data. The results showed that the monitoring performance improved significantly in terms of the detection success rate for process faults.
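
The abstract does not name the specific kernel method, so the sketch below uses a one-class SVM with an RBF kernel as a representative kernel-based monitor trained on normal-operation data only, which matches the unbalanced-data setting described; the process variables and fault pattern are simulated.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)

# Unbalanced setting: plenty of normal-operation batches, very few faults.
normal = rng.normal(loc=0.0, scale=1.0, size=(300, 3))   # 3 process variables
faulty = rng.normal(loc=3.0, scale=1.5, size=(5, 3))

# Kernel-based monitoring model fitted to normal data only.
scaler = StandardScaler().fit(normal)
monitor = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
monitor.fit(scaler.transform(normal))

# New measurements are flagged as faults when predicted as outliers (-1).
new_batches = np.vstack([normal[:3], faulty])
flags = monitor.predict(scaler.transform(new_batches))
print(["fault" if f == -1 else "normal" for f in flags])
```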

Keywords: Process Monitoring, kernel methods, multivariate filtering, data-driven techniques, quality improvement.

1906 How Celebrities can be used in Advertising to the Best Advantage?

Authors: Laimona Sliburyte

Abstract:

The ever-increasing product diversity and competition on the market for goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue for making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is by invoking celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not automatically guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature a famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.

Keywords: Advertising, celebrity, celebrity endorsements, effectiveness of celebrity.
