Search results for: process model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27928

22678 The Assessment of Animal Welfare at Slaughterhouses in Badung District, Bali Province

Authors: Ulil Afidah, Mustopa

Abstract:

The study aims to assess animal welfare at slaughterhouses in Badung district, Bali province. The study was conducted over ten days, observing five cattle per day for a total of 50 cattle. Observation began when a cow was unloaded from the pick-up truck to be slaughtered, and the findings were subsequently recorded in a questionnaire provided for the purpose. The results showed that the slaughterhouses in Badung district implemented animal welfare practices that fulfilled 63% of the requirements before the slaughtering process and 76% during the slaughtering process. Based on these results, it can be concluded that the slaughterhouses of Badung district already fulfill the requirements.

Keywords: animal welfare, assessment, Badung district, slaughterhouses

Procedia PDF Downloads 265
22677 Predicting Photovoltaic Energy Profile of Birzeit University Campus Based on Weather Forecast

Authors: Muhammad Abu-Khaizaran, Ahmad Faza’, Tariq Othman, Yahia Yousef

Abstract:

This paper presents a study to provide sufficient and reliable information for constructing a Photovoltaic energy profile of the Birzeit University (BZU) campus based on the weather forecast. The developed Photovoltaic energy profile helps to predict the energy yield of Photovoltaic systems from the weather forecast and hence supports planning of energy production and consumption. Two models are developed in this paper: a Clear Sky Irradiance model and a Cloud-Cover Radiation model, to predict the irradiance for a clear-sky day and a cloudy day, respectively. The adopted procedure for developing these models takes into consideration two levels of abstraction. First, irradiance and weather data were acquired by a sensory (measurement) system installed on the rooftop of the Information Technology College building on the Birzeit University campus. Second, power readings of a fully operational 51 kW commercial Photovoltaic system installed on the rooftop of the adjacent College of Pharmacy-Nursing and Health Professions building are used to validate the output of a simulation model and to help refine its structure. Based on a comparison between a mathematical model, which calculates the Clear Sky Irradiance for the University's location, and two sets of accumulated measured data, it is found that the simulation system closely matches the installed PV power station on clear-sky days. However, these comparisons show a divergence between the expected and actual energy yields in extreme weather conditions, including clouding and soiling effects. Therefore, a more accurate irradiance prediction model, the Cloud-Cover Radiation Model (CRM), was developed; it takes into consideration weather factors that affect irradiance, such as relative humidity and cloudiness. The equivalent mathematical formulas implement corrections to provide more accurate inputs to the simulation system.
The results of the CRM show a very good match with the actual measured irradiance during a cloudy day. The developed Photovoltaic profile helps in predicting the output energy yield of the Photovoltaic system installed on the University campus from the predicted weather conditions. For both models, the simulation and practical results agree very well.
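The abstract does not reproduce its model equations, but a minimal clear-sky-plus-cloud-cover sketch in the same spirit can be built from classical formulas: Cooper's declination, the Haurwitz clear-sky model, and the Kasten-Czeplak cloud-cover correction. The coefficients and the Birzeit latitude used below are illustrative assumptions, not the authors' fitted models.

```python
import math

def solar_zenith_cos(lat_deg, day_of_year, solar_hour):
    # Cooper's declination formula and the hour angle give cos(zenith)
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    hour_angle = math.radians(15 * (solar_hour - 12))
    lat = math.radians(lat_deg)
    return (math.sin(lat) * math.sin(decl)
            + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))

def clear_sky_ghi(cos_z):
    # Haurwitz clear-sky global horizontal irradiance (W/m^2)
    if cos_z <= 0:
        return 0.0  # sun below the horizon
    return 1098.0 * cos_z * math.exp(-0.059 / cos_z)

def cloud_cover_ghi(ghi_clear, cloud_fraction):
    # Kasten-Czeplak cloud-cover correction (cloud_fraction in [0, 1])
    return ghi_clear * (1 - 0.75 * cloud_fraction ** 3.4)
```

For example, solar noon near the summer solstice at roughly the Birzeit latitude (about 31.96° N, an assumed value) gives a near-overhead sun, and a fully overcast sky reduces the clear-sky irradiance to a quarter of its value under this correction.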

Keywords: clear-sky irradiance model, cloud-cover radiation model, photovoltaic, weather forecast

Procedia PDF Downloads 117
22676 Studying the Effect of Ethanol and Operating Temperature on Purification of Lactulose Syrup Containing Lactose

Authors: N. Zanganeh, M. Zabet

Abstract:

Lactulose is a synthetic disaccharide with remarkable applications in the food and pharmaceutical fields. Lactulose is not found in nature; it is produced by an isomerization reaction of lactose in an alkaline environment. This reaction has a very low yield, since a significant amount of lactose stays unreacted in the system. Purification of lactulose is therefore difficult and costly. Previous studies have revealed that the solubilities of lactose and lactulose in ethanol differ significantly. Considering that solubility is also affected by temperature, we investigated the effect of ethanol and temperature on the separation of lactose from syrup containing lactose and lactulose. For this purpose, a saturated solution containing lactulose and lactose was first prepared at three different temperatures: 25 °C (room temperature), 31 °C, and 37 °C. Five samples, each containing 2 g of the saturated solution, were taken, and 2 g, 3 g, 4 g, 5 g, and 6 g of ethanol were added separately to the sampling tubes. The sampling tubes were then kept at their respective temperatures. The concentrations of lactose and lactulose after the separation process were measured by High-Performance Liquid Chromatography (HPLC). The results showed that ethanol has a much greater impact than operating temperature on the purification process. It was also observed that the maximum rate of separation occurred at the initial amount of added ethanol.

Keywords: lactulose, lactose, purification, solubility

Procedia PDF Downloads 441
22675 Toward Indoor and Outdoor Surveillance using an Improved Fast Background Subtraction Algorithm

Authors: El Harraj Abdeslam, Raissouni Naoufal

Abstract:

The detection of moving objects from video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach for moving-object detection and tracking is background subtraction. Many background subtraction approaches have been suggested, but they are sensitive to illumination changes, and the solutions proposed to bypass this problem are time consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and focus mainly on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: compensating for illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. It thus mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation to remove residual noise. For the experimental tests, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
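The modeling and morphology phases can be sketched with a deliberately simplified version of the pipeline: a single Gaussian per pixel instead of the paper's K=5 mixture, and 3x3 erosion/dilation implemented with array shifts. This is an illustration of the technique, not the authors' algorithm; thresholds and learning rate are assumed values.

```python
import numpy as np

def init_model(first_frame):
    # one Gaussian (mean, variance) per pixel; a simplification of the K=5 GMM
    return first_frame.astype(float), np.full(first_frame.shape, 15.0 ** 2)

def update(mean, var, frame, alpha=0.05, k=2.5):
    frame = frame.astype(float)
    d2 = (frame - mean) ** 2
    fg = d2 > (k ** 2) * var  # foreground where deviation exceeds k standard deviations
    # update the model only where the pixel matched the background
    mean = np.where(fg, mean, (1 - alpha) * mean + alpha * frame)
    var = np.where(fg, var, (1 - alpha) * var + alpha * d2)
    return mean, var, fg

def erode3(mask):
    # 3x3 binary erosion via shifted AND (wrap-around at borders; fine for a sketch)
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= np.roll(np.roll(mask, dy, 0), dx, 1)
    return out

def dilate3(mask):
    # 3x3 binary dilation via shifted OR
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= np.roll(np.roll(mask, dy, 0), dx, 1)
    return out
```

Applying erosion followed by dilation (a morphological opening) removes isolated noise pixels while preserving blob-sized moving objects, which is the role morphology plays in the paper's third phase.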

Keywords: video surveillance, background subtraction, contrast limited histogram equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes

Procedia PDF Downloads 242
22674 From Intuitive to Constructive Audit Risk Assessment: A Complementary Approach to CAATTs Adoption

Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy

Abstract:

The use of the audit risk model in auditing has faced limitations and difficulties, leading auditors to rely on a conceptual level of its application. The qualitative approach to assessing risks has resulted in divergent risk assessments, affecting the quality of audits and decision-making on the adoption of CAATTs. This study aims to investigate risk factors impacting the implementation of the audit risk model and to propose a complementary risk-based instrument (key risk indicators, KRIs) to form substantive risk judgments and mitigate heightened risks of material misstatement (RMM). The study addresses the question of how risk factors impact the implementation of the audit risk model, improve risk judgments, and aid the adoption of CAATTs. The study uses a three-stage scale development procedure involving a pretest and a subsequent study with two independent samples. The pretest involves an exploratory factor analysis, while the subsequent study employs confirmatory factor analysis for construct validation. Additionally, the authors test the ability of the KRIs to predict the audit effort needed to mitigate heightened RMM. Data were collected through two independent samples involving 767 participants and analyzed using exploratory and confirmatory factor analysis to assess scale validity and construct validation. The suggested KRIs, comprising two risk components and seventeen risk items, are found to have high predictive power in determining the audit effort needed to reduce RMM. The study validates the suggested KRIs as an effective instrument for risk assessment and decision-making on the adoption of CAATTs. This study contributes to the existing literature by implementing a holistic approach to risk assessment and providing a quantitative expression of assessed risks. It bridges the gap between intuitive risk evaluation and the theoretical domain, clarifying the mechanism of risk assessments.
It also helps improve the uniformity and quality of risk assessments, aiding audit standard-setters in issuing updated guidelines on CAATT adoption. A few limitations and recommendations for future research should be mentioned. First, the process of developing the scale was conducted in the Israeli auditing market, which follows the International Standards on Auditing (ISAs). Although ISAs are adopted in European countries, for greater generalization, future studies could focus on other countries that adopt additional or local auditing standards. Second, this study revealed risk factors that have a material impact on the assessed risk. However, there could be additional risk factors that influence the assessment of the RMM. Therefore, future research could investigate other risk segments, such as operational and financial risks, to bring a broader generalizability to our results. Third, although the sample size in this study fits acceptable scale development procedures and enables drawing conclusions from the body of research, future research may develop standardized measures based on larger samples to reduce the generation of equivocal results and suggest an extended risk model.

Keywords: audit risk model, audit efforts, CAATTs adoption, key risk indicators, sustainability

Procedia PDF Downloads 62
22673 Development of a Classification Model for Value-Added and Non-Value-Added Operations in Retail Logistics: Insights from a Supermarket Case Study

Authors: Helena Macedo, Larissa Tomaz, Levi Guimarães, Luís Cerqueira-Pinto, José Dinis-Carvalho

Abstract:

In retail logistics, the pursuit of operational efficiency and cost optimization requires a rigorous distinction between value-added and non-value-added activities, and in today's competitive market both are paramount for retail businesses. This research paper focuses on the development of a classification model adapted to the retail sector, specifically examining internal logistics processes. Based on a comprehensive analysis conducted in a retail supermarket located in the north of Portugal, covering various aspects of internal retail logistics, this study questions the concept of value and the definition of wastes traditionally applied in a manufacturing context and proposes a new way to assess activities in the context of internal logistics. The study combines quantitative data analysis with qualitative evaluations. The proposed classification model offers a systematic approach to categorizing operations within the retail logistics chain, providing actionable insights for decision-makers to streamline processes, enhance productivity, and allocate resources more effectively. This model not only contributes to academic discourse but also serves as a practical tool for retail businesses, aiding in the enhancement of their internal logistics dynamics.

Keywords: lean retail, lean logistics, retail logistics, value-added and non-value-added

Procedia PDF Downloads 42
22672 Expanding the Atelier: Design-Led Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality

Authors: David Sinfield, Thomas Cochrane, Marcos Steagall

Abstract:

While there is much hype around the potential of mobile virtual reality (VR), the two key critical success factors are the ease of the user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary technologies that failed to gain mainstream adoption or were quickly superseded; examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. However, we argue that such failures result from curriculum design that substitutes new technologies into pre-existing pedagogical strategies focused on teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Visual communication design education, including graphic design, illustration, photography, and design process, is heavily based on the traditional classroom environment, in which student interaction takes place both at peer level and through teacher feedback. This makes for a healthy creative learning environment, but it does raise other issues in terms of student-to-teacher ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and we consequently see a decline in the students' creative work. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper discusses the outcomes of a student project that considered the virtual classroom and the techniques involved. The Atelier learning environment is especially suited to the visual communication model, as it deals with the creative processing of ideas that need to be shared in a collaborative manner.
This has proven to be a successful model over the years in the traditional form of design education, but there has more recently been a shift in thinking as we move towards a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. When integrated into the learning experience, Augmented Reality can improve the learning motivation and engagement of students. This paper outlines some of the processes used and the findings from the semester-long project.

Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications

Procedia PDF Downloads 229
22671 3D CFD Modelling of the Airflow and Heat Transfer in Cold Room Filled with Dates

Authors: Zina Ghiloufi, Tahar Khir

Abstract:

A transient three-dimensional computational fluid dynamics (CFD) model is developed to determine the velocity and temperature distributions at different positions in a cold room during the pre-cooling of dates. The turbulence model used is the k-ω Shear Stress Transport (SST) model with the standard wall function. The numerical results obtained show that the cooling rate is not uniform inside the room; the product in the middle of the room has a slower cooling rate. This cooling heterogeneity has a large effect on energy consumption during cold storage.

Keywords: CFD, cold room, cooling rate, dates, numerical simulation, k-ω (SST)

Procedia PDF Downloads 221
22670 Tests for Zero Inflation in Count Data with Measurement Error in Covariates

Authors: Man-Yu Wong, Siyu Zhou, Zhiqiang Cao

Abstract:

In quality-of-life research, health service utilization is an important determinant of medical resource expenditures on colorectal cancer (CRC) care. A better understanding of the increased utilization of health services is essential for optimizing the allocation of healthcare resources and thus enhancing service quality, especially in regions with high expenditure on CRC care such as Hong Kong. In assessing the association between health-related quality of life (HRQOL) and health service utilization in patients with colorectal neoplasm, count data models can be used, which account for overdispersion or extra zero counts. In our data, the HRQOL evaluation is a self-reported measure obtained from a questionnaire completed by the patients, so misreports and variations in the data are inevitable. Moreover, there are more zero counts in the observed number of clinical consultations (observed frequency of zero counts = 206) than expected from a Poisson distribution with mean equal to 1.33 (expected frequency of zero counts = 156). This suggests that an excess of zero counts may exist. Therefore, we study tests for detecting zero inflation in models with measurement error in covariates. Method: Under the classical measurement error model, the approximate likelihood function for the zero-inflated Poisson (ZIP) regression model can be obtained, and the Approximate Maximum Likelihood Estimate (AMLE) can be derived accordingly; it is consistent and asymptotically normally distributed. By calculating the score function and Fisher information based on the AMLE, a score test is proposed to detect a zero-inflation effect in the ZIP model with measurement error. The proposed test asymptotically follows a standard normal distribution under H0, and it is consistent with the test proposed for a zero-inflation effect when there is no measurement error.

Results: Simulation results show that the empirical power of our proposed test is the highest among existing tests for zero inflation in the ZIP model with measurement error. In the real data analysis, with or without considering measurement error in covariates, the existing tests and our proposed test all imply that H0 should be rejected with a P-value less than 0.001; i.e., the zero-inflation effect is very significant, and the ZIP model is superior to the Poisson model for analyzing these data. However, if measurement error in covariates is not considered, only one covariate is significant, whereas if measurement error is considered, only another covariate is significant. Moreover, the directions of the coefficient estimates for these two covariates differ between the ZIP regression models with and without measurement error. Conclusion: In our study, compared to the Poisson model, the ZIP model should be chosen when assessing the association between condition-specific HRQOL and health service utilization in patients with colorectal neoplasm, and models taking measurement error into account will produce statistically more reliable and precise information.
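The authors' measurement-error-corrected test is not reproducible from the abstract, but its no-measurement-error ancestor, van den Broek's (1995) score test for zero inflation in a Poisson model, can be sketched for the intercept-only case. The function below is that classical test, not the proposed one.

```python
import numpy as np

def zero_inflation_score_test(y):
    """van den Broek (1995) score test for zero inflation in an
    intercept-only Poisson model; the statistic is asymptotically
    chi-square with 1 degree of freedom under the Poisson null."""
    y = np.asarray(y, float)
    n = y.size
    lam = y.mean()             # MLE of the Poisson mean
    p0 = np.exp(-lam)          # Poisson probability of a zero count
    n0 = np.sum(y == 0)        # observed number of zeros
    return (n0 - n * p0) ** 2 / (n * p0 * (1 - p0) - n * lam * p0 ** 2)
```

Comparing the statistic with the chi-square(1) critical value 3.84 mirrors the abstract's informal check of 206 observed against 156 expected zero counts: a large excess of zeros drives the statistic far above the threshold.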

Keywords: count data, measurement error, score test, zero inflation

Procedia PDF Downloads 271
22669 Characteristics of Business Models of Industrial-Internet-of-Things Platforms

Authors: Peter Kress, Alexander Pflaum, Ulrich Loewen

Abstract:

The number of Internet-of-Things (IoT) platforms is steadily increasing across various industries, especially for smart factories, smart homes, and smart mobility. In the manufacturing industry, too, the number of Industrial-IoT platforms is growing. IT players, start-ups, and increasingly also established industry players and small and medium-sized enterprises introduce offerings for the connection of industrial equipment on platforms, enabled by advanced information and communication technology. Besides the offered functionalities, the established ecosystem of partners around a platform is one of the key differentiators for generating a competitive advantage. The key question is how platform operators design the business model around their platform to attract a high number of customers and partners to co-create value for the entire ecosystem. The present research tries to answer this question by determining the key characteristics of the business models of successful platforms in the manufacturing industry. To achieve this, the authors selected an explorative qualitative research approach and conducted an inductive comparative case study. The authors generated valuable descriptive insights into the business model elements (e.g., value proposition, pricing model, or partnering model) of various established platforms. Furthermore, patterns across the cases were identified to derive propositions for the successful design of business models of platforms in the manufacturing industry.

Keywords: industrial-internet-of-things, business models, platforms, ecosystems, case study

Procedia PDF Downloads 227
22668 A Biophysical Model of CRISPR/Cas9 on- and off-Target Binding for Rational Design of Guide RNAs

Authors: Iman Farasat, Howard M. Salis

Abstract:

The CRISPR/Cas9 system has revolutionized genome engineering by enabling site-directed and high-throughput genome editing, genome insertion, and gene knockdowns in several species, including bacteria, yeast, flies, worms, and human cell lines. This technology has the potential to enable human gene therapy to treat genetic diseases and cancer at the molecular level; however, the current CRISPR/Cas9 system suffers from seemingly sporadic off-target genome mutagenesis that prevents its use in gene therapy. A comprehensive mechanistic model that explains how CRISPR/Cas9 functions would enable the rational design of the guide RNAs responsible for target site selection while minimizing unexpected genome mutagenesis. Here, we present the first quantitative model of the CRISPR/Cas9 genome mutagenesis system that predicts how guide-RNA sequences (crRNAs) control target site selection and cleavage activity. We used statistical thermodynamics and the law of mass action to develop a five-step biophysical model of Cas9 cleavage and examined it in vivo and in vitro. To predict a crRNA's binding specificities and cleavage rates, we then compiled a nearest-neighbor (NN) energy model that accounts for all possible base pairings and mismatches between the crRNA and the possible genomic DNA sites. These calculations correctly predicted crRNA specificity across 5518 sites. Our analysis reveals that Cas9 activity and specificity are anti-correlated, and the trade-off between them is the determining factor in performing an RNA-mediated cleavage with minimal off-targets. To find an optimal solution, we first created a scheme of safe-design criteria for Cas9 target selection through systematic analysis of available high-throughput measurements. We then used our biophysical model to determine the optimal Cas9 expression levels and timing that maximize on-target cleavage and minimize off-target activity.
We successfully applied this approach in bacterial and mammalian cell lines to reduce off-target activity to near-background mutagenesis levels while maintaining a high on-target cleavage rate.
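The five-step model and its fitted nearest-neighbor parameters are not given in the abstract, so the sketch below only illustrates the statistical-thermodynamic site-selection idea: position-dependent binding energies fed into a Boltzmann partition over candidate genomic sites. The match/mismatch energies, seed-region weighting, and the crRNA written in its DNA alphabet are all invented for illustration.

```python
import math

# illustrative (not fitted) energy contributions, in kcal/mol
MATCH_DG = -1.0       # assumed gain per matched base pair
MISMATCH_DG = 2.0     # assumed penalty per mismatch
SEED_WEIGHT = 2.0     # PAM-proximal "seed" mismatches penalized more heavily
RT = 0.616            # gas constant times temperature at 37 C, kcal/mol

def site_dG(crrna, site):
    # total binding free energy of a crRNA against one genomic site
    dG = 0.0
    n = len(crrna)
    for i, (a, b) in enumerate(zip(crrna, site)):
        w = SEED_WEIGHT if i >= n - 10 else 1.0  # last 10 nt treated as the seed
        dG += MATCH_DG if a == b else w * MISMATCH_DG
    return dG

def binding_probabilities(crrna, sites):
    # Boltzmann-weighted partition of Cas9 binding over candidate sites
    weights = [math.exp(-site_dG(crrna, s) / RT) for s in sites]
    Z = sum(weights)
    return [w / Z for w in weights]
```

Under such a model the on-target site dominates the partition, and a seed-region mismatch suppresses binding more strongly than a PAM-distal one, which is the qualitative behavior the activity/specificity trade-off rests on.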

Keywords: biophysical model, CRISPR, Cas9, genome editing

Procedia PDF Downloads 387
22667 Consensus Reaching Process and False Consensus Effect in a Problem of Portfolio Selection

Authors: Viviana Ventre, Giacomo Di Tollo, Roberta Martino

Abstract:

The portfolio selection problem includes the evaluation of many criteria that are difficult to compare directly and is characterized by uncertain elements. It can be modeled as a group decision problem in which several experts are invited to present their assessments. In this context, it is important to study and analyze the process of reaching a consensus among group members. Indeed, owing to the various diversities among experts, reaching consensus is not necessarily simple or easily achievable. Moreover, the concept of consensus is accompanied by the concept of false consensus, which is particularly interesting in the dynamics of group decision-making processes. False consensus can alter the evaluation and selection phase of the alternatives and is the consequence of the decision-maker's inability to recognize that their preferences are conditioned by subjective structures. The present work aims to investigate the dynamics of consensus attainment in a group decision problem in which equivalent portfolios are proposed. In particular, the study analyzes the impact of the decision-maker's subjective structure during the evaluation and selection phase of the alternatives. The experimental framework is therefore divided into three phases. In the first phase, experts are asked to evaluate the characteristics of all portfolios individually, without peer comparison, arriving independently at the selection of a preferred portfolio. The experts' evaluations are used to obtain individual Analytic Hierarchy Process weightings that define the weight each expert gives to each criterion with respect to the proposed alternatives. This step provides insight into how the decision-maker's decision process develops, step by step, from goal analysis to alternative selection. The second phase describes the decision-maker's state through Markov chains.
In fact, the individual weights obtained in the first phase can be reinterpreted as transition weights from one state to another. Thus, with the construction of the individual transition matrices, the possible next state of each expert is determined from the individual weights at the end of the first phase. Finally, the experts meet, and the process of reaching consensus is analyzed by considering the individual states obtained at the previous stage and the false consensus bias. The work contributes to the study of the impact of subjective structures, quantified through the Analytic Hierarchy Process, and of how they combine with the false consensus bias in group decision-making dynamics and in the consensus reaching process for problems involving the selection of equivalent portfolios.
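The first two phases can be sketched numerically: AHP weights are the principal eigenvector of an expert's pairwise-comparison matrix (with a consistency check), and those weights can then drive a Markov transition step for the expert's state. The matrices below are illustrative, not taken from the study.

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    # principal eigenvector of a positive reciprocal matrix via power iteration
    A = np.asarray(pairwise, float)
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

def consistency_index(pairwise, w):
    # CI = (lambda_max - n) / (n - 1); near zero for consistent judgments
    A = np.asarray(pairwise, float)
    n = A.shape[0]
    lam_max = float(np.mean((A @ w) / w))
    return (lam_max - n) / (n - 1)

def markov_step(state, transition):
    # one transition of an expert's state distribution (second phase)
    return np.asarray(state, float) @ np.asarray(transition, float)
```

A pairwise matrix saying criterion 1 moderately dominates criterion 2 and strongly dominates criterion 3 yields decreasing weights that sum to one, and a row-stochastic transition matrix keeps the state a probability distribution after each step.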

Keywords: analytic hierarchy process, consensus building, false consensus effect, Markov chains, portfolio selection problem

Procedia PDF Downloads 82
22666 The Tramway in French Cities: Complication of Public Spaces and Complexity of the Design Process

Authors: Elisa Maître

Abstract:

The redeployment of tram networks in French cities has considerably modified public spaces and the way citizens use them. Beyond the image trams have of contributing to sustainable urban development, the question of user safety in these spaces has been little studied. This study is based on an analysis of the use of public spaces laid out for trams, from the standpoint of legibility and safety concerns. The study also examines to what extent the complexity of the design process, with many interactions between the numerous and varied players involved, plays a role in the genesis of these problems. This work is mainly based on the analysis of links between the uses of these re-designed public spaces (through observations, interviews of users, and accident studies) and the analysis of the design conditions and processes of the projects studied (mainly based on interviews with the actors of these projects). The practical analyses were based on three points of view: that of the planner, that of the user (based on observations and interviews), and that of the road safety expert. The cities of Montpellier, Marseille, and Nice are the three fields of study on which the demonstration of this thesis is based. On the one hand, the results of this study show that the insertion of trams complicates the public spaces of French cities. These complications, related to the restructuring of public spaces for the tram, create difficulties of use and safety concerns. On the other hand, in-depth analyses of the fully transcribed interviews have led us to develop scenarios of particular dysfunctions in the design process. These elements lead us to question the way the legibility and safety of these new forms of public space are taken into account.
An in-depth analysis of the design processes of public spaces with tram systems would also be a way of better understanding the choices made, the compromises accepted, and the conflicts and constraints at work weighing on the layout of these spaces. The results presented concerning the impact that spaces laid out for trams have on difficulty of use suggest different possibilities for improving the way in which safety for all users is taken into account when designing public spaces.

Keywords: public spaces, road layout, users, design process of urban projects

Procedia PDF Downloads 215
22665 University Short Courses Web Application Using ASP.Net

Authors: Ahmed Hariri

Abstract:

E-learning has become a necessity in advanced education. It makes communication between students and teachers easier and speeds up the process with less time and effort. With the enormous development of distance education, institutions must keep up with this age by providing a website that allows students and teachers to take full advantage of advanced education. In this regard, we developed the University Short Courses web application, specially designed for the Faculty of Computing and Information Technology, Rabigh, Kingdom of Saudi Arabia. After an elaborate review of the current state-of-the-art methods of teaching and learning, we found that instructors deliver extra short courses and workshops to enhance the knowledge of students, and that this process is completely manual. The prevailing methods of teaching and learning consume a lot of time; in this context, the University Short Courses web application will help make the process easy and user-friendly. The site allows students to view and register online for short courses conducted by instructors, and to see course start dates, end dates, and locations. It also allows instructors to upload their course materials to the site and see the students enrolled. Finally, students can print their certificates online after finishing a course. ASP.NET, JavaScript, and a SQL Server database will be used to develop the University Short Courses web application.

Keywords: e-learning, short courses, ASP.NET, SQL SERVER

Procedia PDF Downloads 118
22664 Product Feature Modelling for Integrating Product Design and Assembly Process Planning

Authors: Baha Hasan, Jan Wikander

Abstract:

This paper describes a part of the work on integrating the assembly design and assembly process planning (APP) domains. The work is based, in its first stage, on modelling assembly features to support APP. A multi-layer architecture, based on feature-based modelling, is proposed to establish a dynamic and adaptable link between product design using CAD tools and APP. The proposed approach is based on deriving "specific function" features from the "generic" assembly and form features extracted from the CAD tools. A hierarchical structure from "generic" to "specific" and from "high-level geometrical entities" to "low-level geometrical entities" is proposed in order to link the geometrical and assembly data extracted from geometrical and assembly modelers to the required processes and resources in APP. The feature concept, feature-based modelling, and feature recognition techniques are reviewed.
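The layering from low-level form features through generic assembly features to process- and resource-specific features might be encoded as plain data classes; the class and field names below are hypothetical illustrations of that hierarchy, not the paper's ontology.

```python
from dataclasses import dataclass

@dataclass
class FormFeature:
    # low-level geometrical entity extracted from the CAD model
    name: str
    geometry: str  # e.g. "hole", "pin", "slot"

@dataclass
class AssemblyFeature:
    # generic mating relation between two form features
    name: str
    base: FormFeature
    counterpart: FormFeature

@dataclass
class SpecificFunctionFeature:
    # "specific" layer: links a generic feature to APP processes and resources
    generic: AssemblyFeature
    process: str   # required assembly process, e.g. "peg-in-hole insertion"
    resource: str  # required resource, e.g. "6-DOF robot with force control"
```

A pin and a hole recognized in the CAD model would combine into a generic insertion feature, which the specific layer then maps to the process and resource data that APP needs.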

Keywords: assembly feature, assembly process planning, feature, feature-based modelling, form feature, ontology

Procedia PDF Downloads 295
22663 Developing Fire Risk Factors for Existing Small-Scale Hospitals

Authors: C. L. Wu, W. W. Tseng

Abstract:

Since the National Health Insurance (NHI) system was introduced in Taiwan in 2000, transformed small-scale hospitals have faced problems such as limited mobility of patients, shortage of nursing staff, medical pipelines breaking through fire compartments, and insufficient fire protection systems. Due to the shrinking funding scale and the aging society, fire safety in small-scale hospitals has recently given cause for concern. The aim of this study is to determine a fire risk index for small-scale hospitals through a systematic approach. The selection of fire safety mitigation methods can be regarded as a multi-attribute decision-making process which must be guided by expert groups. First, safety-related factors are identified and evaluation criteria are selected through literature review and an expert group. Second, the Fuzzy Analytic Hierarchy Process method is applied to ascertain a weighted value that rates the importance of each selected factor. Overall, sprinkler type and compartmentation are the most crucial indices in mitigating fire; that is to say, structural approaches play an important role in decreasing losses in fire events.
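The weighting step described above, aggregating pairwise expert judgements with the Fuzzy Analytic Hierarchy Process, can be sketched with Buckley's geometric-mean method. The 3x3 comparison matrix and factor names below are illustrative assumptions, not the study's expert judgements.

```python
import math

# Pairwise comparisons as triangular fuzzy numbers (l, m, u) for three
# hypothetical factors: sprinkler type, compartmentation, staffing.
M = [
    [(1, 1, 1),       (1, 2, 3),     (2, 3, 4)],
    [(1/3, 1/2, 1),   (1, 1, 1),     (1, 2, 3)],
    [(1/4, 1/3, 1/2), (1/3, 1/2, 1), (1, 1, 1)],
]

def geo_mean(row):
    """Fuzzy geometric mean of one row: componentwise over (l, m, u)."""
    n = len(row)
    return tuple(math.prod(t[k] for t in row) ** (1 / n) for k in range(3))

r = [geo_mean(row) for row in M]
total = tuple(sum(ri[k] for ri in r) for k in range(3))
# Fuzzy weight r_i * (sum r)^-1: divide l by the upper total, u by the lower.
fuzzy_w = [(ri[0] / total[2], ri[1] / total[1], ri[2] / total[0]) for ri in r]
crisp = [sum(w) / 3 for w in fuzzy_w]        # centroid defuzzification
weights = [c / sum(crisp) for c in crisp]    # normalise to sum to 1
print([round(w, 3) for w in weights])
```

With these illustrative judgements the first factor dominates, mirroring the paper's finding that structural measures carry the largest weights.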

Keywords: Fuzzy Delphi Method, fuzzy analytic hierarchy process, risk assessment, fire events

Procedia PDF Downloads 434
22662 The Nexus between Manpower Training and Corporate Compliance

Authors: Timothy Wale Olaosebikan

Abstract:

The most active resource in any organization is its manpower. Every other resource remains inactive unless there is competent manpower to handle it. Manpower training is needed to enhance productivity and the overall performance of organizations, in recognition of the important role training plays in the attainment of organizational goals. Corporate compliance conjures visions of an incomprehensible matrix of laws and regulations that defy logic and control by even the most seasoned manpower training professionals. It can also be viewed as one of the most significant problems faced in the manpower training process of any organization, and it therefore commands attention and comprehension. Consequently, this study investigated the nexus between manpower training and corporate compliance. Data for the study were collected through a questionnaire with a sample size of 265 drawn by stratified random sampling, and were analyzed using descriptive and inferential statistics. The findings show that about 75% of the respondents agree that there is a strong relationship between manpower training and corporate compliance, which underpins the organizational gains from any training process. The findings further show that most organisations do not fully comply with the rules guiding the manpower training process, making the process less effective on organizational performance, which may affect overall profitability. The study concludes that the formulation of, and compliance with, adequate rules and guidelines for manpower training will produce effective results for both employees and the organization at large. The study recommends that leaders of organizations, industries, and institutions ensure total compliance with manpower training rules on the part of both employees and the organization, and that organizations and stakeholders place strict policies on corporate compliance with manpower training at the heart of their cardinal mission.

Keywords: corporate compliance, manpower training, nexus, rules and guidelines

Procedia PDF Downloads 125
22661 The Planner's Pentangle: A Proposal for a 21st-Century Model of Planning for Sustainable Development

Authors: Sonia Hirt

Abstract:

The Planner's Triangle, an oft-cited model that visually defined planning as the search for sustainability balancing the three basic priorities of equity, economy, and environment, has influenced planning theory and practice for a quarter of a century. In this essay, we argue that the triangle requires updating and expansion. Even if planners keep sustainability as the key aspiration at the center of their imaginary geometry, the triangle's vertices have to be rethought, and planners should move on to a 21st-century concept. We propose a Planner's Pentangle with five basic priorities as the vertices of a new conceptual polygon: Wellbeing, Equity, Economy, Environment, and Esthetics (WE⁴). The WE⁴ concept more accurately and fully represents planning's history. This is especially true in the United States, where public art and public health played pivotal roles in the establishment of the profession in the late 19th and early 20th centuries. It also more accurately represents planning's future, as both health/wellness and aesthetic concerns are becoming increasingly important in the 21st century. The pentangle can become an effective tool for understanding and visualizing planning's history and present. Planning has a long history of representing urban presents and futures as conceptual models in visual form, and such models can play an important role in understanding and shaping practice. For over two decades, one such model, the Planner's Triangle, stood apart as the expression of planning's pursuit of sustainability. But if the model is outdated and insufficiently robust, it can diminish our understanding of planning practice, as well as the appreciation of the profession among non-planners. Thus, we argue for a new conceptual model of what planners do.

Keywords: sustainable development, planning for sustainable development, planner's triangle, planner's pentangle, planning and health, planning and art, planning history

Procedia PDF Downloads 123
22660 Optimization of Oxygen Plant Parameters Simulating with MATLAB

Authors: B. J. Sonani, J. K. Ratnadhariya, Srinivas Palanki

Abstract:

Cryogenic engineering is a fast-growing branch of modern technology, with applications such as gas liquefaction in the gas and metal industries, medical science, space technology, and transportation. Low-temperature technology has also enabled superconducting materials, which reduce friction and wear in various system components. Liquid oxygen, hydrogen, and helium play a vital role in space applications, and the liquefaction process produces very low-temperature liquids for research and modern applications alike. The air liquefaction system for oxygen plants in the gas industries is based on the Claude cycle. The effect of process parameters on the overall system is difficult to analyse by manual calculation, which motivates the use of process simulators for understanding the steady-state and dynamic behaviour of such systems. The parametric study of this system via MATLAB simulations provides useful guidelines for the preliminary design of an air liquefaction system based on the Claude cycle. Every organization seeks to reduce costs and to run the plant at optimum performance in order to stay competitive in the market.

Keywords: cryogenic, liquefaction, low-temperature, oxygen, Claude cycle, optimization, MATLAB

Procedia PDF Downloads 310
22659 Health Percentage Evaluation for Satellite Electrical Power System Based on Linear Stresses Accumulation Damage Theory

Authors: Lin Wenli, Fu Linchun, Zhang Yi, Wu Ming

Abstract:

To meet the demands of long life and high intelligence for satellites, the electrical power system should be provided with a self-evaluation capability for its own health condition, and any over-stress events in operation should be recorded. Based on the linear stress accumulation damage theory, accumulated damage analysis was performed on the combined thermal, mechanical, and electrical stresses for three components: the solar array, the batteries, and the power conditioning unit. An overall health percentage evaluation model for the satellite electrical power system was then built. To obtain an accurate quantity for the system health percentage, an automatic closed-loop feedback correction method for all coefficients in the evaluation model is presented. The evaluation outputs can serve as a reference for earlier fault forecasting and intervention by the ground control center or by the satellite itself.
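The core of linear stress accumulation is the Palmgren-Miner rule: damage D is the sum of cycle ratios n_i/N_i, and a component is healthy while D < 1. The sketch below turns this into a health percentage; aggregating by the worst component is a deliberate simplification of the paper's weighted, feedback-corrected model, and all cycle counts are illustrative, not flight data.

```python
# Palmgren-Miner linear damage accumulation: D = sum(n_i / N_i).

def accumulated_damage(events):
    """events: list of (cycles_experienced, cycles_to_failure_at_that_stress)."""
    return sum(n / N for n, N in events)

def health_percentage(components):
    """Overall health taken as the worst (minimum) component health, in percent."""
    healths = [max(0.0, 1.0 - accumulated_damage(ev)) for ev in components.values()]
    return 100.0 * min(healths)

system = {
    "solar_array":       [(2_000, 50_000), (500, 10_000)],  # (n_i, N_i) pairs
    "battery":           [(1_200, 6_000)],
    "power_conditioner": [(300, 30_000)],
}
print(round(health_percentage(system), 1))
```

Here the battery, with damage 0.2, limits the system to 80% health, which is the kind of quantity the on-board evaluation model would report and correct over time.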

Keywords: satellite electrical power system, health percentage, linear stresses accumulation damage, evaluation model

Procedia PDF Downloads 381
22658 Spectral Analysis Applied to Variables of Oil Wells Profiling

Authors: Suzana Leitão Russo, Mayara Laysa de Oliveira Silva, José Augusto Andrade Filho, Vitor Hugo Simon

Abstract:

Seismic and prospecting methods are currently common in the oil industry, and, as reported every day, oil is a non-renewable energy source; it is therefore easy to understand why areas of oil extraction are coveted by many nations. It is necessary to think about ways to maximize oil production. The technique of spectral analysis can be used to analyze the behavior of variables already defined in the oil well profile. The main objective is to verify the serial dependence of the variables and to model them in the frequency domain in order to observe the model residuals.
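Spectral analysis of a profiling variable starts from its periodogram, the power of the series at each Fourier frequency. The sketch below uses a naive DFT on a synthetic series with a known 8-sample cycle standing in for real well-log data; the series itself is an illustrative assumption.

```python
import math

def periodogram(x):
    """Naive DFT periodogram: power at each Fourier frequency k/N, k=1..N/2."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]  # remove the mean (zero-frequency term)
    power = []
    for k in range(1, n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(x))
        power.append((re * re + im * im) / n)
    return power

# Synthetic "well-log" series with a cycle every 8 samples: the spectral
# peak should land at frequency index k = N/8.
n = 64
series = [math.sin(2 * math.pi * t / 8) + 0.1 * math.cos(t) for t in range(n)]
p = periodogram(series)
peak_k = max(range(len(p)), key=lambda i: p[i]) + 1
print(peak_k)  # dominant frequency index
```

A sharp peak like this is what reveals serial dependence at a given depth periodicity; in practice an FFT-based routine would replace the quadratic-time DFT above.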

Keywords: oil, well, spectral analysis, oil extraction

Procedia PDF Downloads 515
22657 Improvements in Double Q-Learning for Anomalous Radiation Source Searching

Authors: Bo-Bin Xiao, Chia-Yi Liu

Abstract:

In the task of searching for anomalous radiation sources, personnel holding radiation detectors may be exposed to unnecessary radiation risk, so automated search by machines is required. This research applies several sophisticated deep reinforcement learning algorithms, namely double Q-learning, the dueling network, and NoisyNet, to the search for radiation sources. The simulation environment is a 10×10 grid with one shielding wall set in it, and the AI model is developed by training for 1 million episodes. In each training episode, the radiation source position, radiation source intensity, agent position, shielding wall position, and shielding wall length are all set randomly. The three algorithms are applied to train the AI model in four environments where the shielding wall is a full shielding wall, a lead wall, a concrete wall, or a lead or concrete wall appearing randomly. The 12 best-performing AI models are selected by observing the reward value during training and are evaluated by comparison against a gradient search algorithm. The results show that the performance of the AI models, whichever algorithm is used, is far better than that of the gradient search algorithm. In addition, as the simulation environment becomes more complex, the AI model combining Double DQN with the Dueling and NoisyNet algorithms performs better.
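The double Q-learning base of the agents above maintains two value tables and uses one to select the greedy action and the other to evaluate it, which removes the overestimation bias of plain Q-learning. The sketch below shows the tabular update on a 2-state toy MDP; the toy dynamics stand in for the radiation-search grid and are an illustrative assumption.

```python
import random
random.seed(0)

# Tabular double Q-learning: two tables QA and QB, updated symmetrically.
n_states, n_actions = 2, 2
QA = [[0.0] * n_actions for _ in range(n_states)]
QB = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma = 0.1, 0.9

def step(s, a):
    """Toy dynamics: only action 1 in state 0 pays off (and moves to state 1)."""
    return (1, 1.0) if (s, a) == (0, 1) else (0, 0.0)

for _ in range(2000):
    s = random.randrange(n_states)
    a = random.randrange(n_actions)      # pure exploration, for brevity
    s2, r = step(s, a)
    if random.random() < 0.5:            # update A: A selects, B evaluates
        a_star = max(range(n_actions), key=lambda x: QA[s2][x])
        QA[s][a] += alpha * (r + gamma * QB[s2][a_star] - QA[s][a])
    else:                                # symmetric update of B
        a_star = max(range(n_actions), key=lambda x: QB[s2][x])
        QB[s][a] += alpha * (r + gamma * QA[s2][a_star] - QB[s][a])

greedy = max(range(n_actions), key=lambda a: QA[0][a] + QB[0][a])
print(greedy)  # the rewarded action in state 0
```

The paper's Double DQN replaces these tables with neural networks (plus dueling heads and NoisyNet exploration), but the select-with-one, evaluate-with-the-other update is the same.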

Keywords: double Q learning, dueling network, NoisyNet, source searching

Procedia PDF Downloads 97
22656 Evaluation of Hard Rocks Destruction Effectiveness at Drilling

Authors: Ekaterina Leusheva, Valentin Morenov

Abstract:

Well drilling in hard rocks entails high energy demands, which slows the process and thus reduces overall effectiveness. The aim of this project is to develop an experimental research technique for selecting the optimal washing fluid composition when adding special hardness-reducing detergent reagents. Based on an analysis of existing references and conducted experiments, a technique for the quantitative evaluation of the weakening influence of washing fluids on drilled rocks was developed; it considers laboratory determination of three mud properties (density, surface tension, specific electrical resistance) and three rock properties (ultimate stress, dynamic strength, micro-hardness). The developed technique can be used in well drilling technologies, particularly when creating new drilling mud compositions for increased destruction effectiveness of hard rocks. It can be concluded that the technique introduces a coefficient of hard-rock destruction effectiveness that allows a quantitative evaluation of the effect of different drilling muds on the drilling process. A correct choice of drilling mud composition with hardness-reducing detergent reagents will increase the drilling penetration rate and the drill meterage per bit.

Keywords: detergent reagents, drilling mud, drilling process stimulation, hard rocks

Procedia PDF Downloads 532
22655 Subspace Rotation Algorithm for Implementing Restricted Hopfield Network as an Auto-Associative Memory

Authors: Ci Lin, Tet Yeap, Iluju Kiringa

Abstract:

This paper introduces the subspace rotation algorithm (SRA) for training the Restricted Hopfield Network (RHN) as an auto-associative memory. The subspace rotation algorithm is a gradient-free subspace tracking approach based on the singular value decomposition (SVD). In comparison with Backpropagation Through Time (BPTT) for training the RHN, it is observed that SRA always converges to the optimal solution, whereas BPTT cannot achieve the same performance when the model becomes complex and the number of patterns is large. The AUTS case study showed that the RHN model trained by SRA achieves a better structure of attraction basins, with generally larger radii, than the Hopfield Network (HNN) model trained by the Hebbian learning rule. By learning 10,000 patterns from the MNIST dataset with RHN models with different numbers of hidden nodes, it is observed that several components can be adjusted to achieve a balance between recovery accuracy and noise resistance.
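The Hebbian-trained Hopfield network that serves as the paper's baseline can be sketched in a few lines: weights are outer products of the stored ±1 patterns, and recall iterates a sign rule until a noisy probe falls into an attractor. The two 6-bit patterns below are illustrative.

```python
# Hebbian learning for a classical Hopfield network: w_ij = (1/N) sum_p x_i x_j.
patterns = [[1, -1, 1, -1, 1, -1],
            [1, 1, 1, -1, -1, -1]]
n = len(patterns[0])

W = [[0.0] * n for _ in range(n)]
for p in patterns:
    for i in range(n):
        for j in range(n):
            if i != j:  # no self-connections
                W[i][j] += p[i] * p[j] / n

def recall(state, steps=5):
    """Synchronous sign updates; a few fixed steps suffice for this toy size."""
    for _ in range(steps):
        state = [1 if sum(W[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

noisy = [1, -1, 1, -1, 1, 1]  # pattern 0 with one flipped bit
print(recall(noisy) == patterns[0])
```

The radius of the basin of attraction is exactly what the SRA-trained RHN is reported to enlarge relative to this Hebbian baseline.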

Keywords: hopfield neural network, restricted hopfield network, subspace rotation algorithm, hebbian learning rule

Procedia PDF Downloads 102
22654 Proteomic Analysis of Excretory Secretory Antigen (ESA) from Entamoeba histolytica HM1: IMSS

Authors: N. Othman, J. Ujang, M. N. Ismail, R. Noordin, B. H. Lim

Abstract:

Amoebiasis is caused by Entamoeba histolytica and is still endemic in many parts of the tropical region worldwide. Currently, there is no available vaccine against amoebiasis; hence, there is an urgent need to develop one. The excretory secretory antigen (ESA) of E. histolytica is a suitable biomarker for a vaccine candidate since it can modulate the host immune response. The objective of this study is therefore to identify the proteome of the ESA in search of a suitable vaccine-candidate biomarker. Non-gel-based and gel-based proteomics analyses were performed to identify proteins, using two kinds of mass spectrometry with different ionization systems, i.e. LC-MS/MS (ESI) and MALDI-TOF/TOF. Functional protein classification analysis was then performed using PANTHER software. The combination of LC-MS/MS for the non-gel-based and MALDI-TOF/TOF for the gel-based approach identified a total of 273 proteins from the ESA. Both systems identified 29 proteins in common, while 239 and 5 further proteins were identified by LC-MS/MS and MALDI-TOF/TOF, respectively. Functional classification analysis showed that the majority of proteins are involved in the metabolic process (24%), the primary metabolic process (19%), and the protein metabolic process (10%). Thus, this study has revealed the proteome of the E. histolytica ESA, and the identified proteins merit further investigation as vaccine candidates.

Keywords: E. histolytica, ESA, proteomics, biomarker

Procedia PDF Downloads 324
22653 Similar Script Character Recognition on Kannada and Telugu

Authors: Gurukiran Veerapur, Nytik Birudavolu, Seetharam U. N., Chandravva Hebbi, R. Praneeth Reddy

Abstract:

This work presents a robust approach for the recognition of characters in Telugu and Kannada, two South Indian scripts with structural similarities between their characters. Recognizing the characters requires exhaustive datasets, but only a few are publicly available. As a result, we decided to create a dataset for one language (the source language), train the model with it, and then test it with the target language. Telugu is the target language in this work, whereas Kannada is the source language. The suggested method makes use of Canny edge features to increase character identification accuracy on images with noise and varied lighting. A dataset of 45,150 images containing printed Kannada characters was created: the Nudi software was used to automatically generate printed Kannada characters with different writing styles and variations, and manual labelling was employed to ensure the accuracy of the character labels. Deep learning models, namely a Convolutional Neural Network (CNN) and a Visual Attention Network (VAN), were used to experiment with the dataset. A VAN architecture incorporating additional channels for Canny edge features was adopted, as the results obtained with this approach were good. The model's accuracy on the combined Telugu and Kannada test dataset was an outstanding 97.3%. Performance was better with Canny edge features applied than with a model that used only the original grayscale images. When tested per language, the model's accuracy was found to be 80.11% for Telugu characters and 98.01% for Kannada characters. This model, which makes use of cutting-edge machine learning techniques, shows excellent accuracy when identifying and categorizing characters from these scripts.
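As a rough sketch of the edge-feature step, the snippet below computes a Sobel gradient-magnitude edge map, a simplified stand-in for the Canny detector named in the abstract (full Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding on top of this gradient stage). The tiny synthetic image and the threshold are illustrative assumptions.

```python
KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel

def edge_map(img, thresh=2):
    """Binary edge map: 1 where the gradient magnitude reaches the threshold."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(KX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(KY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = 1 if (gx * gx + gy * gy) ** 0.5 >= thresh else 0
    return out

# 6x6 "glyph": a bright 2x2 block on a dark background.
img = [[1 if 2 <= y <= 3 and 2 <= x <= 3 else 0 for x in range(6)]
       for y in range(6)]
edges = edge_map(img)
print(sum(map(sum, edges)))  # number of edge pixels around the block
```

An edge map like this, stacked as an extra input channel next to the grayscale image, is the form in which the paper feeds Canny features to the VAN.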

Keywords: base characters, modifiers, guninthalu, aksharas, vattakshara, VAN

Procedia PDF Downloads 37
22652 Determination of Anchor Lengths by Retaining Walls

Authors: Belabed Lazhar

Abstract:

The dimensioning of anchored retaining screens always involves an analysis of their stability. The calculation of anchor lengths is practically carried out according to the mechanical model suggested by Kranz, which is often criticized. Safety is evaluated by comparing the internal force with the external force; the anchoring force over the length beyond the failure solid is neglected, and the failure surface is assumed to cut the anchor at the middle of its sealing length. In this article, we propose a new mechanical model which overcomes these simplifications and gives interesting results.

Keywords: retaining walls, anchoring, stability, mechanical modeling, safety

Procedia PDF Downloads 340
22651 Human Facial Emotion: A Comparative and Evolutionary Perspective Using a Canine Model

Authors: Catia Correia Caeiro, Kun Guo, Daniel Mills

Abstract:

Despite growing interest, emotions are still an understudied cognitive process, and their origins are currently the focus of much debate in the scientific community. The use of facial expressions as traditional hallmarks of discrete and holistic emotions created circular reasoning, due to a priori assumptions of meaning and the associated appearance biases. Ekman and colleagues solved this problem and laid the foundations for the quantitative and systematic study of facial expressions in humans by developing an anatomically based system, independent of meaning, for measuring facial behaviour: the Facial Action Coding System (FACS). One way of investigating emotion cognition processes is to apply comparative psychology methodologies, looking either at closely related species (e.g. chimpanzees) or at phylogenetically distant species sharing similar present adaptation problems (analogy). In this study, the domestic dog was used as a comparative animal model to examine facial expressions in social interactions in parallel with human facial expressions. The orofacial musculature seems to be relatively well conserved across mammal species, and the same holds true for the domestic dog. Furthermore, the dog is unique in having shared the same social environment as humans for more than 10,000 years, facing similar challenges and acquiring a unique set of socio-cognitive skills in the process. In this study, the spontaneous facial movements of humans and dogs were compared when interacting with hetero- and conspecifics as well as in solitary contexts. In total, 200 participants were examined with the FACS and DogFACS (the Dog Facial Action Coding System) coding tools across four different emotionally driven contexts: a) happiness (play and reunion), b) anticipation (of a positive reward), c) fear (object- or situation-triggered), and d) frustration (negation of a resource). A neutral control was added for both species.
All four contexts are commonly encountered by humans and dogs, are comparable between species, and seem to give rise to emotions from homologous brain systems. The videos used in the study were extracted from public databases (e.g. YouTube) or published scientific databases (e.g. AM-FED). The results obtained allowed us to delineate clear similarities and differences in the flexibility of the facial musculature in the two species. More importantly, they shed light on which common facial movements are a product of the emotion-linked contexts (those appearing in both species) and which are characteristic of the species, revealing an important clue for the debate on the origin of emotions. Additionally, we were able to examine movements that might have emerged for interspecific communication. Finally, our results are discussed from an evolutionary perspective, adding to the recent line of work that supports an ancient shared origin of emotions in a mammal ancestor and defining emotions as mechanisms with a clear adaptive purpose, essential in numerous situations ranging from the maintenance of social bonds to the modulation of fitness and survival.

Keywords: comparative and evolutionary psychology, emotion, facial expressions, FACS

Procedia PDF Downloads 418
22650 Study on the Process of Detumbling Space Target by Laser

Authors: Zhang Pinliang, Chen Chuan, Song Guangming, Wu Qiang, Gong Zizheng, Li Ming

Abstract:

The active removal of space debris and asteroid defense are important issues in human space activities. Both require a detumbling process: almost all space debris and asteroids are in a rotating state, and it is hard and dangerous to capture or remove a target with a relatively high tumbling rate, so it is necessary first to find a method to reduce the angular rate. The laser ablation method is an efficient way to tackle this detumbling problem, as it is a contactless technique and can work at a safe distance. In existing research, a laser rotational control strategy based on estimating the instantaneous angular velocity of the target has been presented, but its calculation of the control torque produced by the laser, which is very important in detumbling operations, is not accurate enough: the method used is only suitable for plane or regularly shaped targets and does not consider the influence of irregular shape and spot size. In this paper, based on a triangulated reconstruction of the target surface, we propose a new method to calculate the impulse on an irregularly shaped target under both covered irradiation and spot irradiation by the laser, and we verify its accuracy by theoretical calculation and an impulse measurement experiment. We then use it to study the process of detumbling a cylinder and an asteroid by laser. The results show that the new method is universally practical and has high precision; it would take more than 13.9 hours to stop the rotation of Bennu with 1×10⁵ kJ of laser pulse energy; and the speed of the detumbling process depends on the distance between the spot and the centroid of the target, for which an optimal value can be found in each particular case.
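The order of magnitude of such detumbling times can be sketched from the basic momentum balance: each pulse delivers an ablation impulse J = C_m·E, the torque impulse is J times the lever arm from the spin axis, and the pulses must cancel the target's angular momentum I·ω. All numbers below (coupling coefficient, pulse energy, target properties) are illustrative assumptions, not the paper's computed values for Bennu.

```python
import math

def pulses_to_detumble(I, omega, pulse_energy, coupling, lever_arm):
    """Pulses needed so that the summed torque impulses equal I*omega.
    I [kg m^2], omega [rad/s], pulse_energy [J],
    coupling C_m [N s / J], lever_arm [m]."""
    impulse_per_pulse = coupling * pulse_energy  # J = C_m * E
    return math.ceil(I * omega / (impulse_per_pulse * lever_arm))

# Small tumbling cylinder: I = 1200 kg m^2, spinning at 3 deg/s.
n = pulses_to_detumble(I=1200.0,
                       omega=math.radians(3.0),
                       pulse_energy=100.0,   # 100 J per pulse (assumed)
                       coupling=5e-5,        # plausible ablation C_m scale
                       lever_arm=0.5)
print(n)  # pulses required at one fixed, well-placed spot
```

The lever-arm term in the denominator is why the paper finds the spot-to-centroid distance to be the key driver of detumbling speed, with an optimum in each case.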

Keywords: detumbling, laser ablation drive, space target, space debris removal

Procedia PDF Downloads 70
22649 Train Timetable Rescheduling Using Sensitivity Analysis: Application of Sobol, Based on Dynamic Multiphysics Simulation of Railway Systems

Authors: Soha Saad, Jean Bigeon, Florence Ossart, Etienne Sourdille

Abstract:

Developing better solutions for train rescheduling problems has drawn the attention of researchers for decades. Most research in this field deals with minor incidents that affect a large number of trains due to cascading effects; it focuses on timetables, rolling stock, and crew duties, but does not take infrastructure limits into account. The present work addresses electric infrastructure incidents that limit the power available for train traction, and hence the transportation capacity of the railway system. Rescheduling is needed in order to share the available power optimally among the different trains. We propose a rescheduling process based on dynamic multiphysics railway simulations that include the mechanical and electrical properties of all the system components and calculate physical quantities such as train speed profiles, voltage along the catenary lines, temperatures, etc. The optimization problem to solve has a large number of continuous and discrete variables, several output constraints due to physical limitations of the system, and a high computation cost. Our approach includes a phase of sensitivity analysis in order to analyze the behavior of the system and support the decision-making process and/or a more precise optimization. This is a quantitative method based on simulation statistics of the dynamic railway system over a predefined range of variation of the input parameters. Three important settings are defined: factor prioritization detects the input variables that contribute the most to the variation of the outputs; factor fixing calibrates the input variables which do not influence the outputs; and factor mapping studies which ranges of input values lead to model realizations that correspond to feasible solutions according to defined criteria or objectives. Generalized Sobol indices are used for factor prioritization and factor fixing.
The approach is tested on a simple railway system, with nominal traffic running on a single-track line. The considered incident is the loss of a feeding power substation, which limits the available power and the train speed. Rescheduling is needed, and the variables to be adjusted are the train departure times, the train speed reduction at a given position, and the number of trains (cancellation of some trains if needed). The results show that the spacing between train departure times is the most critical variable, contributing to more than 50% of the variation of the model outputs. In addition, we identify the reduced range of variation of this variable which guarantees that the output constraints are respected. Optimal solutions are extracted according to different potential objectives: minimizing the traveling time, the train delays, the traction energy, etc. A Pareto front is also built.
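Factor prioritization with first-order Sobol indices can be sketched with the basic pick-freeze (Sobol/Saltelli) estimator: S_i = Var(E[Y|X_i]) / Var(Y), estimated from two independent sample matrices that share only column i. The linear toy model below stands in for the multiphysics railway simulator; its inputs and coefficients are illustrative assumptions.

```python
import random
random.seed(1)

def model(x):
    """Toy response: input 0 dominates, input 2 is nearly inert."""
    return 4.0 * x[0] + x[1] + 0.1 * x[2]

def sobol_first_order(model, dim, n=20000):
    """Pick-freeze estimate of first-order Sobol indices, inputs ~ U(0, 1)."""
    A = [[random.random() for _ in range(dim)] for _ in range(n)]
    B = [[random.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(a) for a in A]
    f0 = sum(yA) / n
    var = sum(y * y for y in yA) / n - f0 * f0
    S = []
    for i in range(dim):
        # AB_i: rows of B with column i frozen to A's column i.
        yAB = [model(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        S.append((sum(ya * yab for ya, yab in zip(yA, yAB)) / n - f0 * f0) / var)
    return S

S = sobol_first_order(model, dim=3)
print([round(s, 2) for s in S])
```

In the paper's terms, the first input would be prioritized (analogous to the departure-time spacing, with over 50% of the output variance) and the third would be a candidate for factor fixing.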

Keywords: optimization, rescheduling, railway system, sensitivity analysis, train timetable

Procedia PDF Downloads 387