Search results for: employees input
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3306

1236 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images

Authors: Shahriar Farzam, Maryam Rastgarpour

Abstract:

Image denoising plays an extremely important role in digital image processing, and curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a contrast enhancement method for cone beam CT (CBCT) images based on the fast discrete curvelet transform (FDCT) computed via the Unequally Spaced Fast Fourier Transform (USFFT). The transform returns a table of curvelet coefficients indexed by a scale parameter, an orientation, and a spatial location; accordingly, the coefficients obtained from FDCT-USFFT can be modified to enhance contrast in an image. Our proposed method first applies the FDCT via USFFT to the input image and then thresholds the curvelet coefficients to enhance the CBCT image. Applying the unequally spaced fast Fourier transform leads to an accurate, high-resolution reconstruction of the image. The experimental results indicate that the performance of the proposed method is superior to existing ones in terms of Peak Signal to Noise Ratio (PSNR) and Effective Measure of Enhancement (EME).
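The two evaluation metrics named in the abstract have standard closed forms. Below is a minimal, library-free Python sketch of PSNR and of one common block-based definition of EME; the exact EME variant the authors use is not stated, so this particular formula is an assumption:

```python
import math

def psnr(original, processed, max_val=255.0):
    """Peak Signal-to-Noise Ratio between two equal-sized images (lists of rows)."""
    flat_a = [p for row in original for p in row]
    flat_b = [p for row in processed for p in row]
    mse = sum((a - b) ** 2 for a, b in zip(flat_a, flat_b)) / len(flat_a)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

def eme(image, block=2, eps=1e-6):
    """Effective Measure of Enhancement: mean of 20*log10(max/min) over blocks."""
    rows, cols = len(image), len(image[0])
    total, count = 0.0, 0
    for r in range(0, rows - block + 1, block):
        for c in range(0, cols - block + 1, block):
            patch = [image[r + i][c + j] for i in range(block) for j in range(block)]
            total += 20.0 * math.log10((max(patch) + eps) / (min(patch) + eps))
            count += 1
    return total / count
```

A higher EME indicates stronger local contrast, which is why it complements PSNR when judging enhancement rather than pure denoising.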

Keywords: curvelet transform, CBCT, image enhancement, image denoising

Procedia PDF Downloads 292
1235 Technoeustress in Higher Education Teachers: A Study on Positive Stress

Authors: Ligia Nascimento, Manuela Faia Correia

Abstract:

Nowadays, Information and Communication Technologies (ICT) are embedded in most professions. Technostress, or stress induced by the use of ICT, has been studied in various sectors of activity and in different geographical areas, mainly from the perspective of its harmful impacts. In the context of work, the technological contexts capable of causing stress have been examined in depth, as has the type of individual most likely to experience its negative effects. However, new lines of research argue that the stress generated by the use of ICT is not necessarily detrimental (technodistress), admitting that, in contrast and in addition, it may actually be beneficial to organizations and their employees (technoeustress). Measures that succeed in reducing technodistress do not necessarily create technoeustress, which justifies studying the phenomenon in a focused and independent manner. Adopting the transactional model of stress as its basic theoretical framework, an ongoing research project aims to study technoeustress independently. Given the role higher education plays in the qualification and progress of society and the economy, caring for the well-being of the higher education teacher becomes particularly critical. Particularly in recent times, when teleworking is prevalent, these professionals have made a huge, compelled effort to adapt to a new teaching reality. Rather than limiting itself to mitigating the adverse effects of ICT use, as earlier approaches did, the present study seeks to understand how to activate the positive side of technostress in higher education teachers in order to obtain favorable personal and organizational outcomes from ICT use at work. The research model seeks to understand, upstream, the ICT characteristics that increase the perception of technoeustress among higher education teachers, studying the direct and moderating effects of individual and organizational variables and, downstream, the impacts that technoeustress has on job satisfaction and performance. This research contributes both to expanding knowledge of the technostress phenomenon and to identifying possible recommendations for management.

Keywords: higher education teachers, ICT, stress, technoeustress

Procedia PDF Downloads 140
1234 Implant Operation Guiding Device for Dental Surgeons

Authors: Daniel Hyun

Abstract:

Dental implants are among the top three reasons dentists are sued for malpractice, typically over implant complications caused by the angle of the implant during surgery. At present, surgeons usually use a 3D-printed navigator customized for the patient's teeth, but these cannot be reused for other patients and take time to produce. Therefore, I made a guiding device to assist the surgeon in implant operations. The surgeon inputs the objective of the operation, and the device continuously checks whether the surgery is heading towards the objective within the set range, signalling the surgeon via an LED. We tested the prototypes' consistency and accuracy by examining the output graph, the average standard deviation, and the average change of the calculated angles; performance accuracy was obtained by running the device and checking its outputs. My first prototype used the accelerometer and gyroscope of the Arduino MPU6050 sensor, producing a fluctuating graph with a standard deviation of 0.0295, an average change of 0.25, and 66.6% performance accuracy. The second prototype used only the gyroscope and produced a stable graph with a standard deviation of 0.0062, an average change of 0.075, and 100% performance accuracy, indicating that the accelerometer degraded the functionality of the device. Using only the gyroscope allowed the device to measure the orientation of each axis independently and also increased the stability and accuracy of the measurements.
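The device's core loop, integrating gyroscope angular rates into an orientation angle and comparing it against the surgeon's target range to drive the LED, can be sketched as follows. This is a simplified illustration; the function names, sampling step, and tolerance are hypothetical, not taken from the paper:

```python
def integrate_gyro(samples, dt):
    """Integrate angular-rate samples (deg/s) at interval dt (s) into an angle (deg)."""
    angle = 0.0
    for rate in samples:
        angle += rate * dt  # simple rectangular integration
    return angle

def led_state(current_angle, target_angle, tolerance):
    """'green' when the handpiece is within tolerance of the target angle, else 'red'."""
    return "green" if abs(current_angle - target_angle) <= tolerance else "red"
```

A real implementation would also need drift compensation, since pure integration of gyroscope output accumulates error over time.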

Keywords: implant, guide, accelerometer, gyroscope, handpiece

Procedia PDF Downloads 36
1233 Human Action Recognition Using Variational Bayesian HMM with Dirichlet Process Mixture of Gaussian Wishart Emission Model

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we present a human action recognition method using a variational Bayesian HMM with a Dirichlet process mixture (DPM) of Gaussian-Wishart emission models (GWEM). First, we define the Bayesian HMM based on the Dirichlet process, which allows an infinite number of Gaussian-Wishart components to support continuous emission observations. Second, we consider an efficient variational Bayesian inference method that can be applied to derive the posterior distribution of the hidden variables and model parameters from training data, and we then derive the predictive distribution that may be used to classify new actions. Third, the paper proposes a process for extracting appropriate spatio-temporal feature vectors that can be used to recognize a wide range of human behaviors from input video images. Finally, we conduct experiments to evaluate the performance of the proposed method. The experimental results show that the presented method is more effective for human action recognition than existing methods.
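The full variational DPM machinery is beyond a short snippet, but the classification step the abstract describes, scoring a new observation sequence under each action's trained HMM and choosing the most likely label, can be sketched with the standard scaled forward algorithm. Discrete emissions and toy parameters are used here for brevity; the paper's model uses continuous Gaussian-Wishart emissions:

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm for numerical stability."""
    n = len(start)
    alpha = [start[i] * emit[i][obs[0]] for i in range(n)]
    loglik = 0.0
    for t in range(1, len(obs)):
        scale = sum(alpha)
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
        alpha = [emit[j][obs[t]] * sum(alpha[i] * trans[i][j] for i in range(n))
                 for j in range(n)]
    loglik += math.log(sum(alpha))
    return loglik

def classify(obs, models):
    """Pick the action label whose HMM assigns the sequence the highest log-likelihood."""
    return max(models, key=lambda label: forward_loglik(obs, *models[label]))
```

Each entry in `models` is a (start, transition, emission) triple for one action class; classification is a simple maximum-likelihood decision over the per-class scores.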

Keywords: human action recognition, Bayesian HMM, Dirichlet process mixture model, Gaussian-Wishart emission model, variational Bayesian inference, prior and approximate posterior distributions, KTH dataset

Procedia PDF Downloads 345
1232 Interpretable Deep Learning Models for Medical Condition Identification

Authors: Dongping Fang, Lian Duan, Xiaojing Yuan, Mike Xu, Allyn Klunder, Kevin Tan, Suiting Cao, Yeqing Ji

Abstract:

Accurate prediction of a medical condition with direct clinical evidence is a long-sought goal in the medical management and health insurance fields. Although great progress has been made with machine learning algorithms, the medical community remains, to a certain degree, suspicious of models' accuracy and interpretability. This paper presents an innovative hierarchical attention deep learning model that achieves good prediction and clear interpretability easily understood by medical professionals. The model uses a hierarchical attention structure that naturally matches the medical history data structure and reflects the member's encounter (date of service) sequence. The attention structure consists of three levels: (1) attention on the medical code types (diagnosis codes, procedure codes, lab test results, and prescription drugs), (2) attention on the sequential medical encounters within a type, and (3) attention on the medical codes within an encounter and type. The model is applied to predict the occurrence of stage 3 chronic kidney disease (CKD3), using three years' medical history of Medicare Advantage (MA) members from a top health insurance company. It takes members' medical events, both claims and electronic medical record (EMR) data, as input, predicts CKD3, and calculates the contribution of individual events to the predicted outcome, so that the outcome can be explained with the clinical evidence identified by the model. For example: Member A had 36 medical encounters in the past three years, including multiple office visits, lab tests, and medications. The model predicts that member A has a high risk of CKD3, with the following strongly contributing clinical events: multiple high 'Creatinine in Serum or Plasma' tests and multiple low 'Glomerular filtration rate' (kidney function) tests; among the abnormal lab tests, more recent results contributed more to the prediction. The model also indicates that regular office visits, no abnormal findings on medical examinations, and taking proper medications decreased the CKD3 risk. Member B had 104 medical encounters in the past three years and was predicted to have a low risk of CKD3, because the model did not identify diagnoses, procedures, or medications related to kidney disease, and many lab test results, including 'Glomerular filtration rate', were within the normal range. The model accurately predicts members A and B and provides interpretable clinical evidence that is validated by clinicians. Without extra effort, the interpretation is generated directly from the model and presented together with the occurrence date. Our model uses the medical data in its most raw format, without any further aggregation, transformation, or mapping. This greatly simplifies data preparation, mitigates the chance of error, and eliminates the post-modeling work needed for traditional model explanation. To our knowledge, this is the first paper on an interpretable deep learning model that uses a 3-level attention structure, sources both EMR and claims data covering all four types of medical data, operates on the entire Medicare population of a big insurance company, and, more importantly, directly generates model interpretation to support user decisions. In the future, we plan to enrich the model input by adding patients' demographics and information from free-text physician notes.
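The interpretability mechanism, attention weights that double as per-event contribution scores, can be illustrated at a single level with a plain dot-product attention pool. This is a hedged sketch: the actual model has three nested attention levels and learned parameters, none of which are reproduced here:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(encounters, query):
    """Weight encounter vectors by dot-product relevance to a query vector.

    Returns the pooled representation and the attention weights; the weights
    play the role of per-encounter contribution scores used for interpretation."""
    scores = [sum(q * x for q, x in zip(query, enc)) for enc in encounters]
    weights = softmax(scores)
    dim = len(encounters[0])
    pooled = [sum(w * enc[d] for w, enc in zip(weights, encounters))
              for d in range(dim)]
    return pooled, weights
```

Because the weights sum to one, each encounter's weight can be read directly as its share of the pooled representation, which is what makes this family of models self-explaining.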

Keywords: deep learning, interpretability, attention, big data, medical conditions

Procedia PDF Downloads 88
1231 The Service Appraisal of Soldiers of the Army of the Czech Republic in the Context of Personal Expenses

Authors: Tereza Dolečková

Abstract:

The following article compares international norms and standards formulating personal expenses with the national concept of personal expenses at the Ministry of Defence, and then explains the new salary system of soldiers and the importance of the service appraisal in the context of the Ministry's personal expenses. The first part of the article formulates the approach to defining personal expenses within international norms and standards and within the Ministry of Defence of the Czech Republic. It clarifies the structure of employees of the Ministry of Defence of the Czech Republic in the years 2012-2014, the amount of military expenses, and the share of salary expenses in the Ministry's total expenses, and compares the amount of military expenses in chosen member states of the North Atlantic Treaty Organization. The salary system of professional soldiers in connection with the amendment to Act No. 221/1999 Coll., on Professional Soldiers, is clarified in the second part of the article. The amendment significantly regulates the salary items of soldiers, but it also changes the service appraisal of soldiers, which is reflected in one of the seven salary items: the performance bonus. The aim of this article is to clarify the different approaches to defining personal expenses, with emphasis on the Ministry of Defence of the Czech Republic, as they bear on the service appraisal of soldiers of the Army of the Czech Republic and their salary system in connection with the Ministry's personal expenses. An efficient and objective service appraisal system, and the use of its results, are connected to the principles of career advancement; only the best soldiers can advance to higher positions in the system of service careers. That is why it is necessary to improve the service appraisal so that it provides the maximum information about a soldier's performance and also motivates the soldier in his development. Attention should be paid to the service appraisal of soldiers of the Army of the Czech Republic to achieve as much objectivity as possible.

Keywords: career, human resource management and development, personal expenses, salary system of soldiers, service appraisal of soldiers, the Army of the Czech Republic

Procedia PDF Downloads 239
1230 The Analysis of Regulation on Sustainability in the Financial Sector in Lithuania

Authors: Dalia Kubiliūtė

Abstract:

Lithuania is known as a trusted location for global business institutions and attracts investors with its competitive environment for financial service providers. Along with the aspiration to offer a strong, results-oriented and innovation-driven environment for financial service providers, Lithuanian regulatory authorities consistently implement the European Union's high regulatory standards for financial activities, including sustainability-related disclosures. Since the European Union directed its policy towards a transition to a climate-neutral, green, competitive, and inclusive economy, additional regulatory requirements have been adopted for financial market participants: disclosure of sustainable activities, transparency, prevention of greenwashing, etc. The financial sector is one of the key factors influencing the implementation of sustainability objectives in European Union policies and mitigating the negative effects of climate change: public funds are not enough to make a significant impact on sustainable investments, so directing public and private capital to green projects may help finance the necessary changes. The topic of the study is original and has not yet been widely analyzed in Lithuanian legal discourse. The study uses quantitative and qualitative methodologies and logical, systematic, and critical analysis principles; its aim is to reveal the problems of implementing the regulation on sustainability in the Lithuanian financial sector. Additional regulatory requirements could cause serious changes in financial business operations: additional funds, employees, and time have to be dedicated so that companies can implement these regulations. A lack of knowledge and data on how to implement the new requirements for sustainability reporting causes considerable uncertainty for financial market participants, and for some companies it might even become an essential point in terms of business continuity. It is considered that the supervisory authorities should find a balance between financial market needs and legal regulation.

Keywords: financial, legal, regulatory, sustainability

Procedia PDF Downloads 96
1229 Fire Characteristics of Commercial Flame-Retardant Polycarbonate under Different Oxygen Concentrations: Ignition Time and Heat Blockage

Authors: Xuelin Zhang, Shouxiang Lu, Changhai Li

Abstract:

Commercial flame-retardant polycarbonate samples of different thicknesses, the main interior carriage material of high-speed trains, were investigated in a Fire Propagation Apparatus under different external heat fluxes and oxygen concentrations from 12% to 40%, to study their fire characteristics and quantitatively analyze ignition time, mass loss rate, and heat blockage. When heated, the additives of the commercial flame-retardant polycarbonate were intumescent, and the samples maintained a steady height before ignition. The results showed that the transformed ignition time (1/t_ig)ⁿ increased linearly with external heat flux under each oxygen concentration after deducting the heat blockage due to pyrolysis products; that the mass loss rate varied linearly with external heat flux, with the slope of the fitted line decreasing as oxygen concentration increased; and that the heat blockage, independent of external heat flux, rose with increasing oxygen concentration. The acquired data, used as input to fire simulation models, are essential for evaluating the fire risk of commercial flame-retardant polycarbonate.
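The reported linearity of the transformed ignition time against external heat flux can be checked with an ordinary least-squares fit. The sketch below uses hypothetical numbers: the exponent n, the flux values, and the ignition times are illustrative, not the paper's data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ slope*x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def transformed_ignition(t_ig, n=0.5):
    """Transformed ignition time (1/t_ig)^n used as the linear response variable."""
    return (1.0 / t_ig) ** n

# Hypothetical net external heat fluxes (kW/m^2, after deducting heat blockage)
# and ignition times (s) - placeholders, not measured data.
flux = [20.0, 30.0, 40.0, 50.0]
t_ig = [400.0, 178.0, 100.0, 64.0]
ys = [transformed_ignition(t) for t in t_ig]
slope, intercept = linear_fit(flux, ys)
```

A near-zero intercept and a stable slope across oxygen concentrations would reproduce the linear trend the abstract reports.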

Keywords: ignition time, mass loss rate, heat blockage, fire characteristic

Procedia PDF Downloads 279
1228 Educational Experiences in Engineering in the COVID Era and Their Comparative Analysis, Spain, March to June 2020

Authors: Borja Bordel, Ramón Alcarria, Marina Pérez

Abstract:

In March 2020, an unexpected health crisis caused by COVID-19 was declared in Spain. All of a sudden, all degrees, classes, evaluation tests, and projects had to be transformed into online activities. However, the chaotic situation generated by such a complex operation, executed without any well-established procedure, led to very different experiences and, finally, results. In this paper, we describe three experiences at two different universities in Madrid: on the one hand, the Technical University of Madrid, a public university with little experience in online education; on the other hand, Alfonso X el Sabio University, a private university with more than five years of experience in online teaching. All analyzed subjects were related to computer engineering. Professors and students answered a survey, and personal interviews were also carried out. Besides, the professors' workload and the students' academic results were compared. From the comparative analysis of these experiences, we extract the most successful strategies, methodologies, and activities. The recommendations in this paper will be useful for courses during the coming months, while the health situation still affects educational organizations, and will at the same time serve as input for the upcoming digitalization process of higher education.

Keywords: educational experience, online education, higher education digitalization, COVID, Spain

Procedia PDF Downloads 135
1227 Building Successful Organizational Business Communication and Its Impact on Business Performance: An Intra- and Inter-Organizational Perspective

Authors: Aynura Valiyeva, Basil John Thomas

Abstract:

Intra-firm communication is critical for building synergy among the internal business units of a firm, where employees from various functional departments and ranks align their decision-making, their understanding of organizational objectives, and common norms and culture for better organizational effectiveness. This study builds on and assesses a framework of the causes and consequences of effective communication in business interactions between customer and supplier firms, and of the path to efficient communication within a firm. The study's structural equation modeling (SEM) analysis, based on 352 responses collected from firm representatives in positions ranging from marketing to logistics operations, reveals that, within intra-organizational communication, organizational characteristics and shared values, top management support and style of leadership, and information technology are all significantly related to communication effectiveness. Furthermore, the frequency and variety of interactions enhance the outcome of communication, which improves a company's performance. The results reveal that cultural factors, as well as shared beliefs and goals, are significantly related to communication effectiveness. In terms of organizational factors, leadership style, top management support, and information technology are significant determinants of effective communication. Among the contextual factors, interaction frequency and diversity are found to be priority factors. The study also tests the relationship between communication effectiveness and supplier firm performance, and finds that they are closely related when trust and commitment are built between business partners. When firms do business in multicultural contexts, language and values shared with the destination country must also be considered significant elements of the communication process.

Keywords: business performance, intra-firm communication, inter-firm communication, structural equation modeling

Procedia PDF Downloads 92
1226 The Impact of Organizational Culture on Internet Marketing Adoption

Authors: Hafiz Mushtaq Ahmad, Syed Faizan Ali Shah, Bushra Hussain, Muneeb Iqbal

Abstract:

Purpose: The purpose of this study is to investigate the impact of organizational culture on internet marketing adoption, and to explore the role organizational culture plays in the internet marketing adoption that helps businesses achieve organizational growth and an augmented market share. Background: With the enormous expansion of technology, organizations need a technology-based marketing paradigm in order to capture a larger group of customers. Organizational culture plays a dominant and prominent role in internet marketing adoption. Changes in the world economy have reshaped organizational competition and are generating new technology standards and strategies. With all these technological advances, e-marketing has become one of the essential parts of marketing strategy, and organizations require advanced internet marketing strategies in order to compete in a global market. Methodology: The population of this study consists of telecom sector organizations in Pakistan, and the sample comprises 200 telecom sector employees. Data were gathered through a questionnaire instrument. The research strategy is a survey, the study uses a deductive approach, and the sampling technique is convenience sampling. Tentative Results: The study reveals that organizational culture plays a vital role in internet marketing adoption: there is a strong association between organizational culture and internet marketing adoption, and a flexible organizational culture helps organizations adopt internet marketing easily. Conclusion: The study guides decision-makers and owners of organizations to recognize the importance of an internet marketing strategy and helps them increase market share through e-marketing. It offers managers a practical solution: develop a flexible organizational culture that supports internet marketing adoption.

Keywords: internet technology, internet marketing, marketing paradigm, organizational culture

Procedia PDF Downloads 227
1225 Urban Resilience and Its Prioritised Components: Analysis of the Industrial Township of Greater Noida

Authors: N. Mehrotra, V. Ahuja, N. Sridharan

Abstract:

Resilience is an all-hazard, proactive approach that requires multidisciplinary input across the interrelated variables of the city system. This research identifies and operationalizes indicators for assessment in the domains of institutions, infrastructure, and knowledge, all three operating in task-oriented community networks. The paper gives a brief account of the methodology developed for the assessment of urban resilience and its prioritised components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People's perception of urban resilience has been examined through a questionnaire survey among the target population of Greater Noida. As defined by experts, the urban resilience of a place is considered both a product and a process of operation to regain normalcy after a disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both in the process of evolution and as an outcome, and the relative significance of the six R's has also been identified. The dependency factors of the various resilience indicators are explored in this paper, which helps generate a new perspective for future research in disaster management. Based on the stated factors, this methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself but calls for new beginnings.

Keywords: disaster, resilience, system, urban

Procedia PDF Downloads 453
1224 Theoretical Analysis of the Solid State and Optical Characteristics of Calcium Sulphide Thin Film

Authors: Emmanuel Ifeanyi Ugwu

Abstract:

Calcium sulphide, one of the chalcogenide group of thin films, is analyzed in this work using a theoretical approach in which a scalar wave is propagated through the thin-film medium deposited on a glass substrate, assuming the dielectric medium has a homogeneous reference dielectric constant term and a perturbed dielectric function representing the deposited thin film on the surface of the glass substrate. These were substituted into a defined scalar wave equation that was solved by first transforming it into a Volterra equation of the second kind, using the method of separation of variables; subsequently, a Green's function technique was introduced to obtain a model equation for the wave propagating through the thin film. This model was used to compute the propagated field for input wavelengths representing the UV, visible, and near-infrared regions, considering the influence of the dielectric constants of the thin film on the propagating field. The results obtained were used in turn to compute the band gap and the solid state and optical properties of the thin film.
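The formulation described, a scalar wave equation with a perturbed dielectric function recast via a Green's function into an integral equation for the propagated field, can be sketched as follows. The notation is assumed rather than taken from the paper, and the sign of the integral term depends on the Green's function convention chosen:

```latex
% Scalar wave in the film: reference dielectric constant \varepsilon_0,
% perturbation \Delta\varepsilon(\mathbf{r}) from the deposited CaS layer
\nabla^2 \psi(\mathbf{r})
  + k^2\left[\varepsilon_0 + \Delta\varepsilon(\mathbf{r})\right]\psi(\mathbf{r}) = 0

% Volterra / Lippmann--Schwinger-type integral form via the Green's function G,
% solvable iteratively for the propagated field \psi given the incident field \psi_0
\psi(\mathbf{r}) = \psi_0(\mathbf{r})
  + k^2 \int G(\mathbf{r},\mathbf{r}')\,
        \Delta\varepsilon(\mathbf{r}')\,\psi(\mathbf{r}')\,\mathrm{d}^3 r'
```

Iterating the integral form (a Born-type series) yields successive corrections to the incident field, which is how the field can be computed wavelength by wavelength across the UV, visible, and near-infrared inputs.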

Keywords: scalar wave, dielectric constant, calcium sulphide, solid state, optical properties

Procedia PDF Downloads 106
1223 Stock Market Prediction Using Convolutional Neural Network That Learns from a Graph

Authors: Mo-Se Lee, Cheol-Hwi Ahn, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Over the past decade, deep learning has been in the spotlight among machine learning algorithms. In particular, the CNN (Convolutional Neural Network), known as an effective solution for recognizing and classifying images, has been popularly applied to classification and prediction problems in various fields. In this study, we apply a CNN to stock market prediction, one of the most challenging tasks in machine learning research. Specifically, we propose to apply a CNN as a binary classifier that predicts the stock market direction (up or down) using a graph as its input; that is, we build a machine learning algorithm that mimics a person who looks at a chart and predicts whether the trend will go up or down. Our proposed model consists of four steps. In the first step, it divides the dataset into 5-day, 10-day, 15-day, and 20-day windows. In step 2, it creates graphs for each interval. In the next step, CNN classifiers are trained using the graphs generated in the previous step. In step 4, the hyperparameters of the trained model are optimized using the validation dataset. To validate our model, we apply it to the prediction of the KOSPI200 over 1,986 days in eight years (from 2009 to 2016). The experimental dataset includes 14 technical indicators, such as CCI, Momentum, and ROC, and the daily closing price of the KOSPI200 index of the Korean stock market.
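Step 1 of the pipeline, slicing the closing-price series into fixed-length windows and labelling each with the subsequent direction, can be sketched as below. Graph rendering and CNN training are omitted, and the labelling rule shown (next close above the window's last close means "up") is an assumption about how the authors define direction:

```python
def make_windows(prices, size):
    """Slice a closing-price series into fixed-size windows, each labelled 1 if
    the next day's close is above the window's last close (up), else 0 (down)."""
    samples = []
    for start in range(len(prices) - size):
        window = prices[start:start + size]
        label = 1 if prices[start + size] > window[-1] else 0
        samples.append((window, label))
    return samples
```

In the full pipeline each window would be rendered as a chart image (one per interval length of 5, 10, 15, and 20 days) and fed to the CNN as pixels rather than as raw numbers.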

Keywords: convolutional neural network, deep learning, Korean stock market, stock market prediction

Procedia PDF Downloads 423
1222 Law and Its Implementation and Consequences in Pakistan

Authors: Amir Shafiq, Asif Shahzad, Shabbar Mehmood, Muhammad Saeed, Hamid Mustafa

Abstract:

Legislation comprises the laws or statutes established by a sovereign authority, which the courts of law can generally enforce over time to accomplish their objectives. Historically speaking, upon the emergence of Pakistan in 1947, the laws of the British Raj remained in effect after being adapted to Islamic ideology; thus, there was an intention to begin the statute book afresh in Pakistan's legal history. In consequence, the process of developing detailed plans, procedures, and mechanisms to ensure that legislative and regulatory requirements are achieved began, keeping in view cultural values and local customs. This article is an input to the enduring discussion about implementing the rule of law in Pakistan, since the rule of law requires the harmony of laws, mostly in the form of codified state laws. Pakistan has a plural legal system in which completely different and independent systems of law, such as Mohammadan law, state law, and traditional law, coexist. The law prevailing in practice in Pakistan is actually the traditional law, though it is not acknowledged by the state. This causes the main problem of the rule of law: the difference between state laws and cultural values. These values, customs, and so-called traditional laws are the main obstacle to enforcing state law in true letter and spirit, which has caused dissatisfaction among the masses and distrust of the country's judicial system.

Keywords: consequences, implement, law, Pakistan

Procedia PDF Downloads 429
1221 Estimations of Spectral Dependence of Tropospheric Aerosol Single Scattering Albedo in Sukhothai, Thailand

Authors: Siriluk Ruangrungrote

Abstract:

Analyses of available data from MFR-7 measurements were performed and discussed in a study of tropospheric aerosol and its consequences in Thailand. The aerosol single scattering albedo (ASSA, ω) is one of the most important parameters for determining the aerosol effect on radiative forcing. Here ω was determined directly as the ratio of aerosol scattering optical depth to aerosol extinction optical depth (τscat/τext), without using aerosol computer-code models; this has the benefit of eliminating the uncertainty caused by modeling assumptions and by the estimation of actual aerosol input data. Diurnal ω for five cloudless days in winter and early summer was investigated at five distinct wavelengths of 415, 500, 615, 673, and 870 nm, taking into account Rayleigh scattering and atmospheric column NO2 and ozone contents. Besides, the tendency of the spectral dependence of ω representing the two seasons was observed. The characteristics of the spectral results reveal that during wintertime the atmosphere of the inland rural vicinity was, for the period of measurement, possibly dominated by a smaller loading of soil dust aerosols than in early summer. Hence, the major aerosol loading, particularly in summer, was attributed to a mixture of both soil dust and biomass burning aerosols.
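The direct estimate is just a per-wavelength ratio of the two optical depths. A minimal sketch follows; the optical-depth values below are hypothetical placeholders, not the measured Sukhothai data:

```python
def single_scattering_albedo(tau_scat, tau_ext):
    """Single scattering albedo per wavelength: scattering / extinction optical depth."""
    return {wl: tau_scat[wl] / tau_ext[wl] for wl in tau_ext}

# Hypothetical aerosol optical depths at the five MFR-7 wavelengths (nm).
tau_scat = {415: 0.18, 500: 0.16, 615: 0.13, 673: 0.12, 870: 0.09}
tau_ext = {415: 0.20, 500: 0.18, 615: 0.15, 673: 0.14, 870: 0.11}
omega = single_scattering_albedo(tau_scat, tau_ext)
```

In practice the Rayleigh and gaseous (NO2, ozone) contributions must first be removed from the total optical depths so that the ratio reflects aerosol alone.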

Keywords: aerosol scattering optical depth, aerosol extinction optical depth, biomass burning aerosol, soil dust aerosol

Procedia PDF Downloads 402
1220 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, from input data acquisition through the definition of ontology concepts to their population. First, a core facility ontology was developed, representing the generic facility infrastructure comprised of the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
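The extend-then-populate pattern described above can be sketched with a toy triple store; all concept and instance names here are hypothetical, not taken from the airport ontologies:

```python
# Toy sketch of the ontology workflow: a core facility ontology as
# subject-predicate-object triples, extended with site-specific concepts,
# then populated with individuals. Names are invented for illustration.
core = [
    ("Facility", "hasPart", "Zone"),
    ("Zone", "hasDevice", "Meter"),
]
extension = [                       # site-specific concepts specialise core ones
    ("Terminal", "subClassOf", "Zone"),
    ("EnergyMeter", "subClassOf", "Meter"),
]
individuals = [                     # population: concrete instances
    ("T1", "instanceOf", "Terminal"),
    ("meter_42", "instanceOf", "EnergyMeter"),
]
graph = core + extension + individuals

def instances_of(cls):
    """Individuals of `cls` or of its direct subclasses."""
    subclasses = {cls} | {s for s, p, o in graph if p == "subClassOf" and o == cls}
    return [s for s, p, o in graph if p == "instanceOf" and o in subclasses]

print(instances_of("Meter"))   # finds meter_42 via the EnergyMeter subclass
```

In a real system the same pattern would be expressed in OWL/RDF rather than Python lists, but the query shows how a populated extension remains reachable through the core concepts, which is what makes the model an integration platform.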

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 444
1219 Bacteriological Safety of Sachet Drinking Water Sold in Benin City, Nigeria

Authors: Stephen Olusanmi Akintayo

Abstract:

Access to safe drinking water remains a major challenge in Nigeria, and where water is available, its quality is often in doubt. An alternative to inadequate clean drinking water has been found in treated drinking water packaged in electrically heat-sealed nylon, commonly referred to as “sachet water”. “Sachet water” is ubiquitous in Nigeria, as its selling price is within the reach of members of the low socio-economic class and setting up a production unit does not require a huge capital input. The bacteriological quality of selected “sachet water” stored at room temperature over a period of 56 days was determined to evaluate the safety of the sachet drinking water. A test for coliform bacteria was performed, and the result showed no coliform bacteria, indicating the absence of fecal contamination throughout the 56 days. Heterotrophic plate counts (HPC) were done at 14-day intervals, and the samples showed HPC between 0 cfu/mL and 64 cfu/mL. The highest count was observed on day 1. The count decreased between day 1 and day 28, while no growth was observed between day 42 and day 56. The decrease in HPC suggests the presence of residual disinfectant in the water. The organisms isolated were identified as Staphylococcus epidermidis and S. aureus. The presence of these microorganisms in sachet water is indicative of contamination during processing and handling.

Keywords: coliform, heterotrophic plate count, sachet water, Staphylococcus aureus, Staphylococcus epidermidis

Procedia PDF Downloads 334
1218 The Use of Building Energy Simulation Software in Case Studies: A Literature Review

Authors: Arman Ameen, Mathias Cehlin

Abstract:

The use of Building Energy Simulation (BES) software has increased in the last two decades, in parallel with the growth of computing power and the development of easy-to-use software applications. This type of software is primarily used to simulate the energy use and the indoor environment of a building. The rapid development of these tools has improved their user-friendliness, broadened their parameter input options, and increased the possibilities for analysis, both for a single building component and for an entire building. This, in turn, has led many researchers to utilize BES software in their research to various degrees. The aim of this paper is to carry out a literature review concerning the use of the BES software IDA Indoor Climate and Energy (IDA ICE) in the scientific community. The focus of this paper is specifically the use of the software for whole-building energy simulation: the number and types of articles and their publication dates, the area of application, the types of parameters used, the location of the studied building, the type of building, the type of analysis, and the solution methodology. Another aspect examined, which is of great interest, is the method of validation of the simulation results. The results show an upward trend in the use of IDA ICE and that researchers use the software to various degrees depending on the case and the aim of their research. The level of validation of the simulations carried out in these articles varies depending on the type of article and the type of analysis.

Keywords: building simulation, IDA ICE, literature review, validation

Procedia PDF Downloads 128
1217 Basic Examination of Easily Distinguishable Tactile Symbols Attached to Containers and Packaging

Authors: T. Nishimura, K. Doi, H. Fujimoto, Y. Hoshikawa, T. Wada

Abstract:

In Japan, reasonable accommodation for persons with disabilities is expected to progress further. In particular, there is an urgent need to enhance information support for visually impaired persons, who have difficulty accessing information. Recently, tactile symbols have been attached to various surfaces, such as the content labels of containers and packaging of various everyday products. The advantage of tactile symbols is that they are useful for visually impaired persons who cannot read Braille. The method of displaying tactile symbols is prescribed by the International Organization for Standardization (ISO); however, quantitative data on the shapes and dimensions of tactile symbols are insufficient. In this study, through evaluation experiments, we examine easily distinguishable shapes and dimensions of tactile symbols used for various applications, including the content labels on containers and packaging. The participants were visually impaired persons who use tactile symbols on a daily basis. The details and procedures of the experiments were explained orally to the participants beforehand, and their informed consent was obtained. They were instructed to touch the test pieces of tactile symbols freely with both hands. These tactile symbols were selected as likely to be easily distinguishable on the content labels of the top surfaces of containers and packaging, based on a hearing survey involving employees of an organization of visually impaired persons and of a social welfare corporation, as well as academic experts in assistive technology for the visually impaired. The participants then rated the ease of distinguishing the tactile symbols on a scale of 1 to 5 (where 1 corresponded to ‘difficult to distinguish’ and 5 to ‘easy to distinguish’). Oral free-answer hearing surveys were also conducted with the participants after the experiments.
This study revealed shapes and dimensions of easily distinguishable tactile symbols attached to containers and packaging. We expect this knowledge to contribute to improving the quality of life of visually impaired persons.

Keywords: visual impairment, accessible design, tactile symbol, containers and packaging

Procedia PDF Downloads 213
1216 Reformulation of Theory of Critical Distances to Predict the Strength of Notched Plain Concrete Beams under Quasi Static Loading

Authors: Radhika V., J. M. Chandra Kishen

Abstract:

The theory of critical distances (TCD), owing to its appealing characteristics, has been successfully used in the past to predict the strength of brittle as well as ductile materials weakened by the presence of stress risers, under both static and fatigue loading. Utilising most of the TCD's unique features, this paper summarises an attempt to reformulate the point method of the TCD to predict the strength of notched plain concrete beams under mode I quasi-static loading. The zone of microcracks responsible for the non-linearity of concrete is taken into account through the concept of an effective elastic crack. An attempt is also made to correlate the value of the material characteristic length required for the application of the TCD with the maximum aggregate size of the concrete mix, eliminating the need for extensive experimentation prior to applying the TCD. The devised reformulation and the proposed power-law-based relationship are found to yield satisfactory predictions of the static strength of notched plain concrete beams, with the geometric dimensions of the beam, the tensile strength, and the maximum aggregate size of the concrete mix being the only input parameters needed.
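For background, the standard point method of the TCD (given here in its conventional form, not necessarily the exact expressions of the paper's reformulation) predicts failure when the linear-elastic stress at a distance $L/2$ ahead of the notch tip reaches the inherent material strength $\sigma_0$, with the critical distance $L$ defined from the fracture toughness $K_{Ic}$:

```latex
\sigma\!\left(r = \tfrac{L}{2}\right) = \sigma_0,
\qquad
L = \frac{1}{\pi}\left(\frac{K_{Ic}}{\sigma_0}\right)^{2}
```

The proposed power law then replaces the experimentally determined characteristic length $L$ with a function of the maximum aggregate size, which is the sense in which extensive prior testing is avoided.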

Keywords: characteristic length, effective elastic crack, inherent material strength, mode I loading, theory of critical distances

Procedia PDF Downloads 95
1215 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as the foreground object and the background. It is a critical technique in both image processing and computer vision. Most image segmentation algorithms have been developed for gray-scale images; comparatively little research has addressed color images. Segmentation techniques also vary with the input data and the application, and few of them are suitable for noisy environments. Much of the existing work uses Markov Random Fields (MRF), which are computationally demanding but regarded as robust to noise. In recent years, image segmentation has been applied to problems such as simplifying the processing of an image, interpreting the contents of an image, and easing its analysis. This article reviews and summarizes image segmentation techniques and algorithms developed over the past years, including convolutional neural networks (CNN), edge-based techniques, region growing, clustering, and thresholding. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed, and the article addresses applications and potential future developments around image segmentation. The review concludes that no single technique is perfectly suitable for segmenting all types of images, but that hybrid techniques yield more accurate and efficient results.
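One of the simplest families reviewed, histogram thresholding, can be sketched with Otsu's method; the gray-level data below are synthetic:

```python
# Minimal sketch of Otsu thresholding: choose the gray level that
# maximizes the between-class variance of background vs. foreground.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0            # cumulative intensity of the background class
    w_b = 0                # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                  # background mean
        m_f = (sum_all - sum_b) / w_f      # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated gray-level populations:
img = [10] * 50 + [200] * 50
t = otsu_threshold(img)
mask = [p > t for p in img]   # binary foreground mask
```

Edge-based, region-growing, and clustering methods replace this global histogram criterion with local or spatial ones, which is the trade-off the review surveys.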

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 87
1214 Identification of Training Topics for the Improvement of the Relevant Cognitive Skills of Technical Operators in the Railway Domain

Authors: Giulio Nisoli, Jonas Brüngger, Karin Hostettler, Nicole Stoller, Katrin Fischer

Abstract:

Technical operators in the railway domain are experts responsible for the supervisory control of the railway power grid as well as of the railway tunnels. The technical systems used to master these demanding tasks are constantly increasing in their degree of automation, which makes it difficult for technical operators to maintain control over the technical systems and the processes of their job. In particular, operators must have the experience and knowledge necessary to deal with malfunctions or unexpected events. For this reason, it is of growing importance that the skills relevant to the job are maintained and further developed beyond the basic training, which covers technical knowledge and work with guidelines. Training methods aimed at improving the cognitive skills needed by technical operators are still missing and must be developed. The goals of the present study were to identify the relevant cognitive skills of technical operators in the railway domain and to define the topics that training of these skills should address. Observational interviews were conducted to identify the main tasks and the organization of the work of technical operators, as well as the technical systems used for the execution of their job. Based on this analysis, the most demanding tasks of technical operators could be identified and described. The cognitive skills involved in executing these tasks are the ones that need to be trained. To identify and analyze these cognitive skills, a cognitive task analysis (CTA) was carried out; CTA specifically aims at identifying the cognitive skills that employees apply when performing their tasks. The identified cognitive skills of technical operators were summarized and grouped into training topics, and specific goals were defined for every topic.
The goals cover the three main categories to be trained in every topic: knowledge, skills, and attitude. Based on the results of this study, it is possible to develop specific training methods to train the relevant cognitive skills of technical operators.

Keywords: cognitive skills, cognitive task analysis, technical operators in the railway domain, training topics

Procedia PDF Downloads 146
1213 When Conducting an Analysis of Workplace Incidents, It Is Imperative to Meticulously Calculate Both the Frequency and Severity of Injuries Sustained

Authors: Arash Yousefi

Abstract:

Experts suggest that relying exclusively on a single parameter to describe a situation or establish a condition may not be adequate. Assessing incidents in a system on the basis of accident parameters alone, such as accident frequency, lost workdays, or fatalities, is not always precise and is occasionally erroneous. The accident frequency rate is a metric that relates the number of accidents causing lost work time due to injuries to the total working hours of personnel over a year. Traditionally this rate was calculated per one million working hours, but the U.S. Occupational Safety and Health Administration (OSHA) has updated its standards, and a base of 200,000 working hours is now used to compute the frequency rate. It is crucial that the total working hours of employees are represented consistently when calculating individual event and incident numbers. The accident severity rate measures the work time lost in a given period, usually a year, relative to the total number of working hours; it expresses the hours lost as a share of the total useful working hours and thereby gives the number of days lost due to work-related incidents per working hour. Calculating severity is difficult when a worker suffers permanent disability or death; in those cases, lost days are taken from the tables of equivalent days for disabling injuries specified in the OSHA or ANSI standards. The accident frequency rate thus expresses how often accidents occur, while the accident severity rate expresses the extent of the damage and injury they cause; both coefficients are crucial for accurately assessing the magnitude and impact of accidents.
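The two rates can be computed directly; the injury counts and hours below are assumed for illustration:

```python
# OSHA-style incident rates normalised to 200,000 hours worked
# (roughly 100 full-time workers over one year). Input values are assumed.
BASE_HOURS = 200_000

def frequency_rate(lost_time_injuries, hours_worked):
    """Lost-time injuries per 200,000 hours worked."""
    return lost_time_injuries * BASE_HOURS / hours_worked

def severity_rate(days_lost, hours_worked):
    """Days lost per 200,000 hours worked."""
    return days_lost * BASE_HOURS / hours_worked

# Example: 5 lost-time injuries and 120 lost days over 500,000 hours.
fr = frequency_rate(5, 500_000)    # -> 2.0
sr = severity_rate(120, 500_000)   # -> 48.0
print(fr, sr)
```

Together the two numbers say: about two lost-time injuries per 100 worker-years, costing 48 lost days per 100 worker-years, which is the combined frequency-and-severity picture the title calls for.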

Keywords: incidents, safety, analysis, frequency, severity, injuries, determine

Procedia PDF Downloads 87
1212 Disaggregation of the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rain data are very important as input to hydrological models. Among the methods for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three resolutions (4-hourly, hourly, and 10-minute) from daily records covering a period of around 20 years. The process was carried out with the DiMoN tool, which is based on a random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, standard statistics, and exceedance probability. The tool preserved the daily rainfall values on wet days well; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value should be employed to show that their differences are reasonable. The results are encouraging, considering the tendency of generated high-resolution rainfall data to be overestimated.
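The two-sample K-S comparison used above can be sketched in a few lines (this is not the DiMoN tool itself, and the rainfall depths are made up):

```python
import bisect

# Two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance
# between the empirical CDFs of observed and disaggregated rainfall depths.
def ks_statistic(sample_a, sample_b):
    a, b = sorted(sample_a), sorted(sample_b)
    def ecdf(sorted_sample, x):
        # fraction of observations <= x
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in set(a + b))

observed      = [0.0, 1.2, 3.5, 0.4, 2.1]   # assumed 4-hourly depths (mm)
disaggregated = [0.0, 1.0, 4.0, 0.5, 2.0]
d = ks_statistic(observed, disaggregated)
print(round(d, 3))   # -> 0.2
```

The statistic `d` is then compared against the K-S critical value (or converted to a p-value) to decide whether the generated and observed distributions differ significantly.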

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 197
1211 Design of EV Steering Unit Using AI Based on Estimate and Control Model

Authors: Seong Jun Yoon, Jasurbek Doliev, Sang Min Oh, Rodi Hartono, Kyoojae Shin

Abstract:

Electric power steering (EPS), now commonly used in electric vehicles, is an electrically driven steering device. Compared to hydraulic systems, EPS offers advantages such as simpler system components, easier maintenance, and improved steering performance. However, because the EPS system is a nonlinear model, controller design is difficult. To address this, various machine learning and artificial intelligence approaches, notably artificial neural networks (ANN), have been applied; an ANN can effectively determine the relationships between inputs and outputs in a data-driven manner. This research explores two main areas: designing an EPS identifier using an ANN-based backpropagation (BP) algorithm and enhancing the EPS system controller with an ANN-based Levenberg-Marquardt (LM) algorithm. The proposed ANN-based BP algorithm shows superior performance and accuracy compared to linear transfer function estimators, while the LM algorithm offers better input angle reference tracking and faster response times than traditional PID controllers. Overall, the proposed ANN methods demonstrate significant promise for improving EPS system performance.
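The identification idea can be illustrated with a toy backpropagation loop; this is not the authors' identifier, and the target function, network size, and learning rate are all assumptions chosen so the example stays tiny:

```python
import math, random

# Toy ANN identification by backpropagation: one hidden tanh layer,
# trained by plain SGD to approximate the nonlinear map y = x^2 on [-1, 1].
random.seed(0)
H = 8                                              # hidden units (assumed)
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0
data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]

lr = 0.05
for _ in range(2000):
    for x, y in data:
        h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
        y_hat = sum(w2[i] * h[i] for i in range(H)) + b2
        err = y_hat - y
        for i in range(H):                         # chain rule (backprop)
            dh = err * w2[i] * (1 - h[i] ** 2)
            w2[i] -= lr * err * h[i]
            w1[i] -= lr * dh * x
            b1[i] -= lr * dh
        b2 -= lr * err

pred = sum(w2[i] * math.tanh(w1[i] * 0.5 + b1[i]) for i in range(H)) + b2
print(round(pred, 2))   # should be close to the true value 0.25
```

An EPS identifier replaces the scalar map with measured steering inputs and outputs; Levenberg-Marquardt training differs only in how the weight updates are computed, blending gradient descent with a Gauss-Newton step.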

Keywords: ANN backpropagation modelling, electric power steering, transfer function estimator, electric vehicle driving system

Procedia PDF Downloads 26
1210 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network

Authors: Nasrin Bakhshizadeh, Ashkan Forootan

Abstract:

A major problem affecting quality control in industrial polymerization is the lack of suitable on-line measurement tools to evaluate polymer properties such as the melt and density indices. Conventionally, the polymerization is controlled manually by taking samples, measuring the polymer quality in the lab, and recording the results. This method is highly time-consuming and leads to the production of large quantities of off-specification product. The online application for estimating the melt index and density proposed in this study is a neural network based on the input-output data of a polyethylene production plant. The temperature, the level of the reactors' bed, the mass flow rates of ethylene, hydrogen, and butene-1, and the molar concentrations of ethylene, hydrogen, and butene-1 are used to establish the neural model of the process. The neural network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulation results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, successfully predicts those quality properties.

Keywords: polyethylene, polymerization, density, melt index, neural network

Procedia PDF Downloads 141
1209 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One common application of Monte Carlo simulation is the estimation of recoverable hydrocarbon from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process, and finds application whenever one needs to make an estimate, forecast, or decision under significant uncertainty. First, the project performs Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, a simulation algorithm was developed in MATLAB; the program prompts the user for the input distributions, the parameters associated with each distribution (mean, standard deviation, minimum, maximum, most likely, etc.), and the desired probability level at which reserves are to be calculated. The algorithm developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting were used to generate better output. With PyQt's Qt Designer, code for a simple graphical user interface was also written. The resulting plot is then validated against the results from the U.S. DOE MonteCarlo software.
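The core of such a workflow can be sketched with a volumetric Monte Carlo estimate; the distributions and parameter values below are invented for illustration and are not the paper's data set:

```python
import random

# Hedged sketch of probabilistic reserve estimation: sample uncertain
# volumetric inputs, compute recoverable oil per trial, and read off
# percentiles. N = 7758 * A * h * phi * (1 - Sw) * RF / Bo (bbl).
random.seed(42)

def sample_reserves():
    area      = random.triangular(400, 800, 600)       # acres (min, max, mode)
    thickness = random.triangular(20, 60, 40)          # ft
    porosity  = random.triangular(0.10, 0.25, 0.18)
    sw        = random.triangular(0.20, 0.40, 0.30)    # water saturation
    rf        = random.triangular(0.15, 0.35, 0.25)    # recovery factor
    bo        = 1.2                                    # formation volume factor
    return 7758 * area * thickness * porosity * (1 - sw) * rf / bo

trials = sorted(sample_reserves() for _ in range(10_000))
p90 = trials[int(0.10 * len(trials))]   # 90% probability of exceeding
p50 = trials[int(0.50 * len(trials))]
p10 = trials[int(0.90 * len(trials))]
print(f"P90={p90:,.0f}  P50={p50:,.0f}  P10={p10:,.0f} bbl")
```

The P90/P50/P10 values are exactly the "reserves at a desired probability" the program prompts for; the MATLAB and Python implementations in the paper follow the same sample-then-rank structure with user-supplied distributions.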

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 375
1208 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

In recent years, demand for healthcare services has increased dramatically. As demand increases, so does the necessity of constructing new healthcare buildings and redesigning and renovating existing ones. Increasing demand necessitates the use of optimization techniques to improve overall service efficiency in healthcare settings; however, the high complexity of care processes remains the major obstacle to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department (ED). The ProM framework is used to discover clinical pathway patterns and relationships between activities, and the sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results then served as input for the next phase, the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicated that a carefully designed layout can significantly decrease the distances that patients must travel.
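The layout step can be illustrated with a toy assignment problem; the paper uses mixed integer programming on mined patient flows, whereas the sketch below solves a tiny instance by brute force, with all flow and distance values invented:

```python
from itertools import permutations

# Toy facility layout: assign medical centers to candidate locations so
# that flow-weighted travel distance is minimal. Flows between centers
# would come from process-mining transition frequencies; values assumed.
centers = ["triage", "xray", "lab"]
flow = {("triage", "xray"): 30, ("triage", "lab"): 10, ("xray", "lab"): 5}
dist = [[0, 1, 2],           # pairwise distances between the 3 locations
        [1, 0, 1],
        [2, 1, 0]]

def cost(assign):
    # assign maps each center to a location index
    return sum(f * dist[assign[a]][assign[b]] for (a, b), f in flow.items())

best = min(permutations(range(3)), key=lambda p: cost(dict(zip(centers, p))))
layout = dict(zip(centers, best))
print(layout, cost(layout))
```

With realistic numbers of centers, brute force becomes infeasible and the same objective is handed to a MIP solver, which is the role of the optimization model in the paper.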

Keywords: mixed integer programming, facility layout problem, process mining, healthcare operation management

Procedia PDF Downloads 337
1207 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models were developed, including logistic regression, k-nearest neighbors, support vector machine, random forest, and neural network. The data were divided into training and testing sets: the training sets were used to build the predictive models, and the testing sets to assess the accuracy of prediction. Key risk factors were identified, and the models were compared to select the best prediction model. Among them, the random forest model performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer's. Across all the models, at least 4 of the 5 models agreed on the diagnosis for 90.42% of the testing inputs. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to earlier treatment of these patients.
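The 4-of-5 agreement measure reported above can be sketched as follows; the per-model predictions are made up, not taken from the study:

```python
from collections import Counter

# Sketch of the cross-model agreement measure: for each test case, check
# whether at least k of the 5 classifiers return the same label.
predictions = [            # one row per test case, one column per model
    [1, 1, 1, 1, 0],       # 4/5 agree -> counts as agreement
    [1, 0, 1, 0, 1],       # 3/5 agree -> does not
    [0, 0, 0, 0, 0],       # 5/5 agree
]

def agreement_fraction(rows, k=4):
    agree = sum(1 for row in rows
                if Counter(row).most_common(1)[0][1] >= k)
    return agree / len(rows)

print(agreement_fraction(predictions))   # 2 of the 3 cases agree
```

Applied to the study's 5 models over the full testing set, this fraction is the reported 90.42%.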

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 139