Search results for: input shaping

1111 An Image Enhancement Method Based on Curvelet Transform for CBCT Images

Authors: Shahriar Farzam, Maryam Rastgarpour

Abstract:

Image denoising plays an extremely important role in digital image processing, and curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a method for contrast enhancement of cone beam CT (CBCT) images based on the fast discrete curvelet transform (FDCT) computed via the unequally spaced fast Fourier transform (USFFT). This transform returns a table of curvelet coefficients indexed by a scale parameter, an orientation and a spatial location, and the coefficients obtained from FDCT-USFFT can accordingly be modified to enhance contrast in an image. Our proposed method first applies this two-dimensional transform, the FDCT via USFFT, to the input image and then applies thresholding to the curvelet coefficients to enhance the CBCT images. Applying the unequally spaced fast Fourier transform leads to an accurate reconstruction of the image at high resolution. The experimental results indicate that the performance of the proposed method is superior to existing methods in terms of Peak Signal-to-Noise Ratio (PSNR) and the Effective Measure of Enhancement (EME).
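
Since the abstract does not name a curvelet implementation, the sketch below illustrates the same hard-thresholding idea on a multiscale transform, using PyWavelets as a runnable stand-in for FDCT-USFFT; the threshold fraction and gain are assumptions, not the paper's values.

```python
import numpy as np
import pywt

def enhance(image, wavelet="db4", level=3, frac=0.05, gain=1.5):
    # Multiscale decomposition (wavelet stand-in for the curvelet transform).
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    out = [coeffs[0]]  # keep the coarse approximation untouched
    for detail in coeffs[1:]:
        bands = []
        for band in detail:
            t = frac * np.abs(band).max()
            # Hard-threshold weak (noise-dominated) coefficients and mildly
            # amplify the survivors to boost contrast.
            bands.append(np.where(np.abs(band) >= t, gain * band, 0.0))
        out.append(tuple(bands))
    return pywt.waverec2(out, wavelet)

enhanced = enhance(np.random.rand(128, 128))  # stand-in for a CBCT slice
```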

Keywords: curvelet transform, CBCT, image enhancement, image denoising

Procedia PDF Downloads 288
1110 Implant Operation Guiding Device for Dental Surgeons

Authors: Daniel Hyun

Abstract:

Dental implants are one of the top three reasons dentists are sued for malpractice, owing to dental implant complications, usually caused by the angle of the implant from the surgery. At present, surgeons usually use a 3D-printed surgical guide customized for the patient's teeth. However, such guides cannot be reused for other patients, and they take time to produce. Therefore, I made a guiding device to assist the surgeon in implant operations. The surgeon can input the objective of the operation, and the device constantly checks whether the surgery is heading towards the objective within the set range, signaling the surgeon through an LED. We tested the prototypes' consistency and accuracy by checking the output graph, the average standard deviation, and the average change of the calculated angles. The accuracy of performance was also assessed by running the device and checking the outputs. My first prototype used the accelerometer and gyroscope of the Arduino MPU6050 sensor, producing an unstable graph and achieving a standard deviation of 0.0295, an average change of 0.25, and 66.6% accuracy of performance. The second prototype used only the gyroscope; it produced a stable graph and achieved a standard deviation of 0.0062, an average change of 0.075, and 100% accuracy of performance, indicating that the accelerometer degraded the functionality of the device. Using only the gyroscope allowed the device to measure the orientations of separate axes without their affecting each other, and also increased the stability and accuracy of the measurements.
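
A minimal sketch of the gyroscope-only logic of the second prototype, written in Python rather than Arduino firmware; the target angle, tolerance window, and sensor call are assumptions introduced for illustration.

```python
import time

TARGET_ANGLE = 15.0   # degrees; the surgeon's objective (assumed parameter)
TOLERANCE = 2.0       # acceptable deviation window (assumed)

def read_gyro_rate():
    """Placeholder for the MPU6050 driver call returning deg/s about one axis."""
    return 0.0

angle, last = 0.0, time.time()
for _ in range(200):                            # one short monitoring window
    now = time.time()
    angle += read_gyro_rate() * (now - last)    # integrate rate into orientation
    last = now
    in_range = abs(angle - TARGET_ANGLE) <= TOLERANCE
    print("LED: on" if in_range else "LED: off")  # the device drives an LED here
    time.sleep(0.01)
```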

Keywords: implant, guide, accelerometer, gyroscope, handpiece

Procedia PDF Downloads 31
1109 Human Action Recognition Using Variational Bayesian HMM with Dirichlet Process Mixture of Gaussian Wishart Emission Model

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we present a human action recognition method using a variational Bayesian HMM with a Dirichlet process mixture (DPM) of Gaussian-Wishart emission models (GWEM). First, we define the Bayesian HMM based on the Dirichlet process, which allows an infinite number of Gaussian-Wishart components to support continuous emission observations. Second, we consider an efficient variational Bayesian inference method that can be applied to derive the posterior distribution of the hidden variables and model parameters from training data. We then derive the predictive distribution that may be used to classify new actions. Third, the paper proposes a process for extracting appropriate spatio-temporal feature vectors that can be used to recognize a wide range of human behaviors from input video. Finally, we conduct experiments to evaluate the performance of the proposed method. The experimental results show that the presented method is more effective for human action recognition than existing methods.
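
The DPM prior over emission components can be illustrated with the standard truncated stick-breaking construction; this is a generic sketch of the prior, not the authors' variational inference code.

```python
import numpy as np

def stick_breaking_weights(alpha, truncation, seed=0):
    """Truncated stick-breaking construction of Dirichlet process mixture weights."""
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=truncation)
    # Each weight is the broken piece of whatever stick length remains.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining  # sums to < 1; mass beyond the truncation is dropped

w = stick_breaking_weights(alpha=2.0, truncation=20)
print(w[:5], w.sum())
```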

Keywords: human action recognition, Bayesian HMM, Dirichlet process mixture model, Gaussian-Wishart emission model, variational Bayesian inference, prior and approximate posterior distributions, KTH dataset

Procedia PDF Downloads 344
1108 Interpretable Deep Learning Models for Medical Condition Identification

Authors: Dongping Fang, Lian Duan, Xiaojing Yuan, Mike Xu, Allyn Klunder, Kevin Tan, Suiting Cao, Yeqing Ji

Abstract:

Accurate prediction of a medical condition with direct clinical evidence is a long-sought goal in the medical management and health insurance fields. Although great progress has been made with machine learning algorithms, the medical community remains, to a certain degree, suspicious about model accuracy and interpretability. This paper presents an innovative hierarchical attention deep learning model that achieves good prediction and clear interpretability that can be easily understood by medical professionals. The model uses a hierarchical attention structure that matches the medical history data structure naturally and reflects the member's encounter (date of service) sequence. The attention structure consists of three levels: (1) attention on the medical code types (diagnosis codes, procedure codes, lab test results, and prescription drugs), (2) attention on the sequential medical encounters within a type, and (3) attention on the medical codes within an encounter and type. The model is applied to predict the occurrence of stage 3 chronic kidney disease (CKD3), using three years of medical history of Medicare Advantage (MA) members from a top health insurance company. The model takes members' medical events, both claims and electronic medical record (EMR) data, as input, predicts CKD3, and calculates the contribution of individual events to the predicted outcome. The model outcome can be easily explained with the clinical evidence identified by the model algorithm. Two examples: Member A had 36 medical encounters in the past three years: multiple office visits, lab tests and medications. The model predicts member A has a high risk of CKD3, with the following strongly contributing clinical events: multiple high 'Creatinine in Serum or Plasma' tests and multiple low 'Glomerular filtration rate' kidney-function tests. Among the abnormal lab tests, more recent results contributed more to the prediction. The model also indicates that regular office visits, no abnormal findings in medical examinations, and taking proper medications decreased the CKD3 risk. Member B had 104 medical encounters in the past three years and was predicted to have a low risk of CKD3, because the model did not identify diagnoses, procedures, or medications related to kidney disease, and many lab test results, including 'Glomerular filtration rate', were within the normal range. The model accurately predicts members A and B and provides interpretable clinical evidence that is validated by clinicians. Without extra effort, the interpretation is generated directly from the model and presented together with the occurrence date. Our model uses the medical data in its most raw format without any further aggregation, transformation, or mapping. This greatly simplifies the data preparation process, mitigates the chance of error, and eliminates the post-modeling work needed for traditional model explanation. To our knowledge, this is the first paper on an interpretable deep-learning model that uses a 3-level attention structure, sources both EMR and claims data, includes all four types of medical data, covers the entire Medicare population of a big insurance company, and, more importantly, directly generates model interpretation to support user decisions. In the future, we plan to enrich the model input by adding patients' demographics and information from free-text physician notes.
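
A schematic numpy sketch of the three-level attention idea (codes within an encounter, encounters within a type, code types); the dimensions, scoring function, and random inputs are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

def softmax(s):
    e = np.exp(s - s.max())
    return e / e.sum()

def attend(vectors, query):
    """Pool a set of vectors with attention weights derived from a learned query."""
    w = softmax(vectors @ query)
    return w @ vectors, w

rng = np.random.default_rng(0)
d = 8
q_code, q_enc, q_type = rng.normal(size=(3, d))   # stand-ins for learned queries

type_vecs, type_names = [], ["diagnosis", "procedure", "lab", "drug"]
for _ in type_names:
    # level 3: codes within each encounter (of this type) -> encounter vectors
    encounters = [rng.normal(size=(rng.integers(2, 5), d)) for _ in range(3)]
    enc_vecs = np.stack([attend(codes, q_code)[0] for codes in encounters])
    # level 2: encounters within the type -> one type vector
    type_vecs.append(attend(enc_vecs, q_enc)[0])

# level 1: code types -> member representation feeding the CKD3 prediction
member_vec, type_w = attend(np.stack(type_vecs), q_type)
print(dict(zip(type_names, type_w.round(3))))  # weights read as contributions
```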

Keywords: deep learning, interpretability, attention, big data, medical conditions

Procedia PDF Downloads 85
1107 Fire Characteristics of Commercial Flame Retardant Polycarbonate under Different Oxygen Concentrations: Ignition Time and Heat Blockage

Authors: Xuelin Zhang, Shouxiang Lu, Changhai Li

Abstract:

Commercial flame-retardant polycarbonate samples, the main interior carriage material of high-speed trains, with different thicknesses were investigated in a Fire Propagation Apparatus under different external heat fluxes and oxygen concentrations from 12% to 40%, in order to study the fire characteristics and quantitatively analyze the ignition time, mass loss rate, and heat blockage. The additives of the commercial flame-retardant polycarbonate were intumescent, and the samples maintained a steady height before ignition when heated. The results showed that the transformed ignition time (1/t_ig)^n increased linearly with external heat flux under different oxygen concentrations after deducting the heat blockage due to pyrolysis products; that the mass loss rate varied linearly with external heat flux, with the slope of the fitted line decreasing as the oxygen concentration was enhanced; and that the heat blockage, independent of external heat flux, rose with increasing oxygen concentration. The acquired data, as input to fire simulation models, are most important for evaluating the fire risk of commercial flame-retardant polycarbonate.
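
A small sketch of the linear-fit step behind "(1/t_ig)^n increases linearly with external heat flux", with illustrative data and n = 0.5, a common exponent for thermally thick solids (the paper's exponent and data are not given in the abstract).

```python
import numpy as np

q_ext = np.array([25.0, 35.0, 50.0, 65.0])   # external heat flux, kW/m^2 (assumed)
t_ig = np.array([210.0, 120.0, 62.0, 38.0])  # ignition time, s (assumed)
n = 0.5                                       # assumed transform exponent

y = (1.0 / t_ig) ** n
slope, intercept = np.polyfit(q_ext, y, 1)   # linear fit of (1/t_ig)^n vs flux
# The flux-axis intercept estimates the critical heat flux for ignition.
q_crit = -intercept / slope
print(f"slope={slope:.4g}, critical heat flux ~ {q_crit:.1f} kW/m^2")
```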

Keywords: ignition time, mass loss rate, heat blockage, fire characteristic

Procedia PDF Downloads 277
1106 Educational Experiences in Engineering in the COVID Era and Their Comparative Analysis, Spain, March to June 2020

Authors: Borja Bordel, Ramón Alcarria, Marina Pérez

Abstract:

In March 2020, an unexpected sanitary crisis caused by COVID-19 was declared in Spain. All of a sudden, all degrees, classes, evaluation tests and projects had to be transformed into online activities. However, the chaotic situation generated by such a complex operation, executed without any well-established procedure, led to very different experiences and, ultimately, results. In this paper, we describe three experiences at two different universities in Madrid: on the one hand, the Technical University of Madrid, a public university with little experience in online education; on the other hand, Alfonso X el Sabio University, a private university with more than five years of experience in online teaching. All analyzed subjects were related to computer engineering. Professors and students answered a survey, and personal interviews were also carried out. Besides, the professors' workload and the students' academic results were compared. From the comparative analysis of all these experiences, we extract the most successful strategies, methodologies, and activities. The recommendations in this paper will be useful for courses during the coming months, while the sanitary situation is still affecting educational organizations, and will at the same time serve as input for the upcoming digitalization process of higher education.

Keywords: educational experience, online education, higher education digitalization, COVID, Spain

Procedia PDF Downloads 130
1105 Sustainable Business Model Archetypes – A Systematic Review and Application to the Plastic Industry

Authors: Felix Schumann, Giorgia Carratta, Tobias Dauth, Liv Jaeckel

Abstract:

In the last few decades, the rapid growth in the use and disposal of plastic items has led to their overaccumulation in the environment. As a result, plastic pollution has become a subject of global concern. Today, plastics are used as raw materials in almost every industry. While the recognition of the ecological, social, and economic impact of plastics in academic research is on the rise, the potential role of the 'plastic industry' in dealing with such issues is still largely underestimated. The literature on sustainable plastic management is therefore still nascent and fragmented. Working towards sustainability requires a fundamental shift in the way companies employ plastics in their day-to-day business. For that reason, the applicability of the business model concept has recently gained momentum in environmental research. Business model innovation is increasingly recognized as an important driver to re-conceptualize the purpose of the firm and to readily integrate sustainability into its business. It can serve as a starting point to investigate whether and how sustainability can be realized under industry- and firm-specific circumstances. Yet, there is no comprehensive view in the plastic industry of how firms start refining their business models to embed sustainability in their operations. Our study addresses this gap, looking primarily at the industrial sectors responsible for the production of the largest amount of plastic waste today: plastic packaging, consumer goods, construction, textile, and transport. Relying on the archetypes of sustainable business models and applying them to the aforementioned sectors, we identify companies' current strategies for making their business models more sustainable. Based on this thematic clustering, we develop an integrative framework for the plastic industry. The findings are underpinned and illustrated by a variety of relevant plastic management solutions that the authors have identified through a systematic literature review and an analysis of existing, empirically grounded research in this field. Using the archetypes, we promote options for business model innovations for the most important sectors in which plastics are used. Moreover, by linking the proposed business model archetypes to the plastic industry, our research approach guides firms in exploring sustainable business opportunities. Likewise, researchers and policymakers can utilize our classification to identify best practices. The authors believe that the study advances the current knowledge on sustainable plastic management through its broad empirical industry analyses. Hence, the application of business model archetypes in the plastic industry will be useful for shaping companies' transformation towards creating and delivering more sustainability, and it provides avenues for future research endeavors.

Keywords: business models, environmental economics, plastic management, plastic pollution, sustainability

Procedia PDF Downloads 86
1104 New Recipes of Communication in the New Linguistic World Order: End of the Road for Aged Pragmatics

Authors: Shailendra Kumar Singh

Abstract:

With the rise of the New Linguistic World Order in the 21st century, Aged Pragmatics is palpitating on the edge of theoretical irrelevance. The new sociolinguistic reality is that the enlightening combination of the alternative west, inclusive globalization and the techno-revolution is adding novel recipes of communicative action, style and gain among a new linguistic breed that is neither dominated nor powered by western supremacy. The paper has the following main, interrelated aims: it introduces the concept of an alternative pragmatics that can offer what is needed for our emerging societal realities; it asserts how the basic pillar of linguistic success in the new linguistic world order rests upon the linguistic temptation and calibration of all; and it reviews the inevitability of emerging economies in shaping communication trends at a time when the western world is struggling to maintain the control over others that it exercised in the past. In particular, the paper seeks answers to the following questions: (a) Do we need an alternative pragmatics, one with an alternativist leaning, in an era of inclusive globalization and the alternative west? (b) What are the pulses of shift that are encapsulating the emergence of new communicative behavior among the new linguistic breed by breaking yesterday's linguistic rigidity? (c) Or, what are the shifts that are making linguistic shift more perceptible? (d) Is the New Linguistic World Order succeeding in reversing the linguistic priorities of 'who speaks, what language, where, how, why, to whom and in which condition', with no parallel in history? (e) What is explicit about the contemporary world of the 21st century that makes the linguistic world an exciting and widely celebrated phenomenon that is forced into our vision? (f) What factors will hold the key to the future of yesterday's 'influential languages' and today's 'emerging languages' as the world is in paradigm transition? (g) Is the collapse of Aged Pragmatics good for the 21st century, for understanding the difference between the pragmatism of the old linguistic world and the New Linguistic World Order? The New Linguistic World Order today, unlike in the past, is about the branding of a new world with a liberal world view, a particular form of ideal to be imagined in the 21st century. At this time, it is hoped that a new set of ideals with a popular vocabulary will become the implicit pragmatic model, one of benign majoritarianism, in all aspects of sociolinguistic reality. We appear to live in an extraordinary linguistic world with no parallel in the past. In particular, the paper also highlights the paradigm shifts: demographic, social-psychological, technological and power-related. These shifts are driving a linguistic shift that is unique in itself. The paper details this linguistic shift, in which the alternative west plays a major role without challenging the west, because this is an era of inclusive globalization in which almost everyone takes equal responsibility.

Keywords: inclusive globalization, new linguistic world order, linguistic shift, world order

Procedia PDF Downloads 337
1103 Collaborative Governance in Dutch Flood Risk Management: An Historical Analysis

Authors: Emma Avoyan

Abstract:

The safety standards for flood protection in the Netherlands have recently been revised. It is expected that all major flood-protection structures will have to be reinforced to meet the new standards. The Dutch Flood Protection Programme aims to accomplish this task through innovative integrated projects such as the construction of multi-functional flood defenses. In these projects, flood safety purposes are combined with spatial planning, nature development, emergency management or other sectoral objectives. Implementation of dike reinforcement projects therefore requires early involvement of, and collaboration between, public and private sectors, and different governmental actors and agencies. The development and implementation of such integrated projects has long been an issue in Dutch flood risk management. This article therefore analyses how cross-sector collaboration within flood risk governance in the Netherlands has evolved over time, and how this development can be explained. The integrative framework for collaborative governance is applied as an analytical tool to map the external factors framing possibilities as well as constraints for cross-sector collaboration in the Dutch flood risk domain. Supported by an extensive document and literature analysis, the paper offers insights into how the system context and different drivers, changing over time, either promoted or hindered cross-sector collaboration between the flood protection sector, urban development, nature conservation and other sectors involved in flood risk governance. The system context refers to the multi-layered and interrelated suite of conditions that influence the formation and performance of complex governance systems, such as collaborative governance regimes, whereas the drivers initiate and enable the overall process of collaboration. In addition, by applying the method of process tracing, we identify a causal and chronological chain of events shaping cross-sectoral interaction in Dutch flood risk management. Our results indicate that, in order to evaluate the performance of complex governance systems, it is important to first study the system context that shapes them. A clear understanding of the system conditions and drivers for collaboration gives insight into the possibilities of, and constraints for, effective performance of complex governance systems. The performance of the governance system is affected by the system conditions, while at the same time the governance system can also change the system conditions. Our results show that the sequence of changes within the system conditions and drivers over time affects how cross-sector interaction in the Dutch flood risk governance system happens now. Moreover, we have traced the potential of this governance system to shape and change the system context.

Keywords: collaborative governance, cross-sector interaction, flood risk management, the Netherlands

Procedia PDF Downloads 121
1102 Urban Resilience and Its Prioritised Components: Analysis of the Industrial Township Greater Noida

Authors: N. Mehrotra, V. Ahuja, N. Sridharan

Abstract:

Resilience is an all-hazard and proactive approach that requires multidisciplinary input into the interrelated variables of the city system. This research identifies and operationalizes indicators for assessment in the domains of institutions, infrastructure and knowledge, all three operating in task-oriented community networks. The paper gives a brief account of the methodology developed for the assessment of urban resilience and its prioritized components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People's perception of urban resilience has been examined by conducting a questionnaire survey among the target population of Greater Noida. As defined by experts, the urban resilience of a place is considered to be both a product and a process of operation to regain normalcy after an event of disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both in the process of evolution and as an outcome. The relative significance of the six Rs has also been identified. The dependency factors of the various resilience indicators are explored in this paper, which helps in generating a new perspective for future research in disaster management. Based on the stated factors, this methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself, but calls for new beginnings.

Keywords: disaster, resilience, system, urban

Procedia PDF Downloads 451
1101 Theoretical Analysis of the Solid State and Optical Characteristics of Calcium Sulphide Thin Film

Authors: Emmanuel Ifeanyi Ugwu

Abstract:

Calcium sulphide, one of the chalcogenide group of thin films, is analyzed in this work using a theoretical approach in which a scalar wave is propagated through the material thin-film medium deposited on a glass substrate, with the assumption that the dielectric medium has a homogeneous reference dielectric constant term and a perturbed dielectric function representing the deposited thin-film medium on the surface of the glass substrate. These are substituted into a defined scalar wave equation that is solved by first transforming it into a Volterra equation of the second kind, using the method of separation of variables on the scalar wave, after which Green's function technique is introduced to obtain a model equation for the wave propagating through the thin film. This equation is in turn used to compute the propagated field for different input wavelengths representing the UV, visible and near-infrared regions, considering the influence of the dielectric constants of the thin film on the propagating field. The results obtained are used in turn to compute the band gaps and the solid state and optical properties of the thin film.
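
A plausible reconstruction, in standard notation, of the formulation the abstract describes (the abstract itself gives no equations, so the symbols are assumptions): the scalar wave equation with the dielectric split into a reference term and a film perturbation, and its equivalent integral form, whose kernel is the Green's function of the homogeneous reference problem.

```latex
% Scalar field \psi in a medium with reference dielectric constant
% \varepsilon_{\mathrm{ref}} and film perturbation \Delta\varepsilon(\mathbf{r}):
\nabla^{2}\psi(\mathbf{r})
  + k_{0}^{2}\,\varepsilon_{\mathrm{ref}}\,\psi(\mathbf{r})
  = -\,k_{0}^{2}\,\Delta\varepsilon(\mathbf{r})\,\psi(\mathbf{r}),
\qquad
\psi(\mathbf{r}) = \psi_{0}(\mathbf{r})
  + k_{0}^{2}\!\int G(\mathbf{r},\mathbf{r}')\,
    \Delta\varepsilon(\mathbf{r}')\,\psi(\mathbf{r}')\,d^{3}r'
```

Restricted to the propagation direction, the integral equation takes the Volterra form of the second kind mentioned in the abstract, with \(\psi_0\) the field in the unperturbed (substrate-only) medium.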

Keywords: scalar wave, dielectric constant, calcium sulphide, solid state, optical properties

Procedia PDF Downloads 99
1100 Stock Market Prediction Using Convolutional Neural Network That Learns from a Graph

Authors: Mo-Se Lee, Cheol-Hwi Ahn, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Over the past decade, deep learning has been in the spotlight among various machine learning algorithms. In particular, the CNN (convolutional neural network), known as an effective solution for recognizing and classifying images, has been popularly applied to classification and prediction problems in various fields. In this study, we apply the CNN to stock market prediction, one of the most challenging tasks in machine learning research. Specifically, we propose to apply the CNN as a binary classifier that predicts stock market direction (up or down) using a graph as its input. That is, our proposal is to build a machine learning algorithm that mimics a person who looks at a chart and predicts whether the trend will go up or down. Our proposed model consists of four steps. In the first step, it divides the dataset into windows of 5, 10, 15, and 20 days. It then creates graphs for each interval in step 2. In the next step, CNN classifiers are trained using the graphs generated in the previous step. In step 4, it optimizes the hyperparameters of the trained model using the validation dataset. To validate our model, we apply it to the prediction of the KOSPI200 over 1,986 days in eight years (from 2009 to 2016). The experimental dataset includes 14 technical indicators, such as CCI, Momentum and ROC, and the daily closing price of the KOSPI200 index of the Korean stock market.
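
A minimal sketch of the kind of binary CNN classifier described, taking rendered chart images as input; the architecture, image size, and layer widths are assumptions, not the authors' network.

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(64, 64, 1)),        # one grayscale chart image (assumed size)
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # market direction: up (1) vs down (0)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels))
```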

Keywords: convolutional neural network, deep learning, Korean stock market, stock market prediction

Procedia PDF Downloads 421
1099 Law and Its Implementation and Consequences in Pakistan

Authors: Amir Shafiq, Asif Shahzad, Shabbar Mehmood, Muhammad Saeed, Hamid Mustafa

Abstract:

Legislation comprises the laws or statutes established by a sovereign authority, which can generally be enforced by the courts of law from time to time to accomplish their objectives. Historically speaking, upon the emergence of Pakistan in 1947, the intact laws of the British Raj remained effective after being adapted to Islamic ideology. Thus, there was an intention to begin the statute book afresh in Pakistan's legal history. In consequence, the process of developing detailed plans, procedures and mechanisms to ensure that legislative and regulatory requirements are achieved began, keeping in view the cultural values and local customs. This article is an input to the enduring discussion about implementing the rule of law in Pakistan, whereas the rule of law requires a harmony of laws, mostly in the arrangement of codified state laws. Pakistan has a legally plural society, where completely different and independent systems of law, such as Mohammadan law, state law and traditional law, coexist. The law actually practiced in Pakistan is largely traditional law, though this law is not acknowledged by the state. This causes the main problem of the rule of law: the difference between state laws and cultural values. These values, customs and so-called traditional laws are the main obstacle to enforcing state law in its true letter and spirit, which has caused dissatisfaction among the masses and distrust of the judicial system of the country.

Keywords: consequences, implement, law, Pakistan

Procedia PDF Downloads 423
1098 Chemical Fabrication of Gold Nanorings: Controlled Reduction and Optical Tuning for Nanomedicine Applications

Authors: Mehrnaz Mostafavi, Jalaledin Ghanavi

Abstract:

This research investigates the production of nanoring structures through a chemical reduction approach, exploring gradual reduction processes assisted by reductant agents, leading to the formation of these specialized nanorings. The study focuses on the controlled reduction of metal atoms within these agents, crucial for shaping these nanoring structures over time. The paper commences by highlighting the wide-ranging applications of metal nanostructures across fields like Nanomedicine, Nanobiotechnology, and advanced spectroscopy methods such as Surface Enhanced Raman Spectroscopy (SERS) and Surface Enhanced Infrared Absorption Spectroscopy (SEIRA). Particularly, gold nanoparticles, especially in the nanoring configuration, have gained significant attention due to their distinctive properties, offering accessible spaces suitable for sensing and spectroscopic applications. The methodology involves utilizing human serum albumin as a reducing agent to create gold nanoparticles through a chemical reduction process. This process involves the transfer of electrons from albumin's carboxylic groups, converting them into carbonyl, while AuCl4− acquires electrons to form gold nanoparticles. Various characterization techniques like Ultraviolet–visible spectroscopy (UV-Vis), Atomic-force microscopy (AFM), and Transmission electron microscopy (TEM) were employed to examine and validate the creation and properties of the gold nanoparticles and nanorings. The findings suggest that precise and gradual reduction processes, in conjunction with optimal pH conditions, play a pivotal role in generating nanoring structures. Experiments manipulating optical properties revealed distinct responses in the visible and infrared spectrums, demonstrating the tunability of these nanorings. Detailed examinations of the morphology confirmed the formation of gold nanorings, elucidating their size, distribution, and structural characteristics. These nanorings, characterized by an empty volume enclosed by uniform walls, exhibit promising potential in the realms of Nanomedicine and Nanobiotechnology. In summary, this study presents a chemical synthesis approach using organic reducing agents to produce gold nanorings. The results underscore the significance of controlled and gradual reduction processes in crafting nanoring structures with unique optical traits, offering considerable value across diverse nanotechnological applications.

Keywords: nanoring structures, chemical reduction approach, gold nanoparticles, spectroscopy methods, nanomedicine applications

Procedia PDF Downloads 105
1097 Estimations of Spectral Dependence of Tropospheric Aerosol Single Scattering Albedo in Sukhothai, Thailand

Authors: Siriluk Ruangrungrote

Abstract:

Analyses of available data from MFR-7 measurements were performed and discussed in a study of tropospheric aerosols and their consequences in Thailand, since the aerosol single scattering albedo (ASSA, ω) is one of the most important parameters for determining the aerosol effect on radiative forcing. Here, ω was determined directly as the ratio of the aerosol scattering optical depth to the aerosol extinction optical depth (τscat/τext), without any use of aerosol computer-code models. This has the benefit of eliminating the uncertainty caused by modeling assumptions and of estimating from the actual aerosol input data. The diurnal ω of five cloudless days in winter and early summer was investigated at five distinct wavelengths (415, 500, 615, 673 and 870 nm), with Rayleigh scattering and atmospheric column NO2 and ozone contents taken into consideration. Besides, the tendency of the spectral dependence of ω representing the two seasons was observed. The character of the spectral results reveals that during wintertime the atmosphere of this inland rural vicinity was, for the period of measurement, probably dominated by a smaller loading of soil dust aerosols than in early summer. Hence, the major aerosol loading, particularly in summer, was a mixture of both soil dust and biomass burning aerosols.
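
Because the estimate is a direct ratio of optical depths, the computation is short; a tiny sketch with illustrative values at the five MFR-7 wavelengths (the τ values are assumptions, not the measured Sukhothai data).

```python
import numpy as np

wavelengths = np.array([415, 500, 615, 673, 870])    # nm
tau_scat = np.array([0.21, 0.18, 0.14, 0.12, 0.08])  # scattering optical depth (assumed)
tau_ext = np.array([0.25, 0.21, 0.17, 0.15, 0.11])   # extinction optical depth (assumed)

ssa = tau_scat / tau_ext  # omega(lambda); no radiative-transfer model required
for wl, w in zip(wavelengths, ssa):
    print(f"{wl} nm: omega = {w:.2f}")
```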

Keywords: aerosol scattering optical depth, aerosol extinction optical depth, biomass burning aerosol, soil dust aerosol

Procedia PDF Downloads 399
1096 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, starting from input data acquisition, through ontology concept definition, and finally ontology concept population. First, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
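
A minimal sketch of the "core ontology, extension, population" steps using rdflib; the namespace, class names, and instance are illustrative assumptions, not the paper's actual model.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, RDFS

FAC = Namespace("http://example.org/facility#")  # hypothetical namespace
g = Graph()
g.bind("fac", FAC)

# Core facility ontology: generic, management-relevant concepts.
g.add((FAC.Facility, RDF.type, OWL.Class))
g.add((FAC.EnergySystem, RDF.type, OWL.Class))
# Extension for a specific infrastructure type (airports).
g.add((FAC.Airport, RDF.type, OWL.Class))
g.add((FAC.Airport, RDFS.subClassOf, FAC.Facility))
# Population with a concrete instance.
g.add((FAC.Malpensa, RDF.type, FAC.Airport))
g.add((FAC.Malpensa, RDFS.label, Literal("Malpensa Airport")))

print(g.serialize(format="turtle"))
```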

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 439
1095 Always Keep in Control: The Pattern of TV Policy Changes in China

Authors: Shan Jiang

Abstract:

China is a country with a distinct cultural system, in which the Chinese Communist Party (CCP) is the central factor for everything, culture included. There are a great many cultural policies in China, and the same goes for TV dramas. This paper traces the evolution of Chinese TV drama policy since 1986, examines the realities behind the changes, and explores the structure and role of the government in shaping the process. Using historical documents and media reports, it first analyzes four key time nodes: 1986, 2003, 2012, and 2022. It shows how the policy shifted from restricting private production to opening up to public participation, from imposing one form of censorship to another, and from promoting some content to restricting other areas. It finds that the policy process is not simply rectilinear but rather wanders between deregulation and strengthened control. Secondly, it divides the policies into "basic" policies that establish the overall layout and more refined "strategic" policies that respond to more specific needs. It argues that the "basic" policy process is driven by the reform of China's political, economic, and cultural systems, while the "strategic" policy process is affected by more environmental factors, such as the government's follow-up development strategy, industrial development, technological innovation, and specific situations. Thirdly, it analyzes the issuing bodies of the 104 policies from 2000 to 2021 and places these subjects within China's power structure and cultural system, revealing that the policy issuers are all under the highest leadership of the CCP Central Committee. Further, the paper challenges the typical description of Chinese cultural policy, which focuses exclusively on state control; identifies the forces within and outside the system that participate in or affect the policy-making process; and reveals the inter-subjective mechanism of policy change. In conclusion, the paper shows that China's TV drama policy is under the unified leadership of the Party and the government, which largely guarantees the consistency of the overall direction of cultural policy, that is, keeping the right to speak firmly in hand. The forces within the system can sometimes promote policy changes due to common development needs. Folk discourse, however, is only the object of control: when it develops a certain amount of industrial space, the government strengthens control over this space to suppress its potential "adverse effects", and instead provides protection and creates conditions for cultivating and growing its mainstream discourse. The policy combination of basic and strategic policies, while having a strong effect and emergency capacity, also inhibits the innovation and diversification of the TV drama market, and the state's substantial regulation will continue to exist in the future.

Keywords: TV policy, China, policy process, cultural policy, culture management

Procedia PDF Downloads 82
1094 Bacteriological Safety of Sachet Drinking Water Sold in Benin City, Nigeria

Authors: Stephen Olusanmi Akintayo

Abstract:

Access to safe drinking water remains a major challenge in Nigeria, and where water is available, its quality is often in doubt. An alternative to inadequate clean drinking water has been found in treated drinking water packaged in electrically heat-sealed nylon, commonly referred to as "sachet water". "Sachet water" is common in Nigeria, as its selling price is within the reach of members of the low socio-economic class and setting up a production unit does not require huge capital input. The bacteriological quality of selected "sachet water" stored at room temperature over a period of 56 days was determined to evaluate the safety of the sachet drinking water. A test for the detection of coliform bacteria was performed, and the result showed no coliform bacteria, indicating the absence of fecal contamination throughout the 56 days. Heterotrophic plate counts (HPC) were done at 14-day intervals, and the samples showed HPC between 0 cfu/mL and 64 cfu/mL. The highest count was observed on day 1. The count decreased between day 1 and day 28, while no growth was observed between day 42 and day 56. The decrease in HPC suggests the presence of residual disinfectant in the water. The organisms isolated were identified as Staphylococcus epidermidis and S. aureus. The presence of these microorganisms in sachet water is indicative of contamination during processing and handling.

Keywords: coliform, heterotrophic plate count, sachet water, Staphyloccocus aureus, Staphyloccocus epidermidis

Procedia PDF Downloads 331
1093 The Use of Building Energy Simulation Software in Case Studies: A Literature Review

Authors: Arman Ameen, Mathias Cehlin

Abstract:

The use of Building Energy Simulation (BES) software has increased in the last two decades, in parallel with the development of increased computing power and easy-to-use software applications. This type of software is primarily used to simulate the energy use and indoor environment of a building. The rapid development of these types of software has raised their level of user-friendliness, provided better parameter input options, and increased the possibilities for analysis, whether of a single building component or an entire building. This, in turn, has led many researchers to utilize BES software in their research to various degrees. The aim of this paper is to carry out a literature review concerning the use of the BES software IDA Indoor Climate and Energy (IDA ICE) in the scientific community. The focus of this paper is specifically the use of the software for whole-building energy simulation: the number and types of articles and their publication dates, the area of application, the types of parameters used, the location of the studied building, the type of building, the type of analysis, and the solution methodology. Another aspect examined, which is of great interest, is the method of validation of the simulation results. The results show that there is an upward trend in the use of IDA ICE and that researchers use the software in their research to various degrees depending on the case and the aim of their research. The level of validation of the simulations carried out in these articles varies depending on the type of article and the type of analysis.

Keywords: building simulation, IDA ICE, literature review, validation

Procedia PDF Downloads 126
1092 Reformulation of Theory of Critical Distances to Predict the Strength of Notched Plain Concrete Beams under Quasi-Static Loading

Authors: Radhika V., J. M. Chandra Kishen

Abstract:

The theory of critical distances (TCD), due to its appealing characteristics, has been successfully used in the past to predict the strength of brittle as well as ductile materials weakened by the presence of stress risers, under both static and fatigue loading. Utilising most of the TCD's unique features, this paper summarises an attempt to reformulate the point method of the TCD to predict the strength of notched plain concrete beams under mode I quasi-static loading. A zone of microcracks, which is responsible for the non-linearity of concrete, is taken into account through the concept of an effective elastic crack. An attempt is also made to correlate the value of the material characteristic length required for the application of the TCD with the maximum aggregate size in the concrete mix, eliminating the need for extensive experimentation prior to the application of the TCD. The devised reformulation and the proposed power-law-based relationship are found to yield satisfactory predictions of the static strength of notched plain concrete beams, with the geometric dimensions of the beam, the tensile strength, and the maximum aggregate size of the concrete mix being the only required input parameters.
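
For reference, the point method the paper reformulates can be stated in its standard textbook form; the paper's contribution is to replace the conventional, toughness-based calibration of L with a power law in the maximum aggregate size.

```latex
% Point method of the TCD: failure occurs when the linear-elastic stress at a
% distance L/2 ahead of the notch tip reaches the inherent strength sigma_0,
% with the material characteristic length L conventionally defined as:
\sigma\!\left(r = \tfrac{L}{2}\right) = \sigma_{0},
\qquad
L = \frac{1}{\pi}\left(\frac{K_{\mathrm{Ic}}}{\sigma_{0}}\right)^{2}
```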

Keywords: characteristic length, effective elastic crack, inherent material strength, mode I loading, theory of critical distances

Procedia PDF Downloads 93
1091 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as the object's background and foreground. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has addressed color images. Most image segmentation algorithms or techniques vary based on the input data and the application, and nearly all of the techniques are unsuitable for noisy environments. Much of the work that has been done uses the Markov Random Field (MRF), which involves heavy computation and is said to be robust to noise. In recent years, image segmentation has been brought to bear on problems such as easy processing of an image, interpretation of the contents of an image, and easy analysis of an image. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed over the past years. The techniques include convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, thresholding techniques, and so on. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications of image segmentation and potential future developments around it. This review concludes that no single technique is perfectly suitable for segmenting all the different types of images, but that the use of hybrid techniques yields more accurate and efficient results.
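
As a concrete instance of one classical technique from this review, the short snippet below runs global thresholding with Otsu's method via scikit-image on a built-in test image.

```python
from skimage import data, filters

image = data.camera()               # built-in grayscale test image
t = filters.threshold_otsu(image)   # threshold separating foreground/background
mask = image > t                    # binary segmentation mask
print(f"Otsu threshold: {t}, foreground fraction: {mask.mean():.2f}")
```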

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 80
1090 Encouraging Collaboration and Innovation: The New Engineering Oriented Educational Reform in Urban Planning, Tianjin University, China

Authors: Tianjie Zhang, Bingqian Cheng, Peng Zeng

Abstract:

Engineering science, technological progress, and innovation have become an important engine for promoting social development. The "new engineering" reform exploration in China has drawn extensive attention around the world. Its connotation is "to cultivate future diversified, innovative and outstanding engineering talents by taking 'fostering character and civic virtue' as the guide, responding to changes and shaping the future as the construction concept, and inheritance and innovation, crossover and fusion, and coordination and sharing as the principal approach". In this context, Tianjin University, a traditional Chinese university with advantages in engineering, launched the CCII (Coherent-Collaborative-Interdisciplinary-Innovation) program, proposing the cultivation idea of integrating new liberal arts education, multidisciplinary engineering education and personalized professional education. As urban planning practice in China has undergone the evolution of "physical planning -- comprehensive strategic planning -- resource-management-oriented planning", planning education has also experienced the transmutation from "building foundation" through "urban scientific foundation" to "multi-disciplinary integration". As a characteristic and advantageous discipline of Tianjin University, the major of Urban and Rural Planning, in accordance with the CCII Program, aims to build China's top and a world-class major, and implements the following educational reform measures: 1. Adding corresponding English courses, such as an advanced course on GIS analysis and courses on comparative studies in international planning involving ecological resources and the sociology of the humanities. 2. Holding an "Academician Forum", inviting international academicians to give lectures or seminars to track international frontier research issues. 3. Organizing an "International Joint Workshop" to provide students with an international exchange and design practice platform. 4. Setting up a business practice base, so that students can find problems in practice and solve them in innovative ways. Through these measures, the Urban and Rural Planning major of Tianjin University has formed a talent training system with multi-disciplinary cross-integration, oriented to future science and technology.

Keywords: China, higher education reform, innovation, new engineering education, rural and urban planning, Tianjin University

Procedia PDF Downloads 113
1089 Disaggregation the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rain data are very important as input to hydrological models. Among the models for high-resolution rainfall data generation, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly and 10-minute) from daily data for a record period of around 20 years. The process was carried out with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, the usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated data are concentrated in shorter time periods and produce stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets, the p-value should be employed to establish that their differences are reasonable. The results are encouraging, considering the overestimation in the generated high-resolution rainfall data.
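
A sketch of the evaluation step, comparing observed and disaggregated depth distributions with the two-sample Kolmogorov-Smirnov test from SciPy; the gamma-distributed samples below are stand-ins for real rainfall data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
observed = rng.gamma(shape=0.6, scale=4.0, size=500)    # stand-in 4-hourly depths
simulated = rng.gamma(shape=0.55, scale=4.3, size=500)  # stand-in disaggregated depths

stat, p = stats.ks_2samp(observed, simulated)
print(f"K-S statistic = {stat:.3f}, p-value = {p:.3f}")
# A large p-value (e.g. > 0.05) gives no evidence the two CDFs differ.
```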

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 194
1088 Design of EV Steering Unit Using AI Based on Estimate and Control Model

Authors: Seong Jun Yoon, Jasurbek Doliev, Sang Min Oh, Rodi Hartono, Kyoojae Shin

Abstract:

Electric power steering (EPS), which has recently become common in electric vehicles, is an electrically driven steering device for vehicles. Compared to hydraulic systems, EPS offers advantages such as simple system components, easy maintenance, and improved steering performance. However, because the EPS system is a nonlinear model, difficult problems arise in controller design. To address these, various machine learning and artificial intelligence approaches, notably artificial neural networks (ANNs), have been applied. ANNs can effectively determine relationships between inputs and outputs in a data-driven manner. This research explores two main areas: designing an EPS identifier using an ANN-based backpropagation (BP) algorithm and enhancing the EPS system controller with an ANN-based Levenberg-Marquardt (LM) algorithm. The proposed ANN-based BP algorithm shows superior performance and accuracy compared to linear transfer function estimators, while the LM algorithm offers better input-angle reference tracking and faster response times than traditional PID controllers. Overall, the proposed ANN methods demonstrate significant promise for improving EPS system performance.
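
A minimal one-hidden-layer backpropagation identifier, sketched in numpy on a synthetic input-output map; this illustrates the BP idea only and is not the authors' EPS model (inputs, target function, and sizes are assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))          # e.g. steering angle, vehicle speed
y = np.tanh(X[:, :1] * 2.0 + X[:, 1:] * 0.5)   # stand-in for the plant response

W1, b1 = rng.normal(0, 0.5, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.1
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                   # forward pass, hidden layer
    out = h @ W2 + b2                          # forward pass, output layer
    err = out - y                              # gradient of 0.5 * MSE
    dW2 = h.T @ err / len(X); db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)             # backpropagate through tanh
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
print("final MSE:", float((err**2).mean()))
```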

Keywords: ANN backpropagation modelling, electric power steering, transfer function estimator, electric vehicle driving system

Procedia PDF Downloads 15
1087 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network

Authors: Nasrin Bakhshizadeh, Ashkan Forootan

Abstract:

A major problem affecting quality control in industrial polymerization is the lack of suitable on-line measurement tools to evaluate polymer properties such as the melt and density indices. In the conventional approach, the polymerization is controlled manually by taking samples, measuring polymer quality in the lab, and recording the results. This method is highly time-consuming and leads to the production of large quantities of nonconforming product. The online application for estimating the melt index and density proposed in this study is a neural network based on the input-output data of a polyethylene production plant. The temperature, the level of the reactors' bed, the mass flow rates of ethylene, hydrogen and butene-1, and the molar concentrations of ethylene, hydrogen and butene-1 are used to establish the neural model of the process. The neural network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulation results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, is able to successfully predict those quality properties.
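
A sketch of such a soft sensor using scikit-learn's MLPRegressor with two hidden layers (reading the four-layer melt-index network as input, two hidden, output); the feature count follows the variables listed above, but the data here are synthetic stand-ins, not plant data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# 8 assumed features: temperature, bed level, three mass flows, three concentrations.
X = rng.normal(size=(500, 8))
melt_index = X @ rng.normal(size=8) + 0.1 * rng.normal(size=500)  # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, melt_index, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```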

Keywords: polyethylene, polymerization, density, melt index, neural network

Procedia PDF Downloads 135
1086 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast or decision where there is significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given dataset using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, an algorithm for the simulation has been developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (e.g., mean, standard deviation, minimum, maximum, most likely). It also prompts the user for the desired probability level at which reserves are to be calculated. The algorithm so developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better outcomes. With Qt Designer (PyQt), code for a simple graphical user interface has also been written. The resulting plots are then validated against the results available from the U.S. DOE MonteCarlo software.
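
A compact Python sketch of the volumetric Monte Carlo workflow the abstract describes; all distributions, parameter ranges, and the recovery factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
area = rng.triangular(400, 650, 900, n)     # acres (assumed range)
thickness = rng.triangular(20, 35, 60, n)   # ft
porosity = rng.normal(0.18, 0.02, n)
sw = rng.normal(0.30, 0.05, n)              # water saturation
bo = rng.normal(1.2, 0.05, n)               # formation volume factor, rb/stb
recovery = rng.uniform(0.15, 0.35, n)       # recovery factor

# Standard volumetric equation (7758 bbl per acre-ft).
ooip = 7758 * area * thickness * porosity * (1 - sw) / bo
reserves = ooip * recovery
# Industry convention: P90 is exceeded with 90% probability (10th percentile).
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90 = {p90:.3e}, P50 = {p50:.3e}, P10 = {p10:.3e} STB")
```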

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 372
1085 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

In recent years, demand for healthcare services has dramatically increased. As the demand for healthcare services increases, so does the necessity of constructing new healthcare buildings and of redesigning and renovating existing ones. Increasing demand necessitates the use of optimization techniques to improve the overall service efficiency in healthcare settings. However, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department. The ProM framework is used to discover clinical pathway patterns and relationships between activities. A sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results served as input for the next phase, which consists of the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicated that a carefully designed layout can significantly decrease the distances that patients must travel.
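
A toy version of such a layout MIP, using PuLP with the standard linearization of the quadratic assignment objective; the centers, flows, and distances are illustrative assumptions, not the process-mining outputs from the study.

```python
import pulp

centers = ["triage", "imaging", "lab"]
slots = ["A", "B", "C"]
flow = {("triage", "imaging"): 30, ("triage", "lab"): 12, ("imaging", "lab"): 5}
dist = {("A", "B"): 10, ("A", "C"): 20, ("B", "C"): 10}
dist.update({(b, a): d for (a, b), d in dist.items()})  # symmetric distances

prob = pulp.LpProblem("ed_layout", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (centers, slots), cat="Binary")
# y[c1][c2][s1][s2] = 1 iff c1 sits in s1 and c2 in s2 (QAP linearization).
y = pulp.LpVariable.dicts("y", (centers, centers, slots, slots), cat="Binary")

for c in centers:  # each center occupies exactly one slot
    prob += pulp.lpSum(x[c][s] for s in slots) == 1
for s in slots:    # each slot holds exactly one center
    prob += pulp.lpSum(x[c][s] for c in centers) == 1
for (c1, c2) in flow:
    for s1 in slots:
        for s2 in slots:
            if s1 != s2:
                prob += y[c1][c2][s1][s2] >= x[c1][s1] + x[c2][s2] - 1

# Minimize total patient travel: flow between centers times slot distance.
prob += pulp.lpSum(f * dist[(s1, s2)] * y[c1][c2][s1][s2]
                   for (c1, c2), f in flow.items()
                   for s1 in slots for s2 in slots if s1 != s2)

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for c in centers:
    for s in slots:
        if x[c][s].value() == 1:
            print(c, "->", s)
```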

Keywords: mixed integer programming, facility layout problem, process mining, healthcare operations management

Procedia PDF Downloads 332
1084 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbors, support vector machine, random forest, and neural network, were developed. The data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the models were compared to determine the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. Across all the models used, the percentage of test inputs for which at least four of the five models agreed on the diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which can ultimately lead to early treatment of these patients.
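
A sketch of the best-performing model type reported (random forest), run on synthetic stand-ins for the three key features named in the abstract; the data generation and label rule are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 373
X = np.column_stack([
    rng.integers(16, 31, n),    # MMSE score (assumed range)
    rng.normal(0.73, 0.04, n),  # nWBV: normalized whole-brain volume (assumed)
    rng.integers(0, 2, n),      # gender, encoded 0/1
])
y = (X[:, 0] < 24).astype(int)  # synthetic dementia label for illustration only

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
print("feature importances (MMSE, nWBV, gender):", clf.feature_importances_)
```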

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 137
1083 CTHTC: A Convolution-Backed Transformer Architecture for Temporal Knowledge Graph Embedding with Periodicity Recognition

Authors: Xinyuan Chen, Mohd Nizam Husen, Zhongmei Zhou, Gongde Guo, Wei Gao

Abstract:

Temporal Knowledge Graph Completion (TKGC) has attracted increasing attention for its enormous value; however, existing models lack the ability to capture local interactions and global dependencies simultaneously along with evolutionary dynamics, and the latest achievements in convolutions and Transformers have not yet been employed in this area. Moreover, periodic patterns in TKGs have not been fully explored either. To this end, a multi-stage hybrid architecture with convolution-backed Transformers is introduced to TKGC tasks for the first time, combined with the Hawkes process to model evolving event sequences in a continuous-time domain. In addition, seasonal-trend decomposition is adopted to identify periodic patterns. Experiments on six public datasets are conducted to verify the model's effectiveness against state-of-the-art (SOTA) methods. An extensive ablation study is carried out to evaluate architecture variants as well as the contributions of the individual components, paving the way for further potential exploitation. Besides complexity analysis, input sensitivity and safety challenges are also thoroughly discussed for comprehensiveness, with novel methods.
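
The periodicity-recognition step can be illustrated with an off-the-shelf seasonal-trend decomposition from statsmodels; the weekly period and the synthetic event-count series below are assumptions, not the paper's data or pipeline.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
t = np.arange(365)
# Synthetic daily event counts with a weekly cycle plus noise.
counts = 10 + 3 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 0.5, 365)
series = pd.Series(counts, index=pd.date_range("2023-01-01", periods=365))

result = seasonal_decompose(series, period=7)  # trend + seasonal + residual
print(result.seasonal.head(7))                 # recovered weekly pattern
```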

Keywords: temporal knowledge graph completion, convolution, transformer, Hawkes process, periodicity

Procedia PDF Downloads 66
1082 Non-Targeted Adversarial Image Classification Attack: Region Modification Methods

Authors: Bandar Alahmadi, Lethia Jackson

Abstract:

Machine learning models are used today in many real-life applications. The safety and security of such models are important so that their results remain as accurate as possible. One challenge to machine learning model security is the adversarial examples attack. Adversarial examples are inputs designed by an attacker to cause a machine learning model to misclassify them. We propose a method to generate adversarial examples to attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back into the original image, and then run it through the classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the detected region's matrix. Then, it performs a rotation attack, rotating the detected region around its axes and embedding the trace of the image in the image background. Finally, the attacked region is placed back in its original position, from where it was removed, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm is efficient: the classifier's confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm worked successfully.
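
A schematic version of the rotation step on a detected region, using SciPy; the region coordinates and rotation angle are illustrative, and the abstract-matrix step and seam smoothing described above are omitted for brevity.

```python
import numpy as np
from scipy.ndimage import rotate

image = np.random.rand(224, 224, 3)   # stand-in for a correctly classified image
y0, y1, x0, x1 = 60, 160, 70, 170     # assumed detected important region

region = image[y0:y1, x0:x1]
# Rotate the region in the image plane without changing its shape.
attacked = rotate(region, angle=25, axes=(0, 1), reshape=False, mode="nearest")
image[y0:y1, x0:x1] = attacked        # place the region back in its original spot
# A light blur along the seam would mimic the paper's smoothing filter.
```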

Keywords: adversarial examples, attack, computer vision, image processing

Procedia PDF Downloads 329