Search results for: Model Order reduction


8261 A TFETI Domain Decomposition Solver for Von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening

Authors: Martin Cermak, Stanislav Sysala

Abstract:

In this paper we present an efficient parallel implementation of elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to solve the nonlinear problem in parallel on supercomputers, decreasing the solution time and making it possible to handle problems with millions of DOFs. We consider an associated elastoplastic model with the von Mises plastic criterion and a combined linear isotropic-kinematic hardening law. The model is discretized by the implicit Euler method in time and by the finite element method in space, leading to a system of nonlinear equations with a strongly semismooth and strongly monotone operator. The semismooth Newton method is applied to solve this nonlinear system, and the corresponding linearized problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI method. The implementation is realized in our in-house MatSol package developed in MATLAB.
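
To show the shape of the nonlinear solve, the sketch below implements a generic semismooth Newton outer loop on a toy nonsmooth equation. It is an illustration only: in the paper each linearised step is solved in parallel by TFETI rather than by the dense solve used here.

```python
import numpy as np

def semismooth_newton(F, J, u0, tol=1e-8, max_it=50):
    """F(u): nonlinear residual; J(u): an element of the generalised Jacobian."""
    u = u0.copy()
    for _ in range(max_it):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        du = np.linalg.solve(J(u), -r)   # in the paper, this step is the parallel TFETI solve
        u += du
    return u

# Toy nonsmooth test problem: F(u) = u + max(u, 0) - 1, with solution u = 0.5.
F = lambda u: u + np.maximum(u, 0.0) - 1.0
J = lambda u: np.diag(1.0 + (u > 0).astype(float))
print(semismooth_newton(F, J, np.array([2.0])))
```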

Keywords: Isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution.

8260 Chitosan/Casein Microparticles: Preparation, Characterization and Drug Release Studies

Authors: Selvakumar Dhanasingh, Shunmuga Kumar Nallaperumal

Abstract:

Microparticle carrier systems made from naturally occurring polymers based on the chitosan/casein system appear to be promising carriers for the sustained release of orally and parenterally administered drugs. In the current study we followed a microencapsulation technique based on an aqueous coacervation method to prepare chitosan/casein microparticles of compositions 1:1, 1:2 and 1:5 incorporated with chloramphenicol. Glutaraldehyde was used as a chemical cross-linking agent. The microparticles were prepared by the aerosol method and studied by optical microscopy, infrared spectroscopy, thermogravimetric analysis, swelling studies and drug release studies at various pH values. The percentage swelling of the polymers is found to be in the order pH 4 > pH 10 > pH 7, and increasing the casein composition decreases the swelling percentage. The drug release studies also follow the above order.

Keywords: Chitosan/casein microparticles, chloramphenicol, drug release, microencapsulation.

8259 SOA-Based Mobile Application for Crime Control in Thailand

Authors: Jintana Khemprasit, Vatcharaporn Esichaikul

Abstract:

Crime is a major societal problem for most of the world's nations. Consequently, the police need to develop new methods to improve their efficiency in dealing with ever-increasing crime rates. Two of the common difficulties that the police face in crime control are crime investigation and the provision of crime information to the general public to help them protect themselves. Crime control in police operations involves the use of spatial data, crime data and related data from different organizations (depending on the nature of the analysis to be made). These types of data are collected from several heterogeneous sources in different formats and on different platforms, resulting in a lack of standardization. Moreover, there is no standard framework for crime data collection, integration and dissemination through mobile devices. An investigation into the current situation in crime control was carried out to identify the needs that must be met to resolve these issues. This paper proposes and investigates the use of service-oriented architecture (SOA) and mobile spatial information services in crime control. SOA plays an important role in crime control as an appropriate way to support data exchange and model sharing from heterogeneous sources. Crime control also needs to facilitate mobile spatial information services in order to exchange, receive, share and release location-based information to mobile users anytime and anywhere.

Keywords: Crime Control, Geographic Information System (GIS), Mobile GIS, Service Oriented Architecture (SOA).

8258 Vibration Attenuation Using Functionally Graded Material

Authors: Saeed Asiri, Hassan Hedia, Wael Eissa

Abstract:

The aim of the work was to attenuate the vibration amplitude in a Cessna 172 airplane wing by using functionally graded material (FGM) instead of a uniform or composite material. Wing strength was assessed by means of a stress analysis study, while wing vibration amplitudes and mode shapes were obtained by means of modal and harmonic analysis. The methodology was first verified on a simple cantilever plate; the results were promising, and the same methodology was then applied to the airplane wing model. Results for aluminum models, titanium models, and functionally graded aluminum-titanium models were compared, showing a strong vibration attenuation when the FGM was used. Optimization of the FGM gradation satisfied our objective of reducing and attenuating the vibration amplitudes and demonstrated the effect of using FGM on vibration behavior; testing aluminum-rich models and comparing them with titanium-rich models formed part of this optimization. Results show a significant attenuation in vibration magnitudes when using an FGM plate instead of a titanium plate, and an aluminum wing with FGM spars instead of an all-aluminum wing. It is also recommended that, in future work, the model scale be changed to 1:10 or even 1:1 when computing capabilities allow.

Keywords: Vibration, Attenuation, FGM, ANSYS2011, FEM.

8257 Lattice Boltzmann Simulation of the Carbonization of Wood Particle

Authors: Ahmed Mahmoudi, Imen Mejri, Mohamed A. Abbassi, Ahmed Omri

Abstract:

A numerical study based on the Lattice Boltzmann Method (LBM) is proposed to solve one-, two- and three-dimensional heat and mass transfer for the isothermal carbonization of thick wood particles. To check the validity of the proposed model, computational results have been compared with published data and good agreement is obtained. The model is then used to study the effect of reactor temperature and thermal boundary conditions on the evolution of the local temperature and mass distributions of the wood particle during carbonization.
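
As a minimal illustration of the thermal LBM building block used in such a study, the sketch below implements a one-dimensional D1Q2 lattice Boltzmann scheme for transient heat conduction. Parameters and boundary values are assumed for the example; the paper's model additionally couples mass transfer and pyrolysis kinetics, which are omitted here.

```python
import numpy as np

nx, nsteps = 100, 2000
tau = 0.8                       # relaxation time (sets the thermal diffusivity)
w = np.array([0.5, 0.5])        # D1Q2 lattice weights for velocities +1 and -1
T = np.zeros(nx)                # initial temperature field inside the particle
T[0] = 1.0                      # hot reactor wall on the left (illustrative value)
f = w[:, None] * T[None, :]     # start from the local equilibrium

for _ in range(nsteps):
    feq = w[:, None] * T[None, :]
    f += (feq - f) / tau        # BGK collision step
    f[0] = np.roll(f[0], 1)     # stream the +x population
    f[1] = np.roll(f[1], -1)    # stream the -x population
    f[:, 0] = w * 1.0           # re-impose the Dirichlet boundary (hot wall)
    f[:, -1] = w * 0.0          # cold boundary on the right
    T = f.sum(axis=0)           # macroscopic temperature
```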

Keywords: Lattice Boltzmann Method, pyrolysis conduction, carbonization, Heat and mass transfer.

8256 Aspect-Level Sentiment Analysis with Multi-Channel and Graph Convolutional Networks

Authors: Jiajun Wang, Xiaoge Li

Abstract:

The purpose of the aspect-level sentiment analysis task is to identify the sentiment polarity of aspects in a sentence. Currently, most methods mainly focus on using neural networks and attention mechanisms to model the relationship between aspects and context, but they ignore the dependence of words in different ranges of the sentence, resulting in deviations when assigning relationship weights to words other than the aspect words. To solve these problems, we propose an aspect-level sentiment analysis model that combines a multi-channel convolutional network and a graph convolutional network (GCN). Firstly, the context and the degree of association between words are characterized by Long Short-Term Memory (LSTM) and a self-attention mechanism. In addition, a multi-channel convolutional network is used to extract the features of words in different ranges. Finally, a graph convolutional network is used to aggregate the node information of the dependency tree structure. We conduct experiments on four benchmark datasets. The experimental results are compared with those of other models, showing that our model is more effective.
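
A rough sketch of the kind of architecture described above is given below: a BiLSTM encoder, multi-kernel (multi-channel) convolutions, and one GCN layer over the dependency-tree adjacency matrix. The class name, dimensions, and fusion strategy (max-pooling the CNN channels, mean-pooling the GCN output) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiChannelGCNClassifier(nn.Module):
    def __init__(self, vocab, emb=300, hid=128, polarities=3, kernels=(2, 3, 4)):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hid, batch_first=True, bidirectional=True)
        # multi-channel convolutions: one channel per kernel size / context range
        self.convs = nn.ModuleList(
            [nn.Conv1d(2 * hid, hid, k, padding=k // 2) for k in kernels])
        self.gcn = nn.Linear(2 * hid, 2 * hid)      # one GCN layer: A @ H @ W
        self.out = nn.Linear(len(kernels) * hid + 2 * hid, polarities)

    def forward(self, tokens, adj):
        h, _ = self.lstm(self.embed(tokens))        # (B, L, 2*hid) contextual encoding
        c = torch.cat([F.relu(conv(h.transpose(1, 2))).max(dim=2).values
                       for conv in self.convs], dim=1)          # multi-channel features
        g = F.relu(self.gcn(torch.bmm(adj, h))).mean(dim=1)     # dependency-tree GCN
        return self.out(torch.cat([c, g], dim=1))               # polarity logits
```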

Keywords: Aspect-level sentiment analysis, attention, multi-channel convolution network, graph convolution network, dependency tree.

8255 Geometrically Non-Linear Axisymmetric Free Vibration Analysis of Functionally Graded Annular Plates

Authors: Boutahar Lhoucine, El Bikri Khalid, Benamar Rhali

Abstract:

In this paper, the non-linear free axisymmetric vibration of a thin annular plate made of functionally graded material (FGM) has been studied by using the energy method and a multimode approach. The FGM properties vary continuously and non-homogeneously through the thickness direction of the plate. The theoretical model is based on the classical plate theory and the Von Kármán geometrical non-linearity assumptions. An approximation has been adopted in the present work consisting of neglecting the in-plane deformation in the formulation. Hamilton’s principle is used to derive the governing equation of motion. The problem is solved by a numerical iterative procedure in order to obtain more accurate results for vibration amplitudes up to 1.5 times the plate thickness. The numerical results are given for the first axisymmetric non-linear mode shape for a wide range of vibration amplitudes, and they are presented in tabular or graphical form to show that the vibration amplitude and the variation in material properties have significant effects on the frequencies and the bending stresses in large-amplitude vibration of the functionally graded annular plate.
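
As a point of reference, FGM properties of this kind are often described by a power-law gradation through the thickness; a typical form (given here for illustration only, the paper may use a different mixing law) is

```latex
P(z) \;=\; \left(P_c - P_m\right)\left(\frac{z}{h} + \frac{1}{2}\right)^{k} + P_m,
\qquad -\frac{h}{2} \le z \le \frac{h}{2},
```

where P_c and P_m are the ceramic and metal properties, h is the plate thickness and k is the volume-fraction exponent.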

Keywords: Non-linear vibrations, Annular plates, Large amplitudes, FGM.

8254 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology to model many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance differ from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of several example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect their performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
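
To make the irregular-access character of the workload concrete, the minimal level-synchronous BFS kernel below (a sketch, not the benchmark's reference code) spends essentially all its work on unpredictable neighbour lookups rather than arithmetic, which is why memory access patterns dominate Graph500 performance.

```python
from collections import deque

def bfs(adj, root):
    """adj: dict of vertex -> list of neighbours; returns the BFS parent map."""
    parent = {root: root}
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:              # irregular, data-dependent memory accesses
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return parent

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(bfs(adj, 0))
```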

Keywords: Graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization.

8253 Utilizing Dutch Auction in an Agent-based Model E-commerce System

Authors: Costin Badica, Maria Ganzha, Maciej Gawinecki, Pawel Kobzdej, Marcin Paprzycki

Abstract:

Recently, we have presented an initial implementation of a model agent-based e-commerce system, which utilized a simple price negotiation mechanism, the English Auction. In this note we discuss how a Dutch Auction involving multiple units of a product can be included in our system. We present UML diagrams of the agents involved in price negotiations and briefly discuss the rule-based mechanism exemplifying the Dutch Auction.
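
For illustration, a toy multi-unit Dutch auction round can be written as below. The allocation rules here are hypothetical; the paper's agents follow the VMTS rule-based negotiation mechanism rather than this simplified loop.

```python
def dutch_auction(start_price, floor_price, decrement, units, buyers):
    """buyers: dict of agent name -> (reservation_price, quantity_wanted)."""
    price, allocation = start_price, {}
    while price >= floor_price and units > 0:
        for name, (limit, wanted) in buyers.items():
            if name not in allocation and limit >= price:
                take = min(wanted, units)          # allocate at the first acceptable price
                allocation[name] = (take, price)
                units -= take
        price -= decrement                         # seller lowers the asking price
    return allocation

print(dutch_auction(100, 50, 5, units=10,
                    buyers={"A": (90, 4), "B": (70, 8), "C": (55, 3)}))
```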

Keywords: e-commerce, rule-based price negotiation mechanism, Dutch Auction, agent system.

8252 Dynamical Analysis of a Harvesting Model of Phytoplankton-Zooplankton Interaction

Authors: Anuj K. Sharma, Amit Sharma, Kulbhushan Agnihotri

Abstract:

In this work, we propose and analyze a model of phytoplankton-zooplankton interaction with harvesting, considering that some species are exploited commercially for food. Criteria for local stability, instability and global stability are derived, and some threshold harvesting levels are explored to maintain the population at an appropriate equilibrium level even if the species are exploited continuously. Further, the biological and bionomic equilibria of the system are obtained, and an optimal harvesting policy is analysed using Pontryagin's Maximum Principle. Finally, the analytical findings are supported by numerical simulations.
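
For orientation, a generic harvested phytoplankton-zooplankton system takes a form such as the following (a schematic example only; the paper's functional responses and harvesting terms may differ):

```latex
\frac{dP}{dt} = rP\left(1 - \frac{P}{K}\right) - \beta P Z - q_1 E_1 P,
\qquad
\frac{dZ}{dt} = \theta \beta P Z - dZ - q_2 E_2 Z,
```

where P and Z are the phytoplankton and zooplankton densities, E_1 and E_2 the harvesting efforts and q_1, q_2 the catchability coefficients.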

Keywords: Phytoplankton-Zooplankton, Global stability, Bionomic equilibrium, Pontryagin's Maximum Principle.

8251 Evaluation of the Dynamic Behavior of a Machine Tool Spindle System through Modal and Unbalance Response Analysis

Authors: Khairul Jauhari, Achmad Widodo, Ismoyo Haryanto

Abstract:

The spindle system is one of the most important components of a machine tool. The dynamic properties of the spindle affect the machining productivity and the quality of the workpieces. Thus, it is important to determine the dynamic characteristics of spindles during design and development in order to avoid forced resonance. The finite element method (FEM) has been adopted in order to obtain the dynamic behavior of the spindle system. For this purpose, obtaining the Campbell diagrams and determining the critical speeds are very useful in evaluating the spindle system dynamics. The unbalance response of the system to the center-of-mass unbalance at the cutting tool is also calculated to investigate the dynamic behavior. In this paper, an ANSYS Parametric Design Language (APDL) program based on the finite element method has been implemented to carry out the full dynamic analysis and the evaluation of the results. Results show that the calculated critical speeds are far from the operating speed range of the spindle; thus, the spindle would not experience resonance, and the maximum unbalance response at the operating speed is still within acceptable limits. ANSYS Parametric Design Language (APDL) can be used by spindle designers as a tool to increase product quality and to reduce cost and time in the design and development stages.
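
For reference, the center-of-mass unbalance at the cutting tool is commonly modelled as a rotating force whose magnitude grows with the square of the spindle speed (a standard textbook form, not necessarily the exact excitation used in the paper):

```latex
F_u(t) \;=\; m_u\, e\, \omega^2
\begin{pmatrix} \cos \omega t \\ \sin \omega t \end{pmatrix},
```

where m_u is the unbalance mass, e its eccentricity and ω the spindle angular speed; resonance is avoided when the operating speed stays well away from the critical speeds read off the Campbell diagram.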

Keywords: ANSYS Parametric Design Language (APDL), Campbell diagram, critical speeds, unbalance response, spindle system.

8250 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and Validation of the Simulated Process Model is the most important phase of the simulator life cycle. Evaluation of simulated process models based on Verification and Validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady state and transient conditions. The process of Verification and Validation helps in qualifying the process simulator for the intended purpose, whether it is for providing comprehensive training or for design verification. In general, model verification is carried out by comparison of simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A Full Scope Replica Operator Training Simulator for the PFBR (Prototype Fast Breeder Reactor), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, INDIA, wherein the main participants are engineers/experts belonging to the Modeling Team and the Process Design and Instrumentation & Control design teams. This paper discusses the Verification and Validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), Steady State, Transient State.

8249 Knowledge Acquisition and Client Organisations: Case Study of a Student as Producer

Authors: Barry Ardley, Abi Hunt, Nick Taylor

Abstract:

As a theoretical and practical framework this study uses the student as producer approach to learning in higher education, as adopted by the Lincoln International Business School, University of Lincoln, UK. Student as producer positions learners as skilled and capable agents, able to participate as partners with tutors in live research projects. To illuminate the nature of this approach to learning and to highlight its critical issues, the authors report on two guided student consultancy projects. These were set up with the assistance of two local organisations in the city of Lincoln UK. Using the student as producer model to deliver the projects enabled learners to acquire and develop a range of key skills and knowledge, not easily accessible in more traditional educational settings. This paper presents a systematic case study analysis of the eight organising principles of the student as producer model, as adopted by university tutors. The experience of tutors implementing student as producer suggests that the model can be widely applied to benefit not only the learning and teaching experiences of higher education students, and staff, but additionally, a university’s research programme and its community partners.

Keywords: Experiential learning, consultancy clients, student as producer.

8248 Semantically Enriched Web Usage Mining for Personalization

Authors: Suresh Shirgave, Prakash Kulkarni, José Borges

Abstract:

The continuous growth in the size of the World Wide Web has resulted in intricate Web sites, demanding enhanced user skills and more sophisticated tools to help the Web user find the desired information. In order to make the Web more user-friendly, it is necessary to provide personalized services and recommendations to the Web user. Many Web usage mining techniques have been applied to discover interesting and frequent navigation patterns from Web server logs. The recommendation accuracy of usage-based techniques can be improved by integrating Web site content and site structure in the personalization process.

Herein, we propose a semantically enriched Web Usage Mining method for Personalization (SWUMP), an extension to the solely usage-based technique. This approach combines the fields of Web Usage Mining and the Semantic Web. In the proposed method, we enrich the undirected graph derived from usage data with rich semantic information extracted from the Web pages and the Web site structure. The experimental results show that SWUMP generates accurate recommendations and achieves 10-20% better accuracy than the solely usage-based model. SWUMP also addresses the new-item problem inherent to solely usage-based techniques.

Keywords: Prediction, Recommendation, Semantic Web Usage Mining, Web Usage Mining.

8247 A New Suburb Renovation Concept

Authors: A. Soikkeli, L. Sorri

Abstract:

The Finnish national research project User- and Business-oriented Suburb Renovation Concept (KLIKK) was started in January 2012 and will end in June 2014. The perspective of energy efficiency is emphasised in the project, but it also addresses what improving the energy efficiency of suburban apartment buildings means from the standpoint of architecturally valuable buildings representing different periods. The project will also test the impacts of stricter energy efficiency requirements on renovation projects.

The primary goal of the project is to develop a user-oriented, industrial, economic renovation concept for suburban apartment building renovation, extension and construction of additional storeys. The concept will make it possible to change from performance- and cost-based operation to novel service- and user-oriented, site-specifically tailored renovation methods utilizing integrated order and delivery chains.

The present project is collaborating with the Ministry of the Environment and participating cities in developing a new type of lighter town planning model for suburban renovations and in-fill construction. To support this, the project will simultaneously develop practices for environmental impact assessment tools in renovation and suburban supplementary and in-fill construction.


Keywords: Energy efficiency, Prefabrication, Renovation concept, Suburbs, Sustainability, User-Orientated.

8246 Development of an Autonomous Friction Gripper for Industrial Robots

Authors: Majid Tolouei-Rad, Peter Kalivitis

Abstract:

Industrial robots become useless without end-effectors, which in many instances take the form of friction grippers. Commonly, friction grippers apply frictional forces to different objects on the basis of programmers' experience. This puts a limitation on the effectiveness of the gripping force and may result in damage to the object. This paper describes the various stages of the design and development of a low-cost, sensor-based robotic gripper that facilitates the task of applying the right gripping force to different objects. The gripper is also equipped with range sensors in order to avoid collisions of the gripper with objects. It is a fully functional automated pick-and-place gripper which can be used in many industrial applications, and it can also be altered or further developed to suit a larger number of industrial activities. The current design could lead to completely automated robot grippers able to improve the efficiency and productivity of industrial robots.

Keywords: Control system, end-effector, robot, sensor.

8245 Multi-Agent Based Modeling Using Multi-Criteria Decision Analysis and OLAP System for Decision Support Problems

Authors: Omar Boutkhoum, Mohamed Hanine, Tarik Agouti, Abdessadek Tikniouine

Abstract:

This paper discusses the benefits of combining multi-criteria decision analysis (MCDA) with OLAP systems to generate an integrated analysis process for complex multi-criteria decision-making situations. In this context, a multi-agent modeling approach is presented for decision support systems, combining MCDA with OLAP systems. The proposed modeling, which comprises the multi-agent system (MAS) architecture, the procedure and the protocol of the negotiation model, is elaborated as a decision support tool for complex decision-making environments. Our objective is to take advantage of the multi-agent system, which distributes resources and computational capabilities across interconnected agents, and to provide a problem modeling in terms of autonomous interacting component agents. Thus, the identification and evaluation of criteria, as well as the evaluation and ranking of alternatives in a decision support situation, are performed by organizing tasks and user preferences between different agents in order to reach the right decision. Finally, an illustrative example is conducted to demonstrate the function and effectiveness of our MAS modeling.

Keywords: Multidimensional Analysis, OLAP Analysis, Multi-criteria Decision Analysis, Multi-Agent System, Decision Support System.

8244 Automatic Distance Compensation for Robust Voice-based Human-Computer Interaction

Authors: Randy Gomez, Keisuke Nakamura, Kazuhiro Nakadai

Abstract:

Distant-talking voice-based HCI systems suffer from performance degradation due to the mismatch between the acoustic speech (runtime) and the acoustic model (training). The mismatch is caused by the change in the power of the speech signal as observed at the microphones. This change is greatly influenced by the change in distance, which affects the speech dynamics inside the room before the signal reaches the microphones. Moreover, as the speech signal is reflected, its acoustic characteristics are also altered by the room properties. In general, power mismatch due to distance is a complex problem. This paper presents a novel approach to dealing with distance-induced mismatch by intelligently sensing instantaneous voice power variation and compensating the model parameters. First, the distant-talking speech signal is processed through microphone array processing, and the corresponding distance information is extracted. Distance-sensitive Gaussian Mixture Models (GMMs), pre-trained to capture both speech power and room properties, are used to predict the optimal distance of the speech source. Consequently, pre-computed statistical priors corresponding to the optimal distance are selected to correct the statistics of the generic model, which was frozen during training. Thus, the model statistics are post-conditioned to match the power of the instantaneous speech acoustics at runtime. This results in an improved likelihood of predicting the correct speech command at farther distances. We experiment using real data recorded inside two rooms. Experimental evaluation shows that voice recognition performance using our method is more robust to the change in distance compared to the conventional approach. In our experiment, under the most acoustically challenging environment (i.e., Room 2 at 2.5 meters), our method achieved a 24.2% improvement in recognition performance over the best-performing conventional method.
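
The distance-selection step can be pictured with the following sketch, which picks the most likely distance from a set of distance-sensitive GMMs and applies the matching pre-computed offset. The library calls are standard scikit-learn; the distance grid, training data and compensation priors are stand-ins, not the authors' values.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

distances = [0.5, 1.5, 2.5]                        # metres (hypothetical grid)
gmms = {d: GaussianMixture(n_components=4).fit(np.random.randn(200, 13))
        for d in distances}                         # stand-in training data per distance
compensation = {d: np.full(13, 0.1 * d) for d in distances}   # hypothetical priors

def compensate(features, model_means):
    """features: (frames, 13) runtime observation; model_means: generic acoustic-model means."""
    best = max(distances, key=lambda d: gmms[d].score(features))  # most likely source distance
    return model_means + compensation[best]        # post-condition the frozen model statistics
```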

Keywords: Human Machine Interaction, Human Computer Interaction, Voice Recognition, Acoustic Model Compensation, Acoustic Speech Enhancement.

8243 Simultaneous Optimization of Machining Parameters and Tool Geometry Specifications in Turning Operation of AISI1045 Steel

Authors: Farhad Kolahan, Mohsen Manoochehri, Abbas Hosseini

Abstract:

Machining is an important manufacturing process used to produce a wide variety of metallic parts. Among the various machining processes, turning is one of the most important; it is employed to shape cylindrical parts. In turning, the quality of the finished product is measured in terms of surface roughness. In turn, surface quality is determined by the machining parameters and tool geometry specifications. The main objective of this study is to simultaneously model and optimize the machining parameters and tool geometry in order to improve the surface roughness for AISI1045 steel. Several levels of machining parameters and tool geometry specifications are considered as input parameters. The surface roughness is selected as the output measure of process performance. A Taguchi approach is employed to gather experimental data. Then, based on the signal-to-noise (S/N) ratio, the best sets of cutting parameters and tool geometry specifications are determined. Using these parameter values, the surface roughness of AISI1045 steel parts may be minimized. Experimental results are provided to illustrate the effectiveness of the proposed approach.
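
Since surface roughness is a smaller-the-better characteristic, the S/N ratio conventionally used in such a Taguchi analysis is (quoted here for illustration)

```latex
S/N \;=\; -10 \,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n} y_i^{2}\right),
```

where y_i are the measured roughness values for a trial and n is the number of replications; the parameter levels that maximize S/N minimize the roughness.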

Keywords: Taguchi method, turning parameters, tool geometry specifications, S/N ratio, statistical analysis.

8242 Photonic Crystal Waveguide 1x3 Flexible Power Splitter for Optical Network

Authors: Jyothi Digge, B. U. Rindhe, S. K. Narayankhedkar

Abstract:

A compact 1x3 power splitter based on Photonic Crystal Waveguides (PCW) with a flexible power splitting ratio is presented in this paper. A multimode interference coupler (MMI) is integrated with the PCW. The reduction in device size compared with the conventional MMI power splitter is attributed to the large dispersion of the PCW. The Band Solve tool is used to calculate the band structure of the PCW. The Finite Difference Time Domain (FDTD) method is adopted to simulate the relevant structure at a 1550 nm wavelength. The device is polarization insensitive and allows the control of output powers within certain percentage points for both polarizations.

Keywords: Dispersion, MMI Coupler, Photonic Bandgap, Power Splitter.

8241 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory

Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock

Abstract:

Detecting subjectively biased statements is a vital task. This is because this kind of bias, when present in text or other forms of information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks such as sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will significantly boost research in the above-mentioned areas. It can also come in handy for platforms like Wikipedia, where the use of neutral language is important. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as an upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporating an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), a benchmark dataset compiled from Wikipedia edits that removed various biased instances from sentences, and we compare our model to existing approaches. Experimental analysis indicates improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
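
A compact sketch of the described architecture is shown below: BERT supplies contextual embeddings, a BiLSTM re-encodes them, and an additive attention layer pools the sequence before a two-class output. This is illustrative code, not the authors' implementation; hyperparameters and the pooling choice are assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class BertBiLSTMAttention(nn.Module):
    def __init__(self, hid=256):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hid,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hid, 1)   # additive attention scorer
        self.cls = nn.Linear(2 * hid, 2)    # biased vs. neutral

    def forward(self, input_ids, attention_mask):
        emb = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(emb)                               # (B, L, 2*hid)
        scores = self.attn(h).squeeze(-1)                   # (B, L) token scores
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        alpha = torch.softmax(scores, dim=1).unsqueeze(-1)  # attention weights
        return self.cls((alpha * h).sum(dim=1))             # sentence-level logits
```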

Keywords: Subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing.

8240 Robust Batch Process Scheduling in Pharmaceutical Industries: A Case Study

Authors: Tommaso Adamo, Gianpaolo Ghiani, Antonio D. Grieco, Emanuela Guerriero

Abstract:

Batch production plants give rise to a wide range of scheduling problems. In pharmaceutical industries a batch process is usually described by a recipe, consisting of an ordering of tasks to produce the desired product. In this research work we focused on pharmaceutical production processes requiring the culture of a microorganism population (i.e., bacteria, yeasts or antibiotics). Several sources of uncertainty may influence the yield of the culture processes, including (i) low performance and quality of the cultured microorganism population or (ii) microbial contamination. For these reasons, robustness is a valuable property for the considered application context. In particular, a robust schedule will not collapse immediately when a cell of microorganisms has to be thrown away due to microbial contamination. Indeed, a robust schedule should change locally and in small proportions, and the overall performance measure (e.g., makespan, lateness) should change little, if at all. In this research work we formulated a constraint programming optimization (COP) model for the robust planning of antibiotics production. We developed a discrete-time model with a multi-criteria objective, ordering the different criteria and performing a lexicographic optimization. A feasible solution of the proposed COP model is a schedule of a given set of tasks onto the available resources. The schedule has to satisfy task precedence constraints, resource capacity constraints and time constraints. In particular, the time constraints model task due dates and resource availability time windows. To improve the schedule robustness, we modeled the concept of (a, b) super-solutions, where a and b are input parameters of the COP model. An (a, b) super-solution is one in which, if a variables (i.e., the completion times of a culture tasks) lose their values (i.e., the cultures are contaminated), the solution can be repaired by assigning new values to these variables (i.e., the completion times of backup culture tasks) and changing at most b other variables (i.e., delaying the completion of at most b other tasks). The efficiency and applicability of the proposed model are demonstrated by solving instances taken from a real-life pharmaceutical company. Computational results showed that the determined super-solutions are near-optimal.
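
The (a, b) super-solution idea can be illustrated for a = 1 with the toy check below, which verifies that losing any single culture task can be repaired by its backup while changing the completion times of at most b other tasks. This is schematic only; the paper encodes the property inside the constraint programming model rather than as a post-hoc check, and the repair function here is a placeholder.

```python
def is_one_b_supersolution(schedule, backups, repair, b):
    """schedule: {task: completion_time}; backups: {task: backup_task};
    repair(schedule, task, backup) -> new schedule dict after substituting the backup."""
    for task, backup in backups.items():
        new = repair(schedule, task, backup)
        # count how many *other* tasks had their completion time changed by the repair
        changed = sum(1 for t in schedule
                      if t != task and new.get(t) != schedule[t])
        if changed > b:
            return False
    return True
```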

Keywords: Constraint programming, super-solutions, robust scheduling, batch process, pharmaceutical industries.

8239 Relationships between Social Entrepreneurship, CSR and Social Innovation: In Theory and Practice

Authors: Krisztina Szegedi, Gyula Fülöp, Ádám Bereczk

Abstract:

The shared goal of social entrepreneurship, corporate social responsibility and social innovation is the advancement of society. The business model of social enterprises is characterized by unique strategies based on the competencies of the entrepreneurs, and is not aimed primarily at the maximization of profits, but rather at carrying out goals for the benefit of society. Corporate social responsibility refers to the active behavior of a company, by which it can create new solutions to meet the needs of society, either on its own or in cooperation with other social stakeholders. The objectives of this article are to define concepts, describe and integrate relevant theoretical models, develop a model and introduce some examples of international practice that can inspire initiatives for social development.

Keywords: Corporate social responsibility, CSR, social innovation, social entrepreneurship.

8238 A Visual Control Flow Language and Its Termination Properties

Authors: László Lengyel, Tihamér Levendovszky, Hassan Charaf

Abstract:

This paper presents the visual control flow support of Visual Modeling and Transformation System (VMTS), which facilitates composing complex model transformations out of simple transformation steps and executing them. The VMTS Visual Control Flow Language (VCFL) uses stereotyped activity diagrams to specify control flow structures and OCL constraints to choose between different control flow branches. This work discusses the termination properties of VCFL and provides an algorithm to support the termination analysis of VCFL transformations.

Keywords: Control Flow, Metamodel-Based Visual Model Transformation, OCL, Termination Properties, UML.

8237 Radiowave Propagation in Picocellular Environment Using 2.5D Ray Tracing Technique

Authors: Fathi Alwafie

Abstract:

This paper presents a ray tracing simulation technique for characterizing radiowave propagation inside buildings. We describe the implementation of an algorithm capable of enumerating a large number of propagation paths in interactive time for the special case of 2.5D. The effective dielectric constants of the building structure used in the simulations are indicated. The study describes an efficient 2.5D ray tracing model, whose results are compared with those of a 3D model. The first investigations show that the indoor wave environment changes significantly as the electric parameters of the construction materials are changed. A detailed analysis of the dependence of the indoor wave on the wideband characteristics of the channel, namely the root mean square (RMS) delay spread and the mean excess delay, is also presented.

Keywords: Picocellular, propagation, ray tracing.

8236 Effect of Delay on Supply Side on Market Behavior: A System Dynamic Approach

Authors: M. Khoshab, M. J. Sedigh

Abstract:

Dynamic systems, which from a mathematical point of view are those governed by differential equations, are much more difficult to study, and their behavior harder to predict, than static systems, which are governed by algebraic equations. Economic systems such as markets are among the more complicated dynamic systems. This paper adopts a very simple mathematical model of a market and studies the effect of the supply and demand functions on the behavior of the market when the supply side experiences a lag due to production restrictions.
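
A minimal cobweb-style sketch of such a lagged market is given below: supply commits to last period's price, demand responds to the current price, and the ratio of the supply and demand slopes decides whether the resulting price oscillations damp out. The parameters are illustrative assumptions, not the paper's model.

```python
import numpy as np

a_d, b_d = 100.0, 1.5      # demand:  Qd = a_d - b_d * p
a_s, b_s = 10.0, 1.2       # supply:  Qs = a_s + b_s * p_lagged (one-period production lag)
p = 20.0                   # initial price
prices = [p]
for _ in range(30):
    q_supplied = a_s + b_s * p            # producers commit using the previous price
    p = (a_d - q_supplied) / b_d          # price at which demand absorbs that quantity
    prices.append(p)

# |b_s / b_d| < 1: the oscillations damp out; > 1: the lag destabilises the market.
print(np.round(prices, 2))
```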

Keywords: Dynamic System, Lag on Supply Demand, Market Stability, Supply Demand Model.

8235 Detection of Ultrasonic Images in the Presence of a Random Number of Scatterers: A Statistical Learning Approach

Authors: J. P. Dubois, O. M. Abdul-Latif

Abstract:

Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended to the more general concept of structural risk minimization (SRM). SVM is playing an increasing role in applications to detection problems in various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of and insight into the proposed SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
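
As a rough illustration of an SVM-based speckle detector and the MSE metric, the sketch below trains a support vector regressor on patches of a synthetically speckled image and scores the restored image. Synthetic data and scikit-learn's SVR stand in for the authors' LS-SVM formulation and clinical images.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
clean = rng.uniform(0.2, 1.0, size=(64, 64))                          # toy reflectivity map
speckled = clean * rng.gamma(shape=4, scale=0.25, size=clean.shape)   # multi-look-like speckle

def patches(img, k=3):
    """Extract k x k neighbourhoods around every pixel as feature vectors."""
    p = np.pad(img, k // 2, mode="edge")
    return np.array([p[i:i + k, j:j + k].ravel()
                     for i in range(img.shape[0]) for j in range(img.shape[1])])

X, y = patches(speckled), clean.ravel()
svm = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[::4], y[::4])     # subsampled training set
restored = svm.predict(X).reshape(clean.shape)
print("MSE:", mean_squared_error(clean.ravel(), restored.ravel()))
```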

Keywords: LS-SVM, medical ultrasound imaging, partially developed speckle, multi-look model.

8234 SVM-Based Detection of SAR Images in Partially Developed Speckle Noise

Authors: J. P. Dubois, O. M. Abdul-Latif

Abstract:

Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended to the more general concept of structural risk minimization (SRM). SVM is playing an increasing role in applications to detection problems in various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of SAR (synthetic aperture radar) images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of and insight into the proposed SVM-based detector. The structure of the SVM was derived and applied to real SAR images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected SAR images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.

Keywords: Least Squares Support Vector Machine, Synthetic Aperture Radar, Partially Developed Speckle, Multi-Look Model.

8233 Developing Improvements to Multi-Hazard Risk Assessments

Authors: A. Fathianpour, M. B. Jelodar, S. Wilkinson

Abstract:

This paper outlines the approaches taken to multi-hazard risk assessment. There is currently confusion in assessing multi-hazard impacts, so this study aims to determine which of the available options are the most useful. The paper uses an international literature search, an analysis of current multi-hazard assessments, and a case study to illustrate the effectiveness of the chosen method. Findings from this study will help those wanting to assess multi-hazards to follow a straightforward approach. The paper is significant as it helps to interpret the various approaches and concludes with the preferred method. Many people in the world live in hazardous environments and are susceptible to disasters. Unfortunately, when a disaster strikes it is often compounded by additional cascading hazards, so people may confront more than one hazard simultaneously. Hazards include natural hazards (earthquakes, floods, etc.) and cascading human-made hazards (for example, natural-hazard-triggered technological disasters (Natech) such as fire, explosion, or toxic release). Multi-hazards have a more destructive impact on urban areas than one hazard alone. In addition, climate change is creating links between different disasters, such as landslide dams and debris flows, leading to more destructive incidents. Much of the prevailing literature deals with only one hazard at a time; however, sophisticated multi-hazard assessments have recently started to appear. Given that multi-hazards occur, it is essential to take multi-hazard risk assessment into consideration. This paper reviews the multi-hazard assessment methods in articles published to date and categorizes the strengths and disadvantages of using these methods in risk assessment. Napier City is selected as a case study to demonstrate the necessity of using multi-hazard risk assessments. To evaluate multi-hazard risk assessments, the current multi-hazard risk assessment methods were first described, the drawbacks of these assessments were then outlined, and finally the improvements made to current multi-hazard risk assessments to date were summarised. Generally, the main problem of multi-hazard risk assessment is making valid assumptions about the risk arising from the interactions of different hazards. Risk assessment studies have started to address multi-hazard situations, but drawbacks such as uncertainty and lack of data show the need for more precise risk assessment. It should be noted that ignoring or only partially considering multi-hazards in risk assessment will lead to overestimation or oversight in resilience and recovery action management.

Keywords: Cascading hazards, multi-hazard, risk assessment, risk reduction.

8232 Electrodeposited Silver Nanostructures: A Non-Enzymatic Sensor for Hydrogen Peroxide

Authors: Mandana Amiri, Sima Nouhi, Yashar Azizan-Kalandaragh

Abstract:

Silver nanostructures have been successfully fabricated onto an indium tin oxide (ITO) substrate using an electrodeposition method. Scanning electron microscopy (SEM), electrochemical impedance spectroscopy (EIS) and ultraviolet-visible spectroscopy (UV-Vis) techniques were employed for the characterization of the silver nanostructures. The results show that nanostructures with different morphologies and electrochemical properties can be obtained by varying the deposition potentials and times. The electrochemical behavior of the nanostructures has been studied using cyclic voltammetry. The silver nanostructures exhibit good electrocatalytic activity towards the reduction of H2O2. The presented electrode can be employed as a sensing element for hydrogen peroxide.

Keywords: Electrochemical sensor, electrodeposition, hydrogen peroxide, silver nanostructures.
