Search results for: proportional representation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1670

860 Check Red Blood Cells Concentrations of a Blood Sample by Using Photoconductive Antenna

Authors: Ahmed Banda, Alaa Maghrabi, Aiman Fakieh

Abstract:

Terahertz (THz) radiation lies in the range between 0.1 and 10 THz. THz radiation can be generated and detected through different techniques, one of the most familiar being the photoconductive antenna (PCA). Generating THz radiation with a PCA involves applying a femtosecond laser pump and a DC voltage difference. A photocurrent is generated at the PCA whose value is affected by several parameters (e.g., dielectric properties, DC voltage difference, and incident laser pump power). THz radiation is used for biomedical applications, and different biomedical fields need new technologies to meet patients' needs (e.g., blood-related conditions). In this work, a novel method to check the red blood cell (RBC) concentration of a blood sample using a PCA is presented. RBCs constitute 44% of total blood volume and contain hemoglobin, which transfers oxygen from the lungs to the body's organs and then returns to the lungs carrying carbon dioxide, which the body expels during exhalation. The configuration has been simulated and optimized using COMSOL Multiphysics. Variation in RBC concentration affects the sample's dielectric properties (e.g., the relative permittivity of RBCs in the blood sample). Accordingly, the effects of four blood samples with different RBC concentrations on the photocurrent value were tested. The photocurrent peak value and the RBC concentration are inversely proportional to each other owing to the change in the dielectric properties of RBCs: the photocurrent peak value dropped from 162.99 nA to 108.66 nA as the RBC concentration rose from 0% to 100% of the blood sample. Optimization of this method could help launch new products for diagnosing blood-related conditions (e.g., anemia and leukemia). The resultant electric field from the DC components cannot, however, be used to count the RBCs of the blood sample.
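The two reported endpoints suggest a simple calibration curve. As an illustrative sketch only (assuming an approximately linear relationship between RBC volume fraction and peak photocurrent, which the abstract does not state explicitly), the concentration of an unknown sample could be estimated from a measured peak photocurrent:

```python
def predict_peak_current(rbc_fraction):
    """Peak photocurrent (nA) for an RBC volume fraction in [0, 1],
    linearly interpolated between the reported endpoints:
    162.99 nA at 0% RBCs and 108.66 nA at 100% RBCs (linearity assumed)."""
    i0, i100 = 162.99, 108.66
    return i0 + (i100 - i0) * rbc_fraction

def estimate_rbc_fraction(peak_current_na):
    """Invert the assumed linear model to estimate the RBC fraction."""
    i0, i100 = 162.99, 108.66
    return (peak_current_na - i0) / (i100 - i0)
```

Under this assumption, a sample at the typical 44% RBC volume fraction would map to a peak photocurrent of about 139 nA.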

Keywords: biomedical applications, photoconductive antenna, photocurrent, red blood cells, THz radiation

Procedia PDF Downloads 193
859 Fast Robust Switching Control Scheme for PWR-Type Nuclear Power Plants

Authors: Piyush V. Surjagade, Jiamei Deng, Paul Doney, S. R. Shimjith, A. John Arul

Abstract:

In sophisticated and complex systems such as nuclear power plants, maintaining the system's stability in the presence of uncertainties and disturbances and obtaining a fast dynamic response are the most challenging problems. Thus, to ensure the satisfactory and safe operation of nuclear power plants, this work proposes a new fast, robust optimal switching control strategy for pressurized water reactor-type nuclear power plants. The proposed control strategy guarantees a substantial degree of robustness, fast dynamic response over the entire operational envelope, and optimal performance during the nominal operation of the plant. To improve the robustness, obtain a fast dynamic response, and make the system optimal, a bank of controllers is designed. Various controllers, like a baseline proportional-integral-derivative controller, an optimal linear quadratic Gaussian controller, and a robust adaptive L1 controller, are designed to perform distinct tasks in a specific situation. At any instant of time, the most suitable controller from the bank of controllers is selected using the switching logic unit that designates the controller by monitoring the health of the nuclear power plant or transients. The proposed switching control strategy optimizes the overall performance and increases operational safety and efficiency. Simulation studies have been performed considering various uncertainties and disturbances that demonstrate the applicability and effectiveness of the proposed switching control strategy over some conventional control techniques.
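The switching logic unit can be pictured as a supervisor that monitors plant health and transients and designates one controller from the bank. A minimal sketch follows; the error thresholds and the single-gain controller stand-ins are hypothetical illustrations, not the paper's actual control laws:

```python
# Simplified stand-ins for the three controllers in the bank.
def pid_control(error):          # baseline PID (reduced here to P-only)
    return 1.2 * error

def lqg_control(error):          # optimal LQG (illustrative gain)
    return 0.8 * error

def l1_adaptive_control(error):  # robust adaptive L1 (illustrative gain)
    return 2.0 * error

def switching_logic(error, disturbance_level):
    """Designate the most suitable controller by monitoring plant
    health/transients. Thresholds are illustrative only."""
    if disturbance_level > 0.5:   # large uncertainty: robustness first
        return l1_adaptive_control
    if abs(error) < 0.05:         # near nominal operation: optimal performance
        return lqg_control
    return pid_control            # otherwise: baseline PID

# At each instant, the supervisor picks a controller and computes the input.
controller = switching_logic(error=0.2, disturbance_level=0.1)
u = controller(0.2)
```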

Keywords: switching control, robust control, optimal control, nuclear power control

Procedia PDF Downloads 115
858 Biodegradable and Bioactive Scaffold for Bone Tissue Engineering

Authors: A. M. Malagon Escandon, J. A. Arenas Alatorre, C. P. Chaires Rosas, N. A. Vazquez Torres, B. Hernandez Tellez, G. Pinon Zarate, M. Herrera Enriquez, A. E. Castell Rodriguez

Abstract:

The current approach to the treatment of bone defects involves the use of scaffolds that provide a biologically and mechanically stable niche to favor tissue repair. Despite significant progress in the field of bone tissue engineering, several problems remain: scaffolds may show a low degree of biodegradation and fail to promote osseointegration and regeneration, and when the bone does not heal as well as expected or fails to heal, proper ossification and new bone formation do not occur. Current approaches to bone tissue regeneration are directed toward the use of decellularized native extracellular matrices, which are able to retain their own architecture, mechanical properties, and biodegradability and to promote new bone formation, because they conserve proteins and other factors at physiological concentrations. We therefore propose an extracellular matrix-based bioscaffold derived from bovine cancellous bone, processed by decellularization, demineralization, and hydrolysis of the collagen protein. These protocols have been carried out successfully in other organs and tissues, and their biosafety has previously been evaluated in vivo and approved by the Food and Drug Administration (FDA). In the specific case of bone, a more complex treatment is needed than for other organs and tissues, because demineralization and collagen denaturation are necessary. The present work was undertaken to obtain a temporary scaffold whose degradation is inversely proportional to the synthesis of extracellular matrix and the maturation of the bone by the cells of the host.

Keywords: bioactive, biodegradable, bone, extracellular matrix-based bioscaffolds, stem cells, tissue engineering

Procedia PDF Downloads 144
857 Performance-Based Quality Evaluation of Database Conceptual Schemas

Authors: Janusz Getta, Zhaoxi Pan

Abstract:

Performance-based quality evaluation of database conceptual schemas is an important aspect of the database design process. Different conceptual schemas yield different logical schemas, and the performance of user applications strongly depends on the logical and physical database structures. This work presents the entire process of performance-based quality evaluation of conceptual schemas. First, the paper proposes a new specification of object algebra for the representation of conceptual-level database applications. Transformation of conceptual schemas and object algebra expressions into an implementation schema, and their implementation in a particular database system, allows for precise estimation of the processing costs of database applications and, as a consequence, for precise evaluation of the performance-based quality of conceptual schemas. Finally, we describe an experiment as a proof of concept for the evaluation procedure presented in the paper.

Keywords: conceptual schema, implementation schema, logical schema, object algebra, performance evaluation, query processing

Procedia PDF Downloads 280
856 An Estimating Equation for Survival Data with Possibly Time-Varying Covariates under Semiparametric Transformation Models

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

The estimating equation technique is an alternative to the widely used maximum likelihood methods that eases some of the complexity arising from time-varying covariates. When both time-varying covariates and left-truncation are considered in the model, maximum likelihood estimation procedures become much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from researchers, under a semiparametric transformation model. The purpose of this article is to develop modified estimating equations under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators are derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators under the proposed model is illustrated via simulation studies and a real-data example from the Stanford heart transplant study. In summary, the bias in the covariates is adjusted by estimating the density function of the truncation-time variable, and the effect of possibly time-varying covariates is then evaluated in some special semiparametric transformation models.

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariates

Procedia PDF Downloads 146
855 vADL.net: A Software Architecture Tool with Support for All Architectural Concepts

Authors: Adel Smeda, Badr Najep

Abstract:

Software architecture is a method of describing the architecture of a software system at a high level of abstraction. It represents a common abstraction of a system that stakeholders can use as a basis for mutual understanding, negotiation, consensus, and communication. It also manifests the earliest design decisions about a system, and these early bindings carry weight far out of proportion to their individual gravity with respect to the system's remaining development, its deployment, and its maintenance life; it is therefore the earliest point at which design decisions governing the system to be built can be analyzed. In this paper, we present a tool to model the architecture of software systems. Architecture modeling is the first means by which system defects can be detected, and it provides a clear representation of a system's components and interactions at a high level of abstraction. The tool can be distinguished from other tools by its support for all software architecture elements. It is built using VB.net 2010. We used this tool to describe two well-known systems, i.e., Capitalize and Client/Server, and the descriptions we obtained support all architectural elements of the two systems.

Keywords: software architecture, architecture description languages, modeling

Procedia PDF Downloads 457
854 Analysing Social Media Coverage of Political Speeches in Relation to Discourse and Context

Authors: Yaser Mohammed Altameemi

Abstract:

This research examines how social media represented Saudi government decrees regarding the developmental projects of the Saudi Vision 2030. The paper analyses a television interview of 28/4/2021 with Crown Prince Mohammed Bin Salman, in which he discusses the progress of the Saudi Vision 2030 and how the government responded to the COVID-19 pandemic. The paper analyses the tweets on Twitter that cover the interview in order to investigate the development of concepts and meanings regarding the Saudi people's orientations towards the Saudi projects. The data include all related tweets from the day of the interview and the seven days that followed. The findings of the collocation analysis suggest that the notion of nationalism is explicitly expressed by users on Twitter. The main finding of this paper is the importance of further analysis of the concordance lines; nevertheless, the collocation network clearly highlights nationalism.

Keywords: social media, Twitter, political interview, Prince Mohammed Bin Salman, Saudi Vision 2030

Procedia PDF Downloads 179
853 Labour Productivity Measurement and Control Standards for Hotels

Authors: Kristine Joy Simpao

Abstract:

Improving labour productivity is one of the most enthralling and challenging aspects of managing a hotel and restaurant business. The demand to secure consistent productivity has become an increasingly pivotal responsibility of managers who must survive and sustain the business. Besides making the business profitable, they are expected to make every resource productive and effective toward achieving company goals while maximizing the value of the organization. This paper examines what productivity means to the services industry, in particular to the hotel industry. This is underpinned by an investigation of the extent to which respondent hotels practice labour productivity in the areas of materials management, human resource management, and leadership management, computing labour productivity ratios using simple hotel productivity ratios in order to find suitable measurement and control standards for hotels, with SBMA, Olongapo City as the locale of the study. The findings show that the hotels' labour productivity ratings are not perfect, with some practices falling far below expectations, particularly in strategic and operational decisions for improving the performance and productivity of their human resources. The findings further show no significant difference in ratings among respondent types in any area, indicating a shared perception that some indicators of labour productivity practice are weakly implemented. Furthermore, the computed labour productivity efficiency ratios show that the number of employees and labour productivity practices are inversely proportional. This study provides potential measurement and control standards for the enhancement of hotel labour productivity. These standards should also contain labour productivity measures customized for standard hotels in the Subic Bay Freeport Zone to assist hotel owners in increasing labour productivity while meeting company goals and objectives effectively.
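The "simple ratios of productivity" mentioned above reduce to output per unit of labour input. A minimal sketch of such a computation follows; the revenue and staffing figures are hypothetical, not from the study:

```python
def labour_productivity(output_value, labour_hours):
    """Simple labour productivity ratio: output per labour hour."""
    if labour_hours <= 0:
        raise ValueError("labour_hours must be positive")
    return output_value / labour_hours

# Hypothetical example: a hotel generating 150,000 in monthly revenue
# with 40 employees each working 160 hours in the month.
ratio = labour_productivity(150_000, 40 * 160)
```

Tracking such ratios over time, rather than as one-off figures, is what turns them into a measurement and control standard.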

Keywords: labour productivity, hotel, measurement and control, standards, efficiency ratios, practices

Procedia PDF Downloads 305
852 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the two-parameter (bi-)Weibull distribution function: its trend is proportional to the bi-Weibull probability density function. In general, the Weibull distribution can assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, for whom it is the most commonly used distribution in problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of the model (the explicit expression of the process, its trends, and its distribution) by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. We then develop the statistical inference of the model using the maximum likelihood methodology. Finally, we analyse, with simulated data and convergence analysis methods, the computational problems associated with the parameters, an issue of great importance for application to real data. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, it represents the best currently available description of the phenomenon under consideration.
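As a rough illustration of the kind of process described (a diffusion whose trend is proportional to a two-parameter Weibull density), one could simulate sample paths with an Euler-Maruyama scheme. The drift form, parameter values, and noise level below are assumptions for illustration only, not the authors' exact model:

```python
import math
import random

def weibull_pdf(t, shape, scale):
    """Two-parameter Weibull probability density function."""
    if t <= 0:
        return 0.0
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * math.exp(-z ** shape)

def simulate_path(x0=1.0, shape=2.0, scale=1.0, sigma=0.1,
                  dt=0.01, steps=500, seed=42):
    """Euler-Maruyama simulation of dX = X * f(t) dt + sigma * X dW,
    where f is the Weibull density (an assumed trend term)."""
    random.seed(seed)
    x, path = x0, [x0]
    for i in range(1, steps + 1):
        t = i * dt
        drift = x * weibull_pdf(t, shape, scale)
        x += drift * dt + sigma * x * random.gauss(0.0, math.sqrt(dt))
        path.append(x)
    return path
```

Repeating the simulation over many seeds gives the empirical distribution of the process, against which maximum likelihood estimates of the parameters could be checked.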

Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, two-parameter Weibull density function

Procedia PDF Downloads 297
851 Studying the Establishment of Knowledge Management Background Factors at Islamic Azad University, Behshahr Branch

Authors: Mohammad Reza Bagherzadeh, Mohammad Hossein Taheri

Abstract:

Knowledge management is one of the great breakthroughs of the information and knowledge era, and given its outstanding features, successful organizations tend to adopt it. The establishment of knowledge management in universities is therefore of special importance. The present research aims to shed light on the background factors of knowledge management establishment at Islamic Azad University, Behshahr Branch (Northern Iran). Three factors (information technology systems, knowledge process systems, and organizational culture) were considered fundamentals of the knowledge management infrastructure and were evaluated individually. The research was conducted in a descriptive-survey manner, and the participants included all staff and faculty members; a sample size proportional to the population size was drawn according to the Krejcie and Morgan table. The measurement tool was a survey questionnaire whose reliability was calculated as 0.83 using Cronbach's alpha. For data analysis, descriptive statistics such as frequency tables and percentages, column charts, means, and standard deviations were used, and for inferential statistics, the Kolmogorov-Smirnov test and the one-sample t-test. The findings show that although organizational culture, one of the three background factors, is in good condition, the other two, IT systems and knowledge process systems, are in an adverse state. As a result, the necessary conditions for the establishment of knowledge management at the university are not yet in place.
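The reliability figure of 0.83 refers to Cronbach's alpha, computed as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch follows; the item data are hypothetical, not the study's questionnaire responses:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from a list of respondents' item-score lists.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(totals))
    Population variance is used throughout.
    """
    k = len(item_scores[0])          # number of items
    n = len(item_scores)             # number of respondents

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_vars = [variance([resp[j] for resp in item_scores]) for j in range(k)]
    total_var = variance([sum(resp) for resp in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-item questionnaire answered by 5 respondents (Likert 1-5).
scores = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3], [4, 4, 4, 5]]
alpha = cronbach_alpha(scores)
```

Values of alpha above roughly 0.7 to 0.8 are conventionally taken to indicate acceptable internal consistency, which is why 0.83 supports the questionnaire's use here.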

Keywords: knowledge management, information technology, knowledge processes, organizational culture, educational institutions

Procedia PDF Downloads 506
850 Threshold Concepts in TESOL: A Thematic Analysis of Disciplinary Guiding Principles

Authors: Neil Morgan

Abstract:

The notion of Threshold Concepts has offered a fertile new perspective on the transformative effects of mastery of particular concepts on student understanding of subject matter and their developing identities as inductees into disciplinary discourse communities. Only by successfully traversing key knowledge thresholds, it is claimed, can neophytes gain access to the more sophisticated understandings of subject matter possessed by mature members of a discipline. This paper uses thematic analysis of disciplinary guiding principles to identify nine candidate Threshold Concepts that appear to underpin effective TESOL practice. The relationship between these candidate TESOL Threshold Concepts, TESOL principles, and TESOL instructional techniques appears to be amenable to a schematic representation based on superordinate categories of TESOL practitioner concern and, as such, offers an alternative to the view of Threshold Concepts as a privileged subset of disciplinary core concepts. The paper concludes by exploring the potential of a Threshold Concepts framework to productively inform TESOL initial teacher education (ITE) and in-service education and training (INSET).

Keywords: TESOL, threshold concepts, TESOL principles, TESOL ITE/INSET, community of practice

Procedia PDF Downloads 133
849 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required for many machine learning algorithms in the NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which prevents a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split a sentence into three hierarchies: short-phrase, long-phrase, and full-sentence level. The HTLSTM model gives our algorithm the potential to fully exploit the hierarchical information and long-term dependencies of language. We design experiments on both English and Chinese corpora to evaluate our model on the sentiment analysis task, and the results show that our model significantly outperforms several existing state-of-the-art approaches.

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 344
848 Digital Musical Organology: The Audio Games: The Question of “A-Musicological” Interfaces

Authors: Hervé Zénouda

Abstract:

This article seeks to shed light on an emerging creative field: "audio games," at the crossroads between video games and computer music. Many applications offering entertaining audio-visual experiences with the objective of musical creation are available today on different platforms (game consoles, computers, cell phones). The originality of this field is its application of video game gameplay to music composition. Composing music using interfaces, and indeed cognitive logics, that we qualify as "a-musicological" seems to us particularly interesting from the perspective of digital musical organology. This field raises questions about the representation of sound and musical structures and develops new instrumental gestures and strategies of musical composition. We attempt in this article to define the characteristics of this field by highlighting some historical milestones (abstract cinema, game theory in music, action and graphic scores) as well as the novelties brought by digital technologies.

Keywords: audio-games, video games, computer generated music, gameplay, interactivity, synesthesia, sound interfaces, relationships image/sound, audiovisual music

Procedia PDF Downloads 100
847 Effect of High-Energy Ball Milling on the Electrical and Piezoelectric Properties of (K0.5Na0.5)(Nb0.9Ta0.1)O3 Lead-Free Piezoceramics

Authors: Chongtham Jiten, K. Chandramani Singh, Radhapiyari Laishram

Abstract:

Nanocrystalline powders of the lead-free piezoelectric material, tantalum-substituted potassium sodium niobate (K0.5Na0.5)(Nb0.9Ta0.1)O3 (KNNT), were produced using a Retsch PM100 planetary ball mill by setting the milling time to 15h, 20h, 25h, 30h, 35h and 40h, at a fixed speed of 250rpm. The average particle size of the milled powders was found to decrease from 12nm to 3nm as the milling time increases from 15h to 25h, which is in agreement with the existing theoretical model. An anomalous increase to 98nm and then a drop to 3nm in the particle size were observed as the milling time further increases to 30h and 40h respectively. Various sizes of these starting KNNT powders were used to investigate the effect of milling time on the microstructure, dielectric properties, phase transitions and piezoelectric properties of the resulting KNNT ceramics. The particle size of starting KNNT was somewhat proportional to the grain size. As the milling time increases from 15h to 25h, the resulting ceramics exhibit enhancement in the values of relative density from 94.8% to 95.8%, room temperature dielectric constant (εRT) from 878 to 1213, and piezoelectric charge coefficient (d33) from 108pC/N to 128pC/N. For this range of ceramic samples, grain size refinement suppresses the maximum dielectric constant (εmax), shifts the Curie temperature (Tc) to a lower temperature and the orthorhombic-tetragonal phase transition (Tot) to a higher temperature. Further increase of milling time from 25h to 40h produces a gradual degradation in the values of relative density, εRT, and d33 of the resulting ceramics.

Keywords: perovskite, dielectric, ceramics, high-energy milling

Procedia PDF Downloads 308
846 The Impact of Gamification on Self-Assessment for English Language Learners in Saudi Arabia

Authors: Wala A. Bagunaid, Maram Meccawy, Arwa Allinjawi, Zilal Meccawy

Abstract:

Continuous self-assessment becomes crucial in self-paced online learning environments. Students often depend on themselves to assess their progress, which is considered an essential requirement for any successful learning process, yet today's educational institutions face major problems around student motivation and engagement. Personalized e-learning systems therefore aim to help and guide the students. Gamification provides an opportunity to support students' self-assessment and social comparison with other students by attempting to harness the motivational power of games and apply it to the learning environment. Furthermore, Open Social Student Modeling (OSSM), considered among the latest user modeling technologies, is believed to improve students' self-assessment and to allow them to compare themselves socially with other students. This research integrates the OSSM approach and gamification concepts in order to provide self-assessment for English language learners at King Abdulaziz University (KAU). This is achieved through an interactive visual representation of their learning progress.

Keywords: e-learning system, gamification, motivation, social comparison, visualization

Procedia PDF Downloads 138
845 FISCEAPP: FIsh Skin Color Evaluation APPlication

Authors: J. Urban, Á. S. Botella, L. E. Robaina, A. Bárta, P. Souček, P. Císař, Š. Papáček, L. M. Domínguez

Abstract:

Skin coloration in fish is of great physiological, behavioral, and ecological importance and can be considered an index of animal welfare in aquaculture as well as an important quality factor in retail value. Currently, to compare color in animals fed different diets, biochemical analysis and colorimetry of caught, mildly anesthetized, or dead fish provide accurate and meaningful measurements. Here, a noninvasive method using digital images of the fish body was developed as a standalone application. The application handles the computational burden and memory consumption of large input files, optimizing piecewise processing and analysis against the memory/computation-time ratio. For comparing the color distributions of various experiments and different color spaces (RGB, CIE L*a*b*), a comparable semi-equidistant binning of the multi-channel representation is introduced, derived from the known quantization levels and the Freedman-Diaconis rule. Color calibration and the camera responsivity function were a necessary part of the measurement process.
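The binning mentioned above builds on the Freedman-Diaconis rule, which sets the histogram bin width from the interquartile range of the sample: width = 2 * IQR * n^(-1/3). A minimal sketch of that rule follows (the channel values fed to it would be hypothetical here, not the application's actual image data):

```python
def freedman_diaconis_bins(values):
    """Bin width and bin count for a sample via the Freedman-Diaconis rule:
    width = 2 * IQR * n^(-1/3)."""
    xs = sorted(values)
    n = len(xs)

    def quantile(q):
        # Simple linear-interpolation quantile.
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        return xs[lo] if lo + 1 >= n else xs[lo] + frac * (xs[lo + 1] - xs[lo])

    iqr = quantile(0.75) - quantile(0.25)
    width = 2 * iqr * n ** (-1 / 3)
    count = max(1, round((xs[-1] - xs[0]) / width)) if width > 0 else 1
    return width, count
```

Because the rule is driven by the IQR rather than the range, it is robust to outlier pixels, which makes histograms from different experiments comparable, the property the semi-equidistant binning relies on.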

Keywords: color distribution, fish skin color, piecewise transformation, object to background segmentation

Procedia PDF Downloads 249
844 Distributed Perceptually Important Point Identification for Time Series Data Mining

Authors: Tak-Chung Fu, Ying-Kit Hung, Fu-Lai Chung

Abstract:

In the field of time series data mining, the Perceptually Important Point (PIP) identification process was first introduced in 2001. The process was originally developed for financial time series pattern matching and was later found suitable for time series dimensionality reduction and representation. Its strength lies in preserving the overall shape of the time series by identifying its salient points. With the rise of Big Data, time series data contribute a major proportion, especially data generated by sensors in Internet of Things (IoT) environments. Given the nature of PIP identification and its successful applications, it is worth further exploring the application of PIP to time series "Big Data". However, the performance of PIP identification has always been considered its limitation when dealing with "big" time series data. In this paper, two distributed versions of PIP identification based on the Specialized Binary (SB) Tree are proposed. The proposed approaches remove the bottleneck of running the PIP identification process on a standalone computer, and the distributed versions yield an improvement in speed.
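The core of PIP identification iteratively selects the point farthest from the chord joining its two already-chosen neighbours, thereby preserving the overall shape. A minimal sketch follows; it is a plain list-based, single-machine version using vertical distance, not the paper's SB-tree-based distributed algorithm:

```python
def pip_identify(series, n_pips):
    """Select n_pips perceptually important points (indices) from a series
    using vertical distance to the line between adjacent chosen points."""
    n = len(series)
    if n_pips >= n:
        return list(range(n))
    chosen = [0, n - 1]                      # always keep the endpoints
    while len(chosen) < n_pips:
        best_idx, best_dist = None, -1.0
        for left, right in zip(chosen, chosen[1:]):
            for i in range(left + 1, right):
                # Vertical distance from point i to the chord (left, right).
                slope = (series[right] - series[left]) / (right - left)
                interp = series[left] + slope * (i - left)
                dist = abs(series[i] - interp)
                if dist > best_dist:
                    best_idx, best_dist = i, dist
        if best_idx is None:
            break
        chosen.append(best_idx)
        chosen.sort()
    return chosen
```

Each pass rescans every segment, which is exactly the quadratic cost that the SB-tree and the distributed versions proposed in the paper are designed to avoid.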

Keywords: distributed computing, performance analysis, Perceptually Important Point identification, time series data mining

Procedia PDF Downloads 420
843 Governance Question and the Participatory Policy Making: Making the Process Functional in Nigeria

Authors: Albert T. Akume, P. D. Dahida

Abstract:

This paper examines the effect of various epochs of government on policy making in Nigeria. In these epochs the character of governance and public policy making was exclusive, non-participatory, and self-centric. As a consequence, the interests of the citizenry were neither represented nor protected, and policy did not seek to meet fairly the needs of all groups. The introduction of post-1999 democratic government demands that this hitherto skewed pattern of policy making cease to be a feature of governance; hence the need for citizen participation in the policy making process. The question, then, is which mode is most appropriate to engender public participation so as to make the policy making process functional. Given the prevailing social, economic, and political dilemmas, the utilization of the direct mode of citizen participation to affect policy outcomes is doubtful if not unattainable. It is due to this predicament that this paper, using a documentary research design, argues for the utilization of the indirect mode of citizen participation in the policy making process, so as to affect public policy outcomes appropriately and with less cost, acrimony, and delay.

Keywords: governance, public policy, participation, representation, civil society

Procedia PDF Downloads 361
842 Provenance in Scholarly Publications: Introducing the provCite Ontology

Authors: Maria Joseph Israel, Ahmed Amer

Abstract:

Our work aims to broaden the application of provenance technology beyond its traditional domains of scientific workflow management and database systems by offering a general provenance framework to capture richer and extensible metadata in unstructured textual data sources such as literary texts, commentaries, translations, and the digital humanities. Specifically, we demonstrate the feasibility of capturing and representing expressive provenance metadata, including more of the context for citing scholarly works (e.g., an author's explicit or inferred intentions at the time of developing their research content for publication), while also supporting subsequent augmentation with similar additional metadata (by third parties, be they human or automated). To better capture the nature and types of possible citations, our proposed provenance scheme, metaScribe, extends standard provenance conceptual models to form the provCite ontology. This provides a conceptual framework that can capture and describe more of the functional and rhetorical properties of a citation than can be achieved with any current model.

Keywords: knowledge representation, provenance architecture, ontology, metadata, bibliographic citation, semantic web annotation

Procedia PDF Downloads 105
841 Awareness on Department of Education’s Disaster Risk Reduction Management Program at Oriental Mindoro National High School: Basis for Support School DRRM Program

Authors: Nimrod Bantigue

Abstract:

The Department of Education continuously provides safe teaching-learning facilities and hazard-free environments for learners. To achieve this goal, teachers' awareness of DepEd's DRRM programs and activities is extremely important; thus, this descriptive-correlational quantitative study was conceptualized. The research answered four questions on the profile and level of awareness of the 153 teacher respondents of Oriental Mindoro National High School for the academic year 2018-2019. Stratified proportional sampling was employed, and both descriptive and inferential statistics were used to analyze the data. The findings revealed that the majority of the teachers at OMNHS are female, in the 20-40 age bracket, married, and pursuing graduate studies. They have moderate awareness of the Department of Education's DRRM programs and activities in terms of risk assessment activities, planning activities, implementation activities during disasters, and evaluation and monitoring activities, with computed means of 3.32, 3.12, 3.40 and 3.31, respectively. Further, the results showed a significant relationship between the level of awareness and respondent profile variables such as age, civil status and educational attainment; sex, by contrast, showed no significant relationship with the level of awareness. A Support School DRRM program with a utilization guide on the School DRRM Manual was proposed to strengthen the weakest-rated areas of awareness in each DRRM activity: risk assessment, planning, implementation during disasters, and monitoring and evaluation.
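The inferential step the abstract reports (testing whether a profile variable such as civil status is related to the awareness level) is commonly done with a Pearson chi-square test of independence. The sketch below is an illustrative reconstruction with hypothetical counts, not the study's actual data.

```python
# Hypothetical 2x3 contingency table: civil status (single / married) versus
# awareness level (low / moderate / high). Counts are illustrative only.
table = [[12, 30, 8],
         [18, 65, 20]]

def chi_square(table):
    """Pearson chi-square statistic for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

stat = chi_square(table)
df = (len(table) - 1) * (len(table[0]) - 1)  # degrees of freedom = 2
# Compare against the 0.05 critical value for df = 2 (5.991).
significant = stat > 5.991
print(f"chi2 = {stat:.3f}, df = {df}, significant = {significant}")
```

If the statistic exceeds the critical value, independence is rejected, i.e., the profile variable and the awareness level are significantly related.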

Keywords: awareness, management, monitoring, risk reduction

Procedia PDF Downloads 203
840 Dynamic Ad-hoc Topologies for Mobile Robot Navigation Based on Non-Uniform Grid Maps

Authors: Peter Sauer, Thomas Hinze, Petra Hofstedt

Abstract:

Avoiding obstacles in the surrounding environment and navigating to a given target are among the most important tasks for mobile robots, and different data structures suit these tasks. To avoid nearby obstacles, occupancy grid maps are an ideal representation of the surroundings. For less fine-grained tasks, such as navigating from one room to another in an apartment, pure grid maps are inappropriate: grid maps are very detailed, and calculating paths between rooms on a grid map would take too long. Instead, graph-based data structures, so-called topologies, turn out to be the proper choice for such tasks. In this paper we present two methods to dynamically create topologies from grid maps. Both methods are based on non-uniform grid maps. The topologies are generated on-the-fly and can easily be modified to represent changes in the environment. This allows a hybrid approach to controlling mobile robots in which, depending on the situation and the current task, either the grid map or the generated topology may be used.
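The idea of deriving a topology from a non-uniform grid can be sketched as follows: treat each free-space cell as a graph node and connect two cells when their borders touch. This is an illustrative reconstruction under that assumption, not the authors' exact algorithm.

```python
# A minimal sketch: cells are axis-aligned free-space rectangles (x, y, w, h)
# of a non-uniform grid; two cells become topology neighbours when they share
# a border segment (not just a corner).
from itertools import combinations

def touches(a, b):
    """True if rectangles a and b share a border segment of positive length."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    share_x = min(ax + aw, bx + bw) - max(ax, bx)
    share_y = min(ay + ah, by + bh) - max(ay, by)
    # One axis touches exactly; the other overlaps with positive length.
    return (share_x == 0 and share_y > 0) or (share_y == 0 and share_x > 0)

def build_topology(cells):
    """Adjacency list over cell indices: the coarse navigation topology."""
    graph = {i: set() for i in range(len(cells))}
    for i, j in combinations(range(len(cells)), 2):
        if touches(cells[i], cells[j]):
            graph[i].add(j)
            graph[j].add(i)
    return graph

# Three rooms: 0-1 adjacent, 1-2 adjacent, 0-2 only corner-touching.
cells = [(0, 0, 4, 4), (4, 0, 4, 4), (4, 4, 4, 4)]
print(build_topology(cells))
```

Path planning between rooms can then run on this small graph, while the detailed grid map is reserved for local obstacle avoidance.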

Keywords: robot navigation, occupancy grids, topological maps, dynamic map creation

Procedia PDF Downloads 552
839 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents

Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei

Abstract:

With the recent advances in deep neural networks, we observe new applications of NLP (natural language processing) and CV (computer vision), powered by deep neural networks, for processing business documents. However, creating a real-world document processing system requires integrating several NLP and CV tasks rather than treating them separately. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by the various tasks, and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. Its flexible and lightweight design can help build systems for diverse business scenarios, such as contract monitoring and review.
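A representation model of the kind described (one structure holding both layout and the semantics produced by different tasks) can be sketched as below. All names here are illustrative assumptions, not DocPro's actual API.

```python
# A hedged sketch of a unified representation model: each document element
# keeps its layout (bounding box), its type, and the results attached by
# individual NLP/CV tasks, so downstream tasks can coordinate through it.
from dataclasses import dataclass, field

@dataclass
class Element:
    kind: str                      # e.g. "paragraph", "table", "figure"
    bbox: tuple                    # (x0, y0, x1, y1) layout coordinates
    text: str = ""
    annotations: dict = field(default_factory=dict)  # task name -> result

@dataclass
class Document:
    elements: list = field(default_factory=list)

    def annotate(self, index, task, result):
        """Record the output of one NLP/CV task on one element."""
        self.elements[index].annotations[task] = result

doc = Document([Element("paragraph", (50, 60, 500, 120), "Term: 12 months")])
doc.annotate(0, "ner", [("12 months", "DURATION")])
print(doc.elements[0].annotations)
```

The point of such a shared model is that a layout-analysis task and an entity-extraction task write to the same element, instead of each maintaining its own disconnected output format.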

Keywords: document processing, framework, formal definition, machine learning

Procedia PDF Downloads 201
838 Suppressing Vibration in a Three-axis Flexible Satellite: An Approach with Composite Control

Authors: Jalal Eddine Benmansour, Khouane Boulanoir, Nacera Bekhadda, Elhassen Benfriha

Abstract:

This paper introduces a novel composite control approach to stabilizing the three-axis attitude of a flexible satellite in the presence of vibrations caused by flexible appendages. The key contribution of this research lies in the development of a disturbance observer, which estimates the unwanted torques induced by the vibrations. Using the estimated disturbance, the proposed approach compensates efficiently for the detrimental effects of the vibrations on the satellite system. To govern the attitude angles of the spacecraft, a proportional-derivative (PD) controller is designed, ensuring precise control over all attitude angles and stable, accurate spacecraft maneuvering. The global stability of the system is demonstrated with the Lyapunov method, a well-established technique in control theory, which verifies the convergence of the system dynamics. To evaluate the performance and efficacy of the proposed control algorithm, extensive simulations are conducted. The simulation results validate the effectiveness of the combined approach, showing significant improvements in the stabilization and control of the satellite's attitude even in the presence of disruptive vibrations from the flexible appendages. The composite control approach presented here thus contributes to the advancement of satellite attitude control techniques, offering a promising solution for enhanced stability and precision in challenging operational environments.
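The composite idea (PD attitude law plus feedforward cancellation of an observed disturbance) can be illustrated on a single axis. The gains, the sinusoidal disturbance standing in for appendage vibration, and the simple first-order observer are all illustrative assumptions, not the paper's actual design.

```python
# Single-axis sketch: J * omega_dot = u + d, with control torque
# u = -kp*theta - kd*omega - d_hat, where d_hat is the observed disturbance.
import math

def pd_with_compensation(theta, omega, d_hat, kp=4.0, kd=3.0):
    """PD term on attitude error plus rejection of the estimated disturbance."""
    return -kp * theta - kd * omega - d_hat

J, dt = 1.0, 0.01
theta, omega, d_hat = 0.2, 0.0, 0.0      # initial attitude error (rad)
for k in range(5000):
    d = 0.05 * math.sin(0.5 * k * dt)    # stand-in for appendage vibration torque
    u = pd_with_compensation(theta, omega, d_hat)
    omega_dot = (u + d) / J
    omega += omega_dot * dt
    theta += omega * dt
    # Observer: low-pass filter of the residual between the measured
    # acceleration torque and the commanded torque.
    d_hat += 20.0 * dt * ((J * omega_dot - u) - d_hat)
print(f"final attitude error: {theta:.4f} rad")
```

With the compensation term active, the attitude error settles near zero despite the persistent disturbance; removing `- d_hat` from the control law leaves a steady residual oscillation.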

Keywords: attitude control, flexible satellite, vibration control, disturbance observer

Procedia PDF Downloads 75
837 Finding Optimal Operation Condition in a Biological Nutrient Removal Process with Balancing Effluent Quality, Economic Cost and GHG Emissions

Authors: Seungchul Lee, Minjeong Kim, Iman Janghorban Esfahani, Jeong Tai Kim, ChangKyoo Yoo

Abstract:

It is hard to maintain the effluent quality of wastewater treatment plants (WWTPs) under fixed operational control because the influent flow rate and pollutant load change continuously. The aim of this study is the development of a plant-wide multi-loop multi-objective control (ML-MOC) strategy targeting four objectives: 1) maximization of nutrient removal efficiency, 2) minimization of operational cost, 3) maximization of CH4 production in anaerobic digestion (AD), for reuse of CH4 as a heat and energy source, and 4) minimization of N2O emission, to cope with global warming. First, the benchmark simulation model is modified to describe N2O dynamics in the biological process, yielding the benchmark simulation model for greenhouse gases (BSM2G). Then, three single-loop proportional-integral (PI) controllers are implemented, for DO, NO3, and CH4, and their optimal set-points are found using a multi-objective genetic algorithm (MOGA). Finally, the ML-MOC is implemented and evaluated in BSM2G. Compared with the reference case, the ML-MOC with the optimal set-points showed the best control performance, improving effluent quality, CH4 productivity, and N2O emission by 34%, 5% and 79%, respectively, while decreasing operational cost by 65%.
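One of the single loops (say, the DO loop) can be sketched as a discrete PI controller tracking a set-point of the kind the MOGA would supply. The gains, the first-order plant stand-in, and the set-point value are illustrative assumptions, not the BSM2G configuration.

```python
# A sketch of one single-loop PI controller: the optimizer-supplied value
# acts as the set-point that the PI law tracks via the manipulated variable
# (here, airflow driving dissolved oxygen).

def make_pi(kp, ki, dt):
    integral = 0.0
    def pi(setpoint, measurement):
        nonlocal integral
        error = setpoint - measurement
        integral += error * dt
        return kp * error + ki * integral
    return pi

# First-order mock of DO dynamics: dDO/dt = -a*DO + b*airflow
dt, a, b = 0.1, 0.5, 0.2
do, setpoint = 0.5, 2.0            # mg/L; set-point as found by the optimizer
controller = make_pi(kp=2.0, ki=1.0, dt=dt)
for _ in range(1000):
    airflow = max(0.0, controller(setpoint, do))  # actuator cannot go negative
    do += (-a * do + b * airflow) * dt
print(f"DO after 100 s: {do:.3f} mg/L (set-point {setpoint})")
```

In the multi-loop setting, the same pattern is repeated for the NO3 and CH4 loops, and the MOGA searches over the three set-points rather than over the controller internals.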

Keywords: benchmark simulation model for greenhouse gases, multi-loop multi-objective controller, multi-objective genetic algorithm, wastewater treatment plant

Procedia PDF Downloads 492
836 Polymeric Microspheres for Bone Tissue Engineering

Authors: Yamina Boukari, Nashiru Billa, Andrew Morris, Stephen Doughty, Kevin Shakesheff

Abstract:

Poly (lactic-co-glycolic) acid (PLGA) is a synthetic polymer that can be used in bone tissue engineering to create a scaffold supporting the growth of cells. The formation of microspheres from this polymer is an attractive strategy that allows for the development of an injectable system, hence avoiding invasive surgical procedures. The aim of this study was to develop a microsphere delivery system for use as an injectable scaffold in bone tissue engineering and to evaluate the effect of various formulation parameters on its properties. Porous, lysozyme-containing PLGA microspheres were prepared by the double-emulsion solvent-evaporation method from polymers of various molecular weights (MW). Scaffolds were formed by sintering to contain 1-3 mg of lysozyme per gram of scaffold. The mechanical and physical properties of the scaffolds were assessed, along with the release of lysozyme, which was used as a model protein. The MW of PLGA influenced microsphere size during fabrication, with increased MW leading to an increased microsphere diameter. An inversely proportional relationship was observed between PLGA MW and the mechanical strength of the formed scaffolds across all loadings for low, intermediate and high MW. Lysozyme release from both microspheres and formed scaffolds showed an initial burst release phase, with microspheres and scaffolds fabricated from high-MW PLGA showing the lowest protein release. Following the initial burst phase, the profiles for each MW followed a similar slow release over 30 days. Overall, the results demonstrate that lysozyme can be successfully incorporated into porous PLGA scaffolds and released over 30 days in vitro, and that varying the MW of the PLGA can alter the physical properties of the resulting scaffolds.

Keywords: bone, microspheres, PLGA, tissue engineering

Procedia PDF Downloads 418
835 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary for operating industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe, fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework is evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work reports results of using both linear and nonlinear techniques for monitoring, showing that the nonlinear technique produced more reliable monitoring results and outperforms the linear methods. The adoption of the qualitative monitoring model helps reduce the sensitivity of the fault pattern to noise.
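A linear baseline of the kind compared here typically fits statistics of normal operation and flags samples whose deviation exceeds a control limit. The sketch below uses two standardized process variables and a chi-square-style limit; the data and the limit are illustrative, not the simulated process of the study.

```python
# Toy sketch of linear statistical monitoring: learn normal-operation means
# and spreads, then score new samples by their squared standardized deviation
# (a Hotelling-T2-style test with a diagonal covariance assumption).
import random
import statistics

random.seed(0)
normal = [(random.gauss(10, 0.5), random.gauss(50, 1.0)) for _ in range(500)]

means = [statistics.mean(col) for col in zip(*normal)]
stdevs = [statistics.stdev(col) for col in zip(*normal)]

def score(sample):
    """Sum of squared standardized deviations from normal operation."""
    return sum(((x - m) / s) ** 2 for x, m, s in zip(sample, means, stdevs))

limit = 9.21  # chi-square 0.99 quantile for 2 degrees of freedom
print(score((10.1, 50.3)) > limit)   # in-control sample: not flagged
print(score((13.0, 55.0)) > limit)   # off-normal sample: flagged
```

Nonlinear variants replace the standardized linear score with one computed in a feature space, which is what allows them to capture curved normal-operation regions.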

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 625
834 Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks

Authors: Gunasekaran Raja, Ramkumar Jayaraman

Abstract:

In this paper, we consider a Cooperative Cross Layer Network (CCL-N) topology, based on the cross-layer (both centralized and distributed) environment, to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in designing the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Station [MBS]) and serving nodes (Relay Station [RS]), and node communities are organized based on networking terminology. Based on the CCL-N topology, simulation analyses for both transparent and non-transparent relays are tabulated and throughput efficiency is calculated. The weighted load balancing problem is challenging in IEEE 802.16 networks. A Concurrent Transmission Scheduling (CoTS) scheme is formulated in terms of three transmission mechanisms: based on identical communities, on different communities, and on identical node communities. The CoTS scheme helps in identifying the weighted load balancing problem. The analytical results show that the modularity value is inversely proportional to the error value, and that the modularity value plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for identical node communities has no impact, since the modularity value is the same for all network groups. This paper thus discusses three community-based aspects of the modularity value that help solve the weighted load balancing and CoTS problems.
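The modularity value the abstract relies on is, in the standard Newman formulation, Q = (1/2m) Σᵢⱼ [Aᵢⱼ − kᵢkⱼ/2m] δ(cᵢ, cⱼ): the fraction of edges inside communities minus the fraction expected at random. The graph and community assignment below are illustrative, not the paper's network.

```python
# Newman modularity of a community assignment over an undirected graph.

def modularity(edges, community):
    """edges: list of undirected (u, v); community: node -> community id."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # Fraction of edges inside communities ...
    inside = sum(1 for u, v in edges if community[u] == community[v]) / m
    # ... minus the fraction expected under random rewiring with the
    # same degree sequence.
    expected = sum(
        (degree[u] / (2 * m)) * (degree[v] / (2 * m))
        for u in degree for v in degree
        if community[u] == community[v]
    )
    return inside - expected

# Two triangles joined by one bridge edge: a clearly modular graph.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
community = {0: 'A', 1: 'A', 2: 'A', 3: 'B', 4: 'B', 5: 'B'}
print(modularity(edges, community))
```

A high Q means the chosen node communities concentrate traffic-carrying edges internally, which is what makes community-aware concurrent scheduling worthwhile.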

Keywords: cross layer network topology, concurrent scheduling, modularity value, network communities, weighted load balancing

Procedia PDF Downloads 250
833 Identification of Shocks from Unconventional Monetary Policy Measures

Authors: Margarita Grushanina

Abstract:

After several prominent central banks, including the European Central Bank (ECB), the Federal Reserve System (Fed), the Bank of Japan and the Bank of England, employed unconventional monetary policies in the aftermath of the financial crisis of 2008-2009, the identification of the effects of such policies became of great interest. One of the main difficulties in identifying shocks from unconventional monetary policy measures in structural VAR analysis is that they are often anticipated, which leads to a non-fundamental MA representation of the VAR model. Moreover, unconventional monetary policy actions may indirectly transmit to markets information about the future stance of the interest rate, which calls into question the plausibility of the assumption of orthogonality between shocks from unconventional and conventional policy measures. This paper offers a method of identification that takes these issues into account. The author uses factor-augmented VARs to enlarge the information set, identification through heteroskedasticity of the error terms, and rank restrictions on the errors' second-moment matrix to deal with cross-correlation of the structural shocks.
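The identification-through-heteroskedasticity idea can be stated in its standard two-regime (Rigobon-style) form; this is a sketch of the general setup, not the paper's exact specification.

```latex
% Reduced-form errors u_t are linear combinations of structural shocks
% \varepsilon_t whose variances shift across volatility regimes r = 1, 2:
\[
u_t = B\,\varepsilon_t, \qquad
\Sigma_1 = B \Lambda_1 B', \qquad
\Sigma_2 = B \Lambda_2 B',
\]
% where \Sigma_r = E[u_t u_t' \mid t \in r] is the reduced-form covariance in
% regime r and \Lambda_r is the diagonal covariance of the structural shocks.
% If the diagonal elements of \Lambda_2 \Lambda_1^{-1} are pairwise distinct,
% B is identified up to column scale and permutation, without the usual
% recursive (Cholesky) ordering assumptions.
```

The regime shift thus supplies the extra moment conditions that a single homoskedastic covariance matrix cannot, which is what allows relaxing the orthogonality and ordering assumptions discussed above.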

Keywords: factor-augmented VARs, identification through heteroskedasticity, monetary policy, structural VARs

Procedia PDF Downloads 336
832 Mechanistic Modelling to De-risk Process Scale-up

Authors: Edwin Cartledge, Jack Clark, Mazaher Molaei-Chalchooghi

Abstract:

The mixing in the crystallization step of active pharmaceutical ingredient manufacture was studied via advanced modelling tools to enable a successful scale-up. A virtual representation of the vessel was created, and computational fluid dynamics was used to simulate the multiphase flow and, thus, the mixing environment within the vessel. The study identified a significant dead zone underneath the impeller and found that increasing the impeller speed and power did not improve the mixing. A series of sensitivity analyses showed that the vessel had to be redesigned to improve mixing, and that optimal mixing could be obtained by adding two extra cylindrical baffles. The same two baffles from the simulated environment were then constructed and added to the process vessel. By identifying these potential issues before starting manufacture and modifying the vessel to ensure good mixing, this study averted a failed crystallization and potential batch disposal, which could have resulted in a significant loss of high-value material.

Keywords: active pharmaceutical ingredient, baffles, computational fluid dynamics, mixing, modelling

Procedia PDF Downloads 85
831 Quasi–Periodicity of Tonic Intervals in Octave and Innovation of Themes in Music Compositions

Authors: R. C. Tyagi

Abstract:

The quasi-periodicity of frequency intervals observed in the Shruti-based absolute scale of music is used to graphically identify the anchor notes 'Vadi' and 'Samvadi', which are nodal points for the expansion, elaboration and iteration of the emotional theme represented by the characteristic tonic arrangement in Raga compositions. This analysis leads to defining the tonic parameters in the octave, including the key-note frequency, the tonic intervals' anchor notes, and the onset and range of quasi-periodicities as exponents of 2. Such uniformity in representing the characteristic data would facilitate computational analysis and synthesis of music compositions and also help develop noise suppression techniques. Criteria for tuning strings for compatibility with the placement of frets on fingerboards are discussed. Natural rhythmic cycles in music compositions are analytically shown to lie between 3 and 126 beats.
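Expressing intervals as exponents of 2 can be illustrated under a 22-shruti equal-division approximation of the octave. This is a deliberate simplification (the traditional shrutis are unequal, just-intonation intervals), used here only to show the exponent-of-2 representation.

```python
# Under a 22-equal-division approximation, every interval above the key-note
# is 2**(k/22) for shruti index k, so the octave closes exactly at 2**1.

def shruti_frequency(key_note_hz, k, divisions=22):
    """Frequency of the k-th shruti step above the key-note."""
    return key_note_hz * 2 ** (k / divisions)

sa = 240.0                        # illustrative key-note frequency in Hz
octave = shruti_frequency(sa, 22)
fifth = shruti_frequency(sa, 13)  # 13 steps approximate the perfect fifth 3/2
print(f"octave: {octave:.1f} Hz, fifth: {fifth:.2f} Hz (exact 3/2: {sa * 1.5:.1f})")
```

Representing every tonic interval by a single exponent of 2 in this way is what makes the scale data uniform enough for the computational analysis and synthesis the abstract envisions.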

Keywords: absolute scale, anchor notes, computational analysis, frets, innovation, noise suppression, quasi-periodicity, rhythmic cycle, tonic interval, Shruti

Procedia PDF Downloads 293