Search results for: modifiable areal unit problem (MAUP)

5338 A Study of NT-ProBNP and ETCO2 in Patients Presenting with Acute Dyspnoea

Authors: Dipti Chand, Riya Saboo

Abstract:

OBJECTIVES: Early and correct diagnosis may present a significant clinical challenge in patients presenting to the Emergency Department with acute dyspnoea. The common causes of acute dyspnoea and respiratory distress in the Emergency Department are Decompensated Heart Failure (HF), Chronic Obstructive Pulmonary Disease (COPD), asthma, pneumonia, Acute Respiratory Distress Syndrome (ARDS), Pulmonary Embolism (PE), and other causes such as anaemia. The aim of the study was to measure NT-pro Brain Natriuretic Peptide (NT-proBNP) and exhaled End-Tidal Carbon Dioxide (ETCO2) in patients presenting with dyspnoea. MATERIAL AND METHODS: This prospective, cross-sectional and observational study was performed at the Government Medical College and Hospital, Nagpur, between October 2019 and October 2021 in patients admitted to the Medicine Intensive Care Unit. Three groups of patients were compared: (1) an HF-related acute dyspnoea group (n = 52), (2) a pulmonary (COPD/PE)-related acute dyspnoea group (n = 31) and (3) a sepsis with ARDS-related dyspnoea group (n = 13). All patients underwent initial clinical examination with a recording of initial vital parameters along with on-admission ETCO2 measurement, NT-proBNP testing, arterial blood gas analysis, lung ultrasound examination, 2D echocardiography, chest X-rays, and other relevant diagnostic laboratory testing. RESULTS: 96 patients were included in the study. Median NT-proBNP was highest in the Heart Failure group (11,480 pg/ml), followed by the sepsis group (780 pg/ml) and the pulmonary group (231 pg/ml). The mean ETCO2 value was highest in the pulmonary group (48.610 mmHg), followed by the Heart Failure group (31.51 mmHg) and the sepsis group (19.46 mmHg). The results were statistically significant (P < 0.05). CONCLUSION: NT-proBNP has high diagnostic accuracy in differentiating acute HF-related dyspnoea from pulmonary (COPD and ARDS)-related acute dyspnoea. Higher levels of ETCO2 help in diagnosing patients with COPD.

Keywords: NT-proBNP, ETCO2, dyspnoea, lung USG

Procedia PDF Downloads 76
5337 Improved Color-Based K-Mean Algorithm for Clustering of Satellite Image

Authors: Sangeeta Yadav, Mantosh Biswas

Abstract:

In this paper, we propose an improved color-based K-means algorithm for clustering of satellite (SAR) images. Our method comprises two stages. The first stage is an interactive selection process in which users input the number of colors (ncolor) and the number of clusters, and are then prompted to select the points in each color cluster. In the second stage, these points are given as input to the K-means clustering algorithm, which clusters the image based on color and minimum squared Euclidean distance. The proposed method reduces the mixed pixel problem to a great extent.
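
A minimal sketch of the two-stage idea described above, assuming the image is available as a NumPy array and the seed pixels come from the interactive selection stage; the array shapes, seed coordinates, and iteration count are illustrative placeholders rather than the authors' implementation:

```python
import numpy as np

def kmeans_color_clustering(image_rgb, seed_points, n_iter=20):
    """Cluster an image by color, seeded with user-selected points.

    image_rgb   : (H, W, 3) float array of pixel colors
    seed_points : list of (row, col) coordinates, one per cluster,
                  taken from the interactive selection stage
    """
    pixels = image_rgb.reshape(-1, 3)
    # Initialise one centroid per user-selected point (output of stage 1).
    centroids = np.array([image_rgb[r, c] for r, c in seed_points], dtype=float)

    for _ in range(n_iter):
        # Assign each pixel to the nearest centroid by squared Euclidean distance.
        d2 = ((pixels[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update each centroid as the mean color of its assigned pixels.
        for k in range(len(centroids)):
            members = pixels[labels == k]
            if len(members) > 0:
                centroids[k] = members.mean(axis=0)

    return labels.reshape(image_rgb.shape[:2]), centroids

# Example with a random image and two hypothetical user-selected seed pixels.
img = np.random.rand(64, 64, 3)
labels, centers = kmeans_color_clustering(img, seed_points=[(5, 5), (40, 40)])
```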

Keywords: cluster, ncolor method, K-mean method, interactive selection process

Procedia PDF Downloads 297
5336 Isogeometric Topology Optimization in Cracked Structures Design

Authors: Dongkyu Lee, Thanh Banh Thien, Soomi Shin

Abstract:

In the present study, isogeometric topology optimization is proposed for cracked structures, using Solid Isotropic Material with Penalization (SIMP) as the design model. Design density variables defined in the variable space are used to approximate the element analysis density by the bivariate B-spline basis functions. The minimum structural compliance topology optimization problem is solved with an alternating active-phase algorithm in its Gauss-Seidel version as the optimality-criteria optimization model. Stiffness and adjoint sensitivity formulations linked to the strain energy of the cracked structure are proposed in terms of the design density variables. Numerical examples demonstrate the interaction of topology optimization with the design of structures with cracks.
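
For reference, a standard SIMP minimum-compliance formulation of the kind referred to above (a generic statement, not the authors' exact isogeometric formulation) is

\[
\min_{\rho}\; c(\rho) = \mathbf{F}^{\mathsf T}\mathbf{u}(\rho)
\quad \text{s.t.} \quad
\mathbf{K}(\rho)\,\mathbf{u} = \mathbf{F},\qquad
\sum_{e} \rho_e\, v_e \le V_{\max},\qquad
0 < \rho_{\min} \le \rho_e \le 1,
\]

with the penalized stiffness interpolation \(E_e(\rho_e) = \rho_e^{\,p} E_0\) (typically \(p = 3\)); in the isogeometric setting the element densities \(\rho_e\) are approximated from the design density variables through the B-spline basis functions.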

Keywords: topology optimization, isogeometric, NURBS, design

Procedia PDF Downloads 492
5335 An Efficient Proxy Signature Scheme Over a Secure Communications Network

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

A proxy signature scheme permits an original signer to delegate his/her signing capability to a proxy signer, who then generates a signature on a message on behalf of the original signer. The two parties must be able to authenticate one another and agree on a secret encryption key in order to communicate securely over an unreliable public network. Authenticated key agreement protocols play an important role in building a secure communications network between the two parties. In this paper, we present a secure proxy signature scheme built over an efficient and secure authenticated key agreement protocol based on the discrete logarithm problem.
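
As an illustration of the discrete-logarithm setting, the following is a toy sketch of a textbook warrant-based delegation step (in the spirit of Mambo-Usuda-Okamoto style schemes, not the authors' exact protocol); the parameters are deliberately tiny and insecure:

```python
import hashlib

# Toy public parameters (illustration only): prime p, prime q dividing p-1, g of order q.
p, q, g = 23, 11, 4          # 4 has multiplicative order 11 modulo 23

def H(*parts):
    """Hash the warrant and commitment to an exponent modulo q."""
    data = "|".join(str(x) for x in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

# Original signer's long-term key pair.
x = 7                         # secret key
y = pow(g, x, p)              # public key

# Delegation: the signer commits to a random nonce k and binds it to the warrant w.
w = "proxy may sign on behalf of the original signer"   # hypothetical warrant text
k = 5                         # random nonce in [1, q-1]
K = pow(g, k, p)
sigma = (x * H(w, K) + k) % q # delegation value sent to the proxy together with (w, K)

# The proxy (or any verifier) checks the delegation: g^sigma == y^H(w,K) * K (mod p).
assert pow(g, sigma, p) == (pow(y, H(w, K), p) * K) % p
print("delegation verified; proxy signing key =", sigma)
```

The security of such a delegation rests on the discrete logarithm problem: recovering x or k from y, K and sigma is as hard as computing discrete logarithms in the group generated by g.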

Keywords: proxy signature, warrant partial delegation, key agreement, discrete logarithm

Procedia PDF Downloads 345
5334 Die Design for Flashless Forging of a Polymer Insulator Fitting

Authors: Pedram Khazaie, Sajjad Moein

Abstract:

In conventional hot forging of the tongue, a fitting for polymer insulators, the material wasted to flash accounts for 20-30% of the workpiece. In order to reduce the cost of forged products, this waste material must be minimized. In this study, a flashless forging die is designed and simulated using the finite element method (FEM). A solution to avoid overloading the die with a simple preform is also presented. Moreover, since in flashless forging a burr forms on the edge of the workpiece, a controlled flash forging method is proposed to solve this problem. The simulation results have been validated by experiments, achieving close agreement between simulated and experimental data. It was shown that numerical modeling is helpful in reducing cost and time in the manufacturing process.

Keywords: burr formation, die design, finite element method, flashless forging

Procedia PDF Downloads 158
5333 On the Representation of Actuator Faults Diagnosis and Systems Invertibility

Authors: F. Sallem, B. Dahhou, A. Kamoun

Abstract:

In this work, the main problem considered is the detection and isolation of actuator faults. A new formulation of the linear system is derived to obtain the conditions for actuator fault detection and isolation. The proposed method is based on representing the actuator as a subsystem connected with the process system in a cascade manner. Detectability conditions are expressed in terms of invertibility notions. An example and a comparative analysis with the classic formulation illustrate the performance of this approach for simple actuator fault diagnosis using a linear model of a nuclear reactor.
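
A generic cascade representation of the kind described, with the actuator modelled as a subsystem feeding the process (an illustrative form, not the authors' exact formulation), is

\[
\text{actuator: } \dot{x}_a = A_a x_a + B_a (u + f_a),\quad u_p = C_a x_a,
\qquad
\text{process: } \dot{x}_p = A_p x_p + B_p u_p,\quad y = C_p x_p,
\]

where \(f_a\) denotes the actuator fault; detectability and isolability of \(f_a\) are then expressed through the left invertibility of the map from the fault to the measured output \(y\).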

Keywords: actuator fault, fault detection, left invertibility, nuclear reactor, observability, parameter intervals, system inversion

Procedia PDF Downloads 405
5332 Development Framework Based on Mobile Augmented Reality for Pre-Literacy Kit

Authors: Nazatul Aini Abd Majid, Faridah Yunus, Haslina Arshad, Mohammad Farhan Mohammad Johari

Abstract:

Mobile technology, augmented reality, and game-based learning are some of the key learning technologies that can be fully optimized to promote pre-literacy skills. The problem is how to design an effective pre-literacy kit that utilizes some of these learning technologies. This paper presents a framework based on mobile augmented reality for the development of a pre-literacy kit. The pre-literacy kit incorporates three main components: contents, design, and tools. A prototype of a mobile app based on the three main components was developed for promoting pre-literacy. The results show that the children and teachers gave positive feedback after using the mobile app for pre-literacy.

Keywords: framework, mobile technology, augmented reality, pre-literacy skills

Procedia PDF Downloads 595
5331 A Two Stage Stochastic Mathematical Model for the Tramp Ship Routing with Time Windows Problem

Authors: Amin Jamili

Abstract:

Nowadays, the majority of international trade in goods is carried by sea, especially by ships deployed in the industrial and tramp segments. This paper addresses routing tramp ships and determining their schedules, including the arrival times at the ports, berthing times at the ports, and the departure times, at an operational planning level. At the operational planning level the weather can be forecast almost exactly; however, on some routes some uncertainties may remain. In this paper, the voyaging times between some of the ports are considered to be uncertain. To that end, a two-stage stochastic mathematical model is proposed. Moreover, a case study is solved with the presented model. The computational results show that this mathematical model is promising and can produce acceptable solutions.
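
In generic form, a two-stage stochastic program of the kind used here (an illustrative sketch, not the authors' exact ship-routing model) reads

\[
\min_{x}\; c^{\mathsf T} x + \mathbb{E}_{\xi}\!\left[Q(x,\xi)\right]
\quad \text{s.t.} \quad Ax = b,\; x \ge 0,
\qquad
Q(x,\xi) = \min_{y \ge 0}\;\bigl\{\, q(\xi)^{\mathsf T} y \;:\; W y = h(\xi) - T(\xi)\,x \,\bigr\},
\]

where the first-stage variables \(x\) fix the routes and schedules before the uncertainty is revealed, and the recourse variables \(y\) absorb the uncertain voyaging times realized in scenario \(\xi\).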

Keywords: routing, scheduling, tramp ships, two stage stochastic model, uncertainty

Procedia PDF Downloads 436
5330 Dwelling in the Built Environment: The Resilience by Design in Modular Thinking toward an Adaptive Alternatives

Authors: Tzen-Ying Ling

Abstract:

Recently, the resilience of dwellings in urban areas has been deliberated, in order to accommodate the growing demands of changing demography and rapid urbanization. The need to incorporate sustainability and cleaner production thinking has intensified to mitigate climate risks and satisfy the demand for housing. Modular thinking satisfies both the pressing call for fast-tracked housing stock and the goal of more sustainable production. On the other side, the importance of the dwelling as a platform for well-being and social connectedness motivates the exploration of the key human/environment design thinking for modular dwelling systems. We argue that best practice incorporates the concept of systemic component thinking. The fieldwork reported in this paper illustrates a case study of the development of a modular dwelling unit prototype, focusing on the systemic frame system design process and the adjustment recommendations that follow. Using a case study method, the study identified that: (1) inclusive human-dimensional factoring through systemic design thinking results in affordable implementation possibilities; (2) the environmental dimension encourages place-based solutions suited to the locality and to the increasing demand for dwellings in the urban system; (3) prototype design considerations establish the modular system component as a dwelling construction alternative; (4) building codes often act as an inhibitor for such dwelling units through restrictions on lot sizes and unit placement. The demand for fast-track dwelling construction and cleaner production decisively outweighs the code inhibition; we further underscore the sustainability implications of the alternative prototype as the core of this study. The research suggests that modular thinking results in a resilient solution suited to the locality and to the increasing demand for dwellings in the urban system.

Keywords: system prototype, urban resilience, human/environment dimension, modular thinking, dwelling alternative

Procedia PDF Downloads 174
5329 Intelligent Ambulance with Advance Features of Traffic Management and Telecommunication

Authors: Mamatha M. N.

Abstract:

Traffic congestion and flow management are recognized as major problems in almost all areas, and they cause difficulty for ambulances carrying emergency patients. The work proposed in this paper aims at the development of an ambulance that reaches the nearby hospital faster even in a heavy traffic scenario. This is achieved by implementing hardware in the ambulance as well as at the traffic posts, allowing the ambulance a smooth flow so that it reaches the hospital in time. The system comprises: (1) a vehicle design that provides communication between the ambulance and the traffic posts; (2) an electronic health record with a data-acquisition system; and (3) telemetry of the acquired biological parameters to the nearest hospital. By interfacing these three modules and integrating them on the ambulance, the hospital can be reached earlier than with the present ambulance. The system achieves an accuracy and efficiency of 99.8%.

Keywords: bio-telemetry, data acquisition, patient database, automatic traffic control

Procedia PDF Downloads 315
5328 Behavior Consistency Analysis for Workflow Nets Based on Branching Processes

Authors: Wang Mimi, Jiang Changjun, Liu Guanjun, Fang Xianwen

Abstract:

Loop structures often appear in business process modeling, and analyzing the consistency of the corresponding workflow net models containing loop structures is a problem: the existing behavior consistency methods cannot effectively analyze process models with loop structures. In this paper, by analyzing five kinds of behavior relations between transitions, a three-dimensional figure and a two-dimensional behavior relation matrix are proposed. Based on these, an analysis method for the behavior consistency of business processes based on Petri net branching processes is proposed. Finally, an example is given that shows the method is effective.

Keywords: workflow net, behavior consistency measures, loop, branching process

Procedia PDF Downloads 388
5327 Process of Revitalization of the City Centres in Poland: The Problem of Cooperation between Sectors

Authors: Ewa M. Boryczka

Abstract:

The contemporary city is subject to rapid economic and social changes. Therefore it requires an active policy designed to meet the diverse needs of its residents and to build a competitive position and the capacity to compete with other cities. The competitiveness of cities depends largely on their resources, but also, to a large extent, on the policies and performance of local authorities. Cooperation with the private and social sectors also plays an important role, as it affects the use of resources and builds an advantage over other cities. The subject of this article is the contemporary development problems of cities, with particular emphasis on central areas. This issue is a starting point for reflection on the process of urban regeneration in medium-sized cities in Poland, as well as on cooperation between various actors and their roles in the revitalization processes of Polish city centres.

Keywords: city, cooperation between sectors, crisis of city centres, revitalization

Procedia PDF Downloads 447
5326 Implicit Off-Grid Block Method for Solving Fourth and Fifth Order Ordinary Differential Equations Directly

Authors: Olusola Ezekiel Abolarin, Gift E. Noah

Abstract:

This research work considers an innovative procedure for numerically approximating higher-order initial value problems (IVPs) of ordinary differential equations (ODEs) using Legendre polynomials as the basis functions. The proposed method is a half-step, self-starting block integrator employed to approximate fourth- and fifth-order IVPs without reduction to lower order. The method was developed through a collocation and interpolation approach. The basic properties of the method, such as convergence, consistency and stability, were well investigated. Several test problems were considered, and the results compared favorably with both exact solutions and other existing methods.
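
As a sketch of the setting (illustrative notation, not the authors' exact derivation), the fourth-order problem treated directly has the form

\[
y^{(4)}(x) = f\bigl(x, y, y', y'', y'''\bigr),\qquad
y(x_0)=\eta_0,\; y'(x_0)=\eta_1,\; y''(x_0)=\eta_2,\; y'''(x_0)=\eta_3,
\]

and the trial solution \(y(x) \approx \sum_{j=0}^{m} a_j P_j(x)\), built from Legendre polynomials \(P_j\), is fixed by interpolating at selected grid points and collocating the differential equation at the grid and off-grid (half-step) points; the resulting block of discrete formulas is then solved simultaneously, without rewriting the equation as a first-order system.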

Keywords: initial value problem, ordinary differential equation, implicit off-grid block method, collocation, interpolation

Procedia PDF Downloads 84
5325 Learning to Translate by Learning to Communicate to an Entailment Classifier

Authors: Szymon Rutkowski, Tomasz Korbak

Abstract:

We present a reinforcement-learning-based method of training neural machine translation models without parallel corpora. The standard encoder-decoder approach to machine translation suffers from two problems we aim to address. First, it needs parallel corpora, which are scarce, especially for low-resource languages. Second, it lacks psychological plausibility of the learning procedure: learning a foreign language is about learning to communicate useful information, not merely learning to transduce from one language’s 'encoding' to another. We instead pose the problem of learning to translate as learning a policy in a communication game between two agents: the translator and the classifier. The classifier is trained beforehand on a natural language inference task (determining the entailment relation between a premise and a hypothesis) in the target language. The translator produces a sequence of actions that correspond to generating translations of both the hypothesis and the premise, which are then passed to the classifier. The translator is rewarded for the classifier’s performance on determining entailment between the sentences translated by the translator into the classifier’s native language. The translator’s performance thus reflects its ability to communicate useful information to the classifier. In effect, we train a machine translation model without the need for parallel corpora altogether. While similar reinforcement learning formulations for zero-shot translation have been proposed before, there are a number of improvements we introduce. While prior research aimed at grounding the translation task in the physical world by evaluating agents on an image captioning task, we found that using a linguistic task is more sample-efficient. Natural language inference (also known as recognizing textual entailment) captures semantic properties of sentence pairs that are poorly correlated with semantic similarity, thus enforcing a basic understanding of the role played by compositionality. It has been shown that models trained to recognize textual entailment produce high-quality general-purpose sentence embeddings transferrable to other tasks. We use the Stanford Natural Language Inference (SNLI) dataset as well as its analogous datasets for French (XNLI) and Polish (CDSCorpus). Textual entailment corpora can be obtained relatively easily for any language, which makes our approach more extensible to low-resource languages than traditional approaches based on parallel corpora. We evaluated a number of reinforcement learning algorithms (including policy gradients and actor-critic) to solve the problem of the translator’s policy optimization and found that our attempts yield some promising improvements over previous approaches to reinforcement-learning-based zero-shot machine translation.
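
A minimal sketch of the policy-gradient step described above, assuming a sampled translation with per-token log-probabilities and a scalar reward derived from the entailment classifier’s performance; the tensors below are random placeholders, not the authors’ models:

```python
import torch

def reinforce_loss(token_log_probs, reward, baseline=0.0):
    """REINFORCE loss for one sampled translation.

    token_log_probs : 1-D tensor of log-probabilities of the sampled target tokens
    reward          : scalar reward, e.g. the entailment classifier's score on the
                      translated (premise, hypothesis) pair
    baseline        : optional variance-reduction baseline (e.g. a running mean reward)
    """
    advantage = reward - baseline
    return -advantage * token_log_probs.sum()

# Toy usage with placeholder values standing in for the translator and classifier.
log_probs = torch.log(torch.rand(12, requires_grad=True))  # sampled-token log-probs
reward = 0.83                                               # classifier-based reward
loss = reinforce_loss(log_probs, reward, baseline=0.5)
loss.backward()                                             # gradients for the translator
```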

Keywords: agent-based language learning, low-resource translation, natural language inference, neural machine translation, reinforcement learning

Procedia PDF Downloads 128
5324 New Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities have been at the core of mathematical study and are used in almost all branches of mathematics as well as in various areas of science and engineering. The inequalities of Hardy, Littlewood and Polya were the first significant composition devoted to the subject; that work presents fundamental ideas, results and techniques, and it has had much influence on research in various branches of analysis. Since 1934, various inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated in terms of operators; in 1989, weighted Hardy inequalities were obtained for integration operators. Then, weighted estimates were obtained for Steklov operators that were used in the solution of the Cauchy problem for the wave equation. These were improved upon in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy–Steklov operator. Recently, many integral inequalities have been improved by differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have then been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some inequalities involving the Copson and Hardy inequalities on time scales have appeared, giving new special versions of them. A time scale is an arbitrary nonempty closed subset of the real numbers. Dynamic inequalities on time scales have received a lot of attention in the literature and have become a major field in pure and applied mathematics. There are many applications of dynamic equations on time scales to quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on Hardy and Copson inequalities, using the Steklov operator on time scales in double integrals to obtain special cases of time-scale inequalities of Hardy and Copson in higher dimensions. The advantage of this study is that it uses the one-dimensional classical Hardy inequality to obtain higher-dimensional time-scale versions that will be applied in the solution of the Cauchy problem for the wave equation. In addition, the obtained inequalities have various applications involving discontinuous domains such as bug populations, phytoremediation of metals, wound healing, and maximization problems. The proofs can be carried out by introducing restrictions on the operator in several cases. Concepts of the time-scale calculus will be used, which allow many problems from the theories of differential and difference equations to be unified and extended; in addition, the chain rule, some properties of multiple integrals on time scales, Fubini-type theorems and the Hölder inequality will be employed.
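
For orientation, the classical one-dimensional Hardy inequality that serves as the starting point reads, for \(p > 1\) and non-negative measurable \(f\),

\[
\int_0^{\infty} \left(\frac{1}{x}\int_0^{x} f(t)\,dt\right)^{p} dx
\;\le\; \left(\frac{p}{p-1}\right)^{p} \int_0^{\infty} f(x)^{p}\,dx,
\]

and it is this inequality that the study extends, via the Steklov operator, to double integrals on time scales.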

Keywords: time scales, inequality of Hardy, inequality of Copson, Steklov operator

Procedia PDF Downloads 95
5323 Numerical Analysis and Parametric Study of Granular Anchor Pile on Expansive Soil Using Finite Element Method: Case of Addis Ababa, Bole Sub-City

Authors: Abdurahman Anwar Shfa

Abstract:

Addis Ababa is among the fastest-growing urban areas in the country. There are many new constructions of public and private condominiums and large new low-rise residential buildings for residents. However, the wide range of heave problems caused by expansive soil in the city has become a major difficulty for the construction sector, especially for low-rise buildings, causing problems such as distortion and cracking of floor slabs, cracks in grade beams and walls, jammed or misaligned doors and windows, and failure of blocks supporting grade beams. Hence an attractive and economical design solution may be required for this type of problem. Therefore, this research presents a recent innovation called the granular anchor pile system for the reduction of the heave effect of expansive soil. The objective of this research is a numerical investigation of the behavior of the granular anchor pile under heave using the finite element analysis program PLAXIS 3D, by studying the effect of different parameters, such as pile length, pile diameter, and pile grouping, through applying a prescribed displacement of 10% of the pile diameter at the center of the granular pile anchor. An additional objective is to examine the suitability of the granular anchor pile as an alternative solution for heave problems in expansive soils, mostly for low-rise buildings in Addis Ababa City, especially in Bole Sub-City, by considering different factors such as the local availability of construction materials, construction economy, installation conditions, environmental benefit, time consumption and pile performance. Accordingly, the performance of the pile improves when the length of the pile increases; this is due to an increase in the self-weight of the pile and the friction mobilized at the pile-soil interface. Additionally, the uplift capacity of the pile decreases when increasing the pile diameter and the spacing between the piles in a group, due to a reduction in the number of piles in the group. However, a few cases show that the uplift capacity of the pile increases with increasing pile diameter for a constant number of piles in the group, with increasing spacing between the piles, and in the case of single pile capacity; this is due to the increase in the piles' self-weight and the surface area of the pile group, and also to the decrease in stress overlap in the soil caused by the piles, respectively. According to the suitability analysis, it is observed that the granular anchor pile is sensible and practical to apply to the actual problem of expansive soil in low-rise buildings constructed in the country because of its convenience with respect to all of the considerations.

Keywords: expansive soil, granular anchor pile, PLAXIS, suitability analysis

Procedia PDF Downloads 35
5322 A Watermarking Signature Scheme with Hidden Watermarks and Constraint Functions in the Symmetric Key Setting

Authors: Yanmin Zhao, Siu Ming Yiu

Abstract:

To claim ownership of an executable program is a non-trivial task. An emerging direction is to add a watermark to the program such that the watermarked program preserves the original program’s functionality and removing the watermark would heavily destroy the functionality of the watermarked program. In this paper, the first watermarking signature scheme with the watermark and the constraint function hidden in the symmetric-key setting is constructed. The scheme uses the well-known techniques of lattice trapdoors and lattice evaluation. The watermarking signature scheme is unforgeable under the Short Integer Solution (SIS) assumption and satisfies other security requirements such as the unremovability property.
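
For reference, the underlying hardness assumption can be stated in its standard form (generic parameter names, not specific to this scheme): the SIS problem \(\mathrm{SIS}_{n,m,q,\beta}\) asks, given a uniformly random matrix \(A \in \mathbb{Z}_q^{\,n \times m}\), to find a nonzero vector \(z \in \mathbb{Z}^{m}\) with

\[
A z \equiv 0 \pmod{q} \qquad \text{and} \qquad \lVert z \rVert \le \beta .
\]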

Keywords: short integer solution (SIS) problem, symmetric-key setting, watermarking schemes, watermarked signatures

Procedia PDF Downloads 133
5321 Cotton Fiber Quality Improvement by Introducing Sucrose Synthase (SuS) Gene into Gossypium hirsutum L.

Authors: Ahmad Ali Shahid, Mukhtar Ahmed

Abstract:

The demand for long-staple fiber with better strength and length is increasing with the introduction of the modern spinning and weaving industry in Pakistan. Work on gene discovery from developing cotton fibers has helped to identify dozens of genes that take part in cotton fiber development, and several of these genes have been characterized for their role in fiber development. Sucrose synthase (SuS) is a key enzyme in the metabolism of sucrose in a plant cell; in cotton fiber it catalyzes a reversible reaction but preferentially converts sucrose and UDP into fructose and UDP-glucose. UDP-glucose (UDPG) is a nucleotide sugar that acts as a donor of glucose residues in many glycosylation reactions, is essential for the cytosolic formation of sucrose, and is involved in the synthesis of cell wall cellulose. The study focused on successful Agrobacterium-mediated stable transformation of the SuS gene in pCAMBIA 1301 into cotton under a CaMV35S promoter. Integration and expression of the gene were confirmed by PCR, GUS assay, and real-time PCR. Young leaves of SuS-overexpressing lines showed increased total soluble sugars and plant biomass compared to non-transgenic control plants. The cellulose content of the fiber was significantly increased. SEM analysis revealed that fibers from transgenic cotton were highly spiral and that the fiber twist number per unit length increased when compared with the control. Morphological data from field plants showed that transgenic plants performed better under field conditions. Incorporation of genes related to cotton fiber length and quality can provide new avenues for fiber improvement. The utilization of this technology would provide efficient import substitution and sustained production of long-staple fiber in Pakistan to fulfill the industrial requirements.

Keywords: agrobacterium-mediated transformation, cotton fiber, sucrose synthase gene, staple length

Procedia PDF Downloads 233
5320 An Exact Algorithm for Location–Transportation Problems in Humanitarian Relief

Authors: Chansiri Singhtaun

Abstract:

This paper proposes a mathematical model and examines the performance of an exact algorithm for a location–transportation problem in humanitarian relief. The model determines the number and location of distribution centers in a relief network, the amount of relief supplies to be stocked at each distribution center, and the vehicles that take the supplies to meet the needs of disaster victims under capacity restrictions and transportation and budgetary constraints. Computational experiments are conducted on generated problems of various sizes, and a branch and bound algorithm is applied to them. The results show that this algorithm can solve problem sizes of up to three candidate locations with five demand points, and one candidate location with up to twenty demand points, without premature termination.
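
A generic location–transportation formulation of the kind solved here (an illustrative sketch; the authors' model additionally handles relief stock levels and budget limits) is

\[
\min \; \sum_{j} f_j\, y_j + \sum_{i}\sum_{j} c_{ij}\, x_{ij}
\quad\text{s.t.}\quad
\sum_{j} x_{ij} \ge d_i \;\;\forall i,\qquad
\sum_{i} x_{ij} \le K_j\, y_j \;\;\forall j,\qquad
y_j \in \{0,1\},\; x_{ij} \ge 0,
\]

where \(y_j\) decides whether distribution center \(j\) is opened, \(x_{ij}\) is the amount shipped from center \(j\) to demand point \(i\), \(f_j\) and \(c_{ij}\) are fixed and transportation costs, \(d_i\) is the demand and \(K_j\) the capacity; branch and bound then enumerates the binary \(y_j\) decisions.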

Keywords: disaster response, facility location, humanitarian relief, transportation

Procedia PDF Downloads 451
5319 Experimental and Numerical Studies on Hydrogen Behavior in a Small-Scale Container with Passive Autocatalytic Recombiner

Authors: Kazuyuki Takase, Yoshihisa Hiraki, Gaku Takase, Isamu Kudo

Abstract:

One of the most important issues is to ensure the safety of long-term waste storage containers in which fuel debris and radioactive materials are accumulated. In this case, hydrogen generated by the radiation-induced decomposition of water accumulates in the container over a long period of time, so it is necessary to reduce the concentration of hydrogen in the container. In addition, it is required that no power supply from outside the container be necessary. Radioactive waste storage containers with a passive autocatalytic recombiner (PAR) would therefore be effective. A radioactive waste storage container with PAR was used for moving the fuel debris of Three Mile Island Unit 2 to its storage location; however, the effect of the PAR is not described in detail, and the reduction of the hydrogen concentration during the long-term storage period was achieved by a venting system installed on the top of the container. Therefore, development of a long-term storage container with PAR was started with the aim of safely storing fuel debris recovered at the Fukushima Daiichi Nuclear Power Plant for a long period of time. A fundamental experiment on reducing the concentration of hydrogen generated in a long-term nuclear waste storage container was carried out using a small-scale container with PAR. Moreover, the circulation flow behavior of hydrogen in the small-scale container resulting from natural convection driven by the decay heat was clarified. In addition, preliminary numerical analyses were performed to predict the experimental results regarding the circulation flow behavior and the reduction of the hydrogen concentration in the small-scale container. From the results of the present study, the effectiveness of the container with PAR in reducing the hydrogen concentration was experimentally confirmed. In addition, it was predicted numerically that the circulation flow of hydrogen in the small-scale container is blocked by steam generated by the chemical reaction of hydrogen and oxygen.

Keywords: hydrogen behavior, reduction of concentration, long-term storage container, small-scale, PAR, experiment, analysis

Procedia PDF Downloads 163
5318 Multiple Relaxation Times in the Gibbs Ensemble Monte Carlo Simulation of Phase Separation

Authors: Bina Kumari, Subir K. Sarkar, Pradipta Bandyopadhyay

Abstract:

The autocorrelation function of the density fluctuation is studied in each of the two phases in a Gibbs Ensemble Monte Carlo (GEMC) simulation of the problem of phase separation for a square well potential with various values of its range. We find that the normalized autocorrelation function is described very well as a linear combination of an exponential function with a time scale τ₂ and a stretched exponential function with a time scale τ₁ and an exponent α. Dependence of (α, τ₁, τ₂) on the parameters of the GEMC algorithm and the range of the square well potential is investigated and interpreted. We also analyse the issue of how to choose the parameters of the GEMC simulation optimally.
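
Written out explicitly, the fitted form described above is

\[
C(t) \;\approx\; A\, e^{-t/\tau_2} \;+\; (1-A)\, \exp\!\bigl[-\,(t/\tau_1)^{\alpha}\bigr],
\]

where \(C(t)\) is the normalized autocorrelation function of the density fluctuation; the weight \(A\) of the purely exponential component is our notation, since the abstract specifies only the two time scales and the stretching exponent.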

Keywords: autocorrelation function, density fluctuation, GEMC, simulation

Procedia PDF Downloads 189
5317 Polycode Texts in Communication of Antisocial Groups: Functional and Pragmatic Aspects

Authors: Ivan Potapov

Abstract:

Background: The aim of this paper is to investigate polycode texts in the communication of youth antisocial groups. Nowadays, the notion of a text has numerous interpretations. Among all the approaches to defining a text, we must take into account semiotic and cultural-semiotic ones. Rapidly developing IT, world globalization, and new ways of coding information increase the role of the cultural-semiotic approach. However, the development of computer technologies also leads to changes in the text itself. Polycode texts play a more and more important role in the everyday communication of the younger generation. Therefore, research on the functional and pragmatic aspects of both verbal and non-verbal content is quite important. Methods and Material: For this survey, we applied a combination of four methods of text investigation: not only intention and content analysis but also semantic and syntactic analysis. Using these methods provided us with information on general text properties, the content of the transmitted messages, and each communicant’s intentions. In addition, during our research we established the social background, which allowed us to distinguish intertextual connections between certain types of polycode texts. As the sources of the research material, we used 20 public channels in the popular messenger Telegram and data extracted from smartphones that belonged to arrested members of antisocial groups. Findings: This investigation lets us assert that polycode texts can be characterized as highly intertextual language units. Moreover, we could outline a classification of these texts based on the communicants’ intentions. The most common types of antisocial polycode texts are calls to illegal actions and agitation. What is more, each type has its own semantic core, which depends on the sphere of communication. However, the syntactic structure is universal for most of the polycode texts. Conclusion: Polycode texts play an important role in online communication. The results of this investigation demonstrate that in some social groups the use of these texts has a destructive influence on the younger generation and clearly needs further research.

Keywords: text, polycode text, internet linguistics, text analysis, context, semiotics, sociolinguistics

Procedia PDF Downloads 132
5316 Entropy Generation of Unsteady Reactive Hydromagnetic Generalized Couette Fluid Flow of a Two-Step Exothermic Chemical Reaction Through a Channel

Authors: Rasaq Kareem, Jacob Gbadeyan

Abstract:

In this study, the entropy generation of an unsteady reactive hydromagnetic generalized Couette fluid flow with a two-step exothermic chemical reaction through a channel with isothermal wall temperature was analyzed under the influence of different chemical kinetics, namely sensitized, Arrhenius and bimolecular kinetics. The modelled nonlinear dimensionless equations governing the fluid flow were simplified and solved using the combined Laplace Differential Transform Method (LDTM). The effects of the fluid parameters associated with the problem on the fluid temperature, entropy generation rate and Bejan number were discussed and presented through graphs.

Keywords: couette, entropy, exothermic, unsteady

Procedia PDF Downloads 515
5315 Inpatient Neonatal Deaths in Rural Uganda: A Retrospective Comparative Mortality Study of Labour Ward versus Community Admissions

Authors: Najade Sheriff, Malaz Elsaddig, Kevin Jones

Abstract:

Background: Death in the first month of life accounts for an increasing proportion of under-five mortality. Progress to reduce this number is being made across the globe; however, it is slowest in sub-Saharan Africa. Objectives: The study aims to identify differences between the neonatal deaths of inpatient babies born in a hospital facility in rural Uganda and those of neonates admitted from the community, and to explore whether these differences can be used to risk-stratify neonatal admissions. Results: A retrospective chart review was conducted on records for neonates admitted to the Special Care Baby Unit (SCBU) of Kitovu Hospital from 1st July 2016 to 21st July 2017. A total of 442 babies were admitted, and the overall neonatal mortality was 24.8% (40% inpatient, 37% community, 23% hospital referrals). 40% of deaths occurred within 24 hours of admission, and the majority were male (63%). 43% of babies were hypothermic upon admission, a significantly greater proportion of which were inpatient babies born on the labour ward (P=0.0025). Intrapartum-related death accounted for half of all inpatient deaths, whereas complications of prematurity were the predominant cause of death in the community group (37%). Severe infection does not appear to be as significant a factor in mortality for inpatients (2%) as it is for community admissions (29%). Furthermore, with 52.5% of community admissions weighing < 1500 g, very low birth weight (VLBW) may be a significant risk factor for community neonatal death. Conclusion: The neonatal mortality rate in this study is high, and the leading causes of death are all largely preventable. A high rate of inpatient birth asphyxia indicates the need for good-quality facility-based perinatal care, as well as a greater focus on the management of hypothermia, such as kangaroo care. Moreover, a reduction in preterm deliveries is necessary to reduce associated comorbidities, and monitoring for signs of infection is especially important for community admissions.

Keywords: community, mortality, newborn, Uganda

Procedia PDF Downloads 187
5314 Finding Optimal Solutions to Management Problems with the use of Econometric and Multiobjective Programming

Authors: M. Moradi Dalini, M. R. Talebi

Abstract:

This research revolves around a technical method that combines econometrics and multiobjective programming to select and obtain optimal solutions to management problems. It is taken for granted that it is important to analyze which combination of values of the explanatory variables, in an econometric model, would point to the simultaneous achievement of the best values of the response variables. In this case, if a certain degree of conflict is observed among the response variables, we suggest applying a multiobjective method to the results obtained from a regression analysis. In fact, with the use of a multiobjective method, we will have the best decision about the conflicting relationship between the response variables and the optimal solution. The benefit of combining multiobjective programming and econometrics is an assessment of a balanced 'optimal' situation among the response variables, because this kind of information can hardly be extracted by econometric techniques alone.

Keywords: econometrics, multiobjective optimization, management problem, optimization

Procedia PDF Downloads 82
5313 Social Business: Opportunities and Challenges

Authors: Muhammad Mustafizur Rahaman

Abstract:

Social business is a new concept in the field of business economics and the capitalist economy. It has gained importance for economic and social development in emerging economies. Professor Muhammad Yunus is the founding father of the notion. While conventional business underscores profit maximization as a core business principle, social business calls for addressing social problems at the expense of profit. This underlying principle gives social business an advantageous position over conventional businesses in serving those who live at the bottom of the pyramid. It also poses grave challenges to the social business, because the social business sacrifices profit on the one hand and seeks financial sustainability on the other. For the sake of its financial sustainability, the social business might increase the price of its product or service, which might lower its social impact and thus make the business self-defeating. Therefore, social business should be more innovative in every business process, including production, marketing, and management. Otherwise, the business is likely to be driven out of society.

Keywords: innovativeness, self-defeat, social business, social problem

Procedia PDF Downloads 619
5312 Hemispheric Locus and Gender Predict the Delay between the Moment of Stroke and Hospitalization

Authors: D. Anderlini, G. Wallis

Abstract:

Background: The number of people experiencing stroke is steadily increasing due to changes in diet and lifestyle, to longer life expectancy resulting in an older population, and to higher survival rates as a consequence of improvements in care during the acute phase. This study considers what risk factors might contribute to delayed entry to hospital for treatment. Methods: We analyzed data from 2472 patients admitted to the Stroke Unit of the Royal Brisbane Women's Hospital, Australia, between 2002 and 2011. Results: Previous studies have reported that factors which can contribute to delay include the patient’s age, the time of day, physical location, visiting the GP instead of going to the emergency department, means of transport, severity of symptoms and type of stroke. Contrary to the findings of other studies, we found a strong correlation between side of lesion and delay in admission: patients with right hemisphere lesions had an average delay of 3.78 days, while patients with left hemisphere lesions had an average delay of 1.49 days. Damage to the right hemisphere generally results in motor impairment of the non-dominant hand and no speech impediment. In contrast, left hemisphere lesions can result in deficits of dominant hand function and in aphasia, which will be noticed even if their impact on performance is relatively minor. A finding which goes against many previous studies is that women get to hospital much sooner than men, with an average delay of 0.92 days in women vs. 3.36 days in men. Conclusion: Acute surgical-pharmacological therapies are most effective if applied immediately after stroke. Hence delays in admission can be crucial to the degree of recovery. The tendency of patients to overlook symptoms of a right hemisphere lesion should be the target of information campaigns both for the general public and for GPs. Why do men go to hospital so late? We don't know yet! Nevertheless, an awareness plan specifically directed at the male population should be on the agenda of Health Departments.

Keywords: gender, admission delay, stroke location, bioinformatics, biomedicine

Procedia PDF Downloads 230
5311 Numerical Investigation of Natural Convection of Pine, Olive and Orange Leaves

Authors: Ali Reza Tahavvor, Saeed Hosseini, Nazli Jowkar, Behnam Amiri

Abstract:

The heat transfer of leaves is a crucial factor in the optimal operation of metabolic functions in plants. In order to quantify this phenomenon in different leaves and investigate the influence of leaf shape on heat transfer, natural convection from pine, orange and olive leaves was simulated as representatives of different groups of leaf shapes. CFD techniques were used in this simulation with the purpose of calculating the heat transfer of leaves under similar environmental conditions. The problem was simulated for steady-state, three-dimensional conditions. From the obtained results, it was concluded that the heat fluxes of all three leaves are almost identical; however, the total rate of heat transfer has its highest and lowest values for orange and pine leaves, respectively.

Keywords: computational fluid dynamic, heat flux, heat transfer, natural convection

Procedia PDF Downloads 362
5310 Topological Language for Classifying Linear Chord Diagrams via Intersection Graphs

Authors: Michela Quadrini

Abstract:

Chord diagrams occur in mathematics, from the study of RNA to knot theory. They are widely used in the theory of knots and links for studying finite type invariants, whereas in molecular biology one important motivation for studying chord diagrams is to deal with the problem of RNA structure prediction. An RNA molecule is a linear polymer, referred to as the backbone, that consists of four types of nucleotides. Each nucleotide is represented by a point, whereas each chord of the diagram stands for one Watson-Crick base-pair interaction between two nonconsecutive nucleotides. A chord diagram is an oriented circle with a set of n pairs of distinct points, considered up to orientation-preserving diffeomorphisms of the circle. A linear chord diagram (LCD) is a special kind of graph obtained by cutting the oriented circle of a chord diagram. It consists of a line segment, called its backbone, to which are attached a number of chords with distinct endpoints. There is a natural fattening of any linear chord diagram: the backbone lies on the real axis, while all the chords are in the upper half-plane. Each linear chord diagram has a natural genus of its associated surface. To each chord diagram and linear chord diagram it is possible to associate the intersection graph. It consists of a graph whose vertices correspond to the chords of the diagram, whereas the chord intersections are represented by connections between the vertices. Such an intersection graph carries a lot of information about the diagram. Our goal is to define an LCD equivalence class in terms of the identity of intersection graphs, on which many chord diagram invariants depend. For studying these invariants, we introduce a new representation of linear chord diagrams based on a set of appropriate topological operators that permits modeling LCDs in terms of the relations among chords. This set is composed of crossing, nesting, and concatenation. The crossing operator is able to generate the whole space of linear chord diagrams, and a multiple context-free grammar is defined that uniquely generates each LCD, starting from a linear chord diagram and adding a chord for each production of the grammar. In other words, it allows a unique algebraic term to be associated with each linear chord diagram, while the remaining operators allow the term to be rewritten through a set of appropriate rewriting rules. Such rules define an LCD equivalence class in terms of the identity of intersection graphs. Starting from a modelled RNA molecule and its linear chord diagram, some authors have proposed a topological classification and folding. Our LCD equivalence class could contribute to the RNA folding problem, leading to the definition of an algorithm that calculates the free energy of the molecule more accurately than the existing ones. Such an LCD equivalence class could also be useful for obtaining a more accurate estimate of the link between the crossing number and the topological genus and for studying the relations among other invariants.
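
A small sketch of how the intersection graph of a linear chord diagram can be computed, assuming chords are given as pairs of endpoint positions along the backbone (illustrative code, not the authors' implementation):

```python
from itertools import combinations

def intersection_graph(chords):
    """Build the intersection graph of a linear chord diagram.

    chords : list of (left, right) endpoint positions on the backbone, with left < right.
    Returns (vertices, edges): one vertex per chord and an edge whenever two chords
    cross, i.e. their endpoints interleave along the backbone.
    """
    vertices = list(range(len(chords)))
    edges = set()
    for (i, (a, b)), (j, (c, d)) in combinations(enumerate(chords), 2):
        if a < c < b < d or c < a < d < b:   # interleaving endpoints = crossing chords
            edges.add((i, j))
    return vertices, edges

# Example: chord 0 crosses chord 1, while chord 2 is nested inside chord 1.
print(intersection_graph([(0, 3), (2, 6), (4, 5)]))   # ([0, 1, 2], {(0, 1)})
```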

Keywords: chord diagrams, linear chord diagram, equivalence class, topological language

Procedia PDF Downloads 201
5309 Reliability Evidence of the Child Behavior Checklist (CBCL) Based on a Chinese Sample

Authors: Zhidong Zhang, Zhi-Chao Zhang, Georgiana Duarte

Abstract:

The Chinese version of the Child Behavior Checklist (CBCL) is one of the Achenbach System of Empirically Based Assessment (ASEBA) scales, by which the behavioral and emotional problems of early adolescents are examined. In order to further understand the robustness of the scale, its reliability has been examined. The CBCL consists of eight problem scales measuring internalizing, externalizing and social problems. The internalizing problems comprise Anxious/Depressed, Withdrawn and Somatic Complaints. In this study, as an example, we examined only the anxious aspect, which consists of 13 questions. Cronbach's alpha and factor analysis methods were used to examine the reliability of the scale. The results indicated that the Cronbach's alpha value was above 0.80.
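
A small sketch of the reliability computation used here, assuming item responses are arranged in a NumPy array with one row per respondent and one column per item (the 13 anxious-scale items); the data below are random placeholders, not the study's sample:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Placeholder data: 200 respondents answering 13 items scored 0-2 (CBCL-style).
rng = np.random.default_rng(0)
responses = rng.integers(0, 3, size=(200, 13))
print(round(cronbach_alpha(responses), 3))
```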

Keywords: anxious/depressed problems, ASEBA, CBCL, Cronbach Alpha, reliability

Procedia PDF Downloads 463