Search results for: Human Resource Management (HRM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18129


7809 Nitrogen and Potassium Fertilizer Response on Growth and Yield of Hybrid Luffa – Naga F1 Variety

Authors: D. R. T. N. K. Dissanayake, H. M. S. K. Herath, H. K. S. G. Gunadasa, P. Weerasinghe

Abstract:

Luffa is a tropical and subtropical vegetable belonging to the family Cucurbitaceae. It is predominantly monoecious in sex expression and provides ample scope for the utilization of hybrid vigor. Hybrid varieties developed through open pollination produce higher yields due to hybrid vigor. The Naga F1 hybrid variety has a number of desirable traits in addition to higher yield, such as strong and vigorous plants, fruits with long, deep ridges, attractive green fruit color, better fruit weight and length, and earlier maturity compared to local Luffa cultivars. The unavailability of fertilizer recommendations for hybrid cucurbit vegetables leads to excess fertilizer application, an environmental issue with undesirable impacts on nature and human health. The main objective of this research is to determine the effect of different nitrogen and potassium fertilizer rates on the growth and yield of the Naga F1 variety. Other objectives are to evaluate specific growth parameters and yield, to identify the optimum nitrogen and potassium fertilizer levels based on the growth and yield of the hybrid Luffa variety, and to formulate a general fertilizer recommendation for the hybrid Luffa Naga F1 variety.

Keywords: hybrid, nitrogen, phosphorus, potassium

Procedia PDF Downloads 593
7808 Shaping Lexical Concept of 'Mage' through Image Schemas in Dragon Age 'Origins'

Authors: Dean Raiyasmi, Elvi Citraresmana, Sutiono Mahdi

Abstract:

Language shapes the human mind and its concepts of things. With today's technology, even an AI (artificial intelligence) can conceptualize things in ways that reflect its creator's negativity or positivity, expressed through image schemas. This is reflected in one of the best-selling games around the world in 2012, Dragon Age: Origins. The AI, in the form of NPCs (non-playable characters) inside the game, reflects the game creator's negativity or positivity toward the lexical concept of 'mage'. Through image schemas, shaping the lexical concept of 'mage' is shown to be possible, revealing the creator's negativity or positivity toward mages. This research analyses the cognitive-semantic process of image schemas and the shaping of the concept of 'mage' by describing the kinds of image schemas that exist in the Dragon Age: Origins game. It also aims to analyse those kinds of image schemas and to describe the image schemas which shape the concept of 'mage' itself. The methodology used in this research is qualitative, employing participative observation in five stages together with documentation. The results show that four image schemas exist in the game and that those image schemas shape the lexical concept of 'mage'.

Keywords: cognitive semantic, image-schema, conceptual metaphor, video game

Procedia PDF Downloads 438
7807 Using Hidden Markov Chain for Improving the Dependability of Safety-Critical Wireless Sensor Networks

Authors: Issam Alnader, Aboubaker Lasebae, Rand Raheem

Abstract:

Wireless sensor networks (WSNs) are distributed network systems used in a wide range of applications, including safety-critical systems. The latter provide critical services, often concerned with human life or assets. Therefore, ensuring the dependability requirements of safety-critical systems is of paramount importance. The purpose of this paper is to utilize the Hidden Markov Model (HMM) to extend the service availability of WSNs by increasing the time it takes a node to become obsolete via optimal load balancing. We propose an HMM algorithm that, given a WSN, analyses and predicts undesirable situations, notably nodes dying unexpectedly or prematurely. We apply this technique to improve on C. Liu's algorithm, a scheduling-based algorithm which has served to improve the lifetime of WSNs. Our experiments show that our HMM technique improves the lifetime of the network by detecting nodes that die early and rebalancing their load. Our technique can also be used for diagnosis and to provide maintenance warnings to WSN system administrators. Finally, our technique can be used to improve algorithms other than C. Liu's.
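
The abstract outlines the approach but not the algorithm itself. As a rough illustration of the kind of inference an HMM can perform in this setting, the sketch below decodes a hidden node-health state from coarse battery-drain observations with a hand-written Viterbi routine in NumPy; the state names, probabilities and observation sequence are hypothetical placeholders rather than values from the paper.

```python
import numpy as np

# Hypothetical HMM for a single sensor node: hidden states describe node health,
# observations are discretised battery-drain readings (0=low, 1=medium, 2=high).
states = ["healthy", "near_depletion"]
start_p = np.array([0.9, 0.1])                   # assumed initial distribution
trans_p = np.array([[0.85, 0.15],                # healthy -> healthy / near_depletion
                    [0.10, 0.90]])               # near_depletion tends to persist
emit_p = np.array([[0.6, 0.3, 0.1],              # healthy node emits mostly low drain
                   [0.1, 0.3, 0.6]])             # failing node emits mostly high drain

def viterbi(obs, start_p, trans_p, emit_p):
    """Most likely hidden-state path for a sequence of observations."""
    n_states, T = trans_p.shape[0], len(obs)
    logv = np.full((T, n_states), -np.inf)
    back = np.zeros((T, n_states), dtype=int)
    logv[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, T):
        for s in range(n_states):
            scores = logv[t - 1] + np.log(trans_p[:, s])
            back[t, s] = np.argmax(scores)
            logv[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
    path = [int(np.argmax(logv[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [states[s] for s in reversed(path)]

# Rising drain readings suggest the node is heading for an early death, so the
# load balancer could shift traffic away from it before it actually fails.
print(viterbi([0, 0, 1, 2, 2], start_p, trans_p, emit_p))
```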

Keywords: wireless sensor networks, IoT, dependability of safety WSNs, energy conservation, sleep awake schedule

Procedia PDF Downloads 100
7806 A Review: Carotenoids, a Biologically Important Bioactive Compound

Authors: Aarti Singh, Anees Ahmad

Abstract:

Carotenoids comprise a group of isoprenoid pigments. Carotenes, xanthophylls and their derivatives have been found to play an important role in all living beings through foods, nutraceuticals and pharmaceuticals. α-carotene, β-carotene and β-cryptoxanthin serve as a vitamin A source in humans, supporting growth, development and the proper functioning of the immune system and vision. Carotenoids are crucial for both plants and humans, as they protect against photooxidative damage and are excellent antioxidants, quenching singlet molecular oxygen and peroxyl radicals. A diet with a higher intake of carotenoids is associated with a reduced risk of various chronic diseases such as cancers (lung, breast, prostate, colorectal and ovarian) and coronary heart disease. The blue-light filtering efficiency of carotenoids in liposomes has been reported to be greatest for lutein, followed by zeaxanthin, β-carotene and lycopene. Lycopene plays a vital role in protection from cardiovascular disease (CVD), and serum lycopene is directly related to a reduced risk of osteoporosis in postmenopausal women. Carotenoids also play a major role in the treatment of skin disorders. There is a need to identify and isolate novel carotenoids from diverse natural sources for human health benefits.

Keywords: antioxidants, carotenoids, nutraceuticals, osteoporosis, pharmaceuticals

Procedia PDF Downloads 361
7805 A Proposed Treatment Protocol for the Management of Pars Interarticularis Pathology in Children and Adolescents

Authors: Paul Licina, Emma M. Johnston, David Lisle, Mark Young, Chris Brady

Abstract:

Background: Lumbar pars pathology is a common cause of pain in the growing spine. It can be seen in young athletes participating in at-risk sports and can affect sporting performance and long-term health due to its resistance to traditional management. There is a current lack of consensus on the classification and treatment of pars injuries. Previous systems used CT to stage pars defects but could not assess early stress reactions. A modified classification is proposed that considers findings on MRI, significantly improving early treatment guidance. The treatment protocol is designed for patients aged 5 to 19 years. Method: Clinical screening identifies patients with a low, medium, or high index of suspicion for lumbar pars injury using patient age, sport participation and pain characteristics. MRI of the at-risk cohort enables augmentation of the existing CT-based classification while avoiding ionising radiation. Patients are classified into five categories based on MRI findings. A type 0 lesion (stress reaction) is present when CT is normal and MRI shows high signal change (HSC) in the pars/pedicle on T2 images. A type 1 lesion represents the 'early defect' CT classification. The group previously referred to as a 'progressive stage' defect on CT can be split into 2A and 2B categories. 2As have HSC on MRI, whereas 2Bs do not. This distinction is important with regard to healing potential. Type 3 lesions are terminal stage defects on CT, characterised by pseudarthrosis. MRI shows no HSC. Results: Stress reactions (type 0) and acute fractures (types 1 and 2A) can heal and are treated in a custom-made hard brace for 12 weeks. It is initially worn 23 hours per day. At three weeks, patients commence basic core rehabilitation. At six weeks, in the absence of pain, the brace is removed for sleeping. Exercises are progressed to positions of daily living. Patients with continued pain remain braced 23 hours per day without exercise progression until becoming symptom-free. At nine weeks, patients commence supervised exercises out of the brace for 30 minutes each day. This allows them to re-learn muscular control without the rigid support of the brace. At 12 weeks, bracing ceases and MRI is repeated. For patients with near or complete resolution of bony oedema and healing of any cortical defect, rehabilitation is focused on strength and conditioning and sport-specific exercise for the full return to activity. The length of this final stage is approximately nine weeks but depends on factors such as development and level of sports participation. If significant HSC remains on MRI, CT scan is considered to definitively assess cortical defect healing. For these patients, return to high-risk sports is delayed for up to three months. Chronic defects (types 2B and 3) cannot heal and are not braced, and rehabilitation follows traditional protocols. Conclusion: Appropriate clinical screening and imaging with MRI can identify pars pathology early. In those with potential for healing, we propose hard bracing and appropriate rehabilitation as part of a multidisciplinary management protocol. The validity of this protocol will be tested in future studies.
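
The five-category decision logic described above lends itself to a compact rule. The sketch below is a purely illustrative rendering of that mapping, assuming simplified input flags for the CT finding and the MRI high signal change; it is not a validated clinical tool.

```python
def classify_pars_lesion(ct_finding: str, mri_high_signal: bool) -> str:
    """Illustrative mapping of CT/MRI findings to the proposed lesion types.

    ct_finding: 'normal', 'early_defect', 'progressive_defect' or 'terminal_defect'
    mri_high_signal: True if T2 MRI shows high signal change in the pars/pedicle.
    """
    if ct_finding == "normal":
        # Stress reaction: normal CT but bony oedema visible on MRI.
        return "type 0" if mri_high_signal else "no lesion"
    if ct_finding == "early_defect":
        return "type 1"
    if ct_finding == "progressive_defect":
        # The former 'progressive stage' is split by MRI signal (healing potential).
        return "type 2A" if mri_high_signal else "type 2B"
    if ct_finding == "terminal_defect":
        return "type 3"   # pseudarthrosis; no high signal change expected
    raise ValueError(f"unknown CT finding: {ct_finding}")

# Types 0, 1 and 2A are managed with bracing; 2B and 3 follow traditional protocols.
print(classify_pars_lesion("progressive_defect", mri_high_signal=True))  # type 2A
```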

Keywords: adolescents, MRI classification, pars interarticularis, treatment protocol

Procedia PDF Downloads 153
7804 Leukocyte Detection Using Image Stitching and Color Overlapping Windows

Authors: Lina, Arlends Chris, Bagus Mulyawan, Agus B. Dharmawan

Abstract:

Blood cell analysis plays a significant role in assessing human health. As an alternative to the traditional technique conducted by laboratory technicians, this paper presents an automatic white blood cell (leukocyte) detection system using image stitching and color overlapping windows. The advantage of this method is a detection technique for white blood cells that is robust to imperfect cell shapes and varying image quality. The input to this application is a sequence of images from a microscope-slide translation video. The preprocessing stage is performed by stitching the input images: first, the overlapping parts of the images are determined, then the stitching and blending of two input images are performed. Next, color overlapping windows are applied for white blood cell detection, which consists of color filtering, window candidate checking, window marking, finding window overlaps, and window cropping. Experimental results show that this method achieves an average detection accuracy of 82.12% on the leukocyte images.
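
As a rough sketch of the two stages described above (image stitching, then colour-based window checks), the snippet below uses OpenCV; the file paths, HSV thresholds, window size and overlap rule are placeholders, since the abstract does not give the authors' exact parameters.

```python
import cv2
import numpy as np

# Stage 1: stitch consecutive frames from the microscope-slide translation video.
frames = [cv2.imread(p) for p in ["frame_000.png", "frame_001.png"]]  # placeholder paths
stitcher = cv2.Stitcher_create()
status, panorama = stitcher.stitch(frames)
if status != cv2.Stitcher_OK:
    raise RuntimeError(f"stitching failed with status {status}")

# Stage 2: colour filtering - leukocyte nuclei stain darker/purple, so threshold in HSV.
hsv = cv2.cvtColor(panorama, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (110, 50, 50), (160, 255, 255))   # hypothetical purple range

# Slide fixed-size candidate windows over the mask; keep windows whose filtered-pixel
# ratio exceeds a threshold, then merge overlapping windows and crop the detections.
win, step, keep = 64, 32, []
for y in range(0, mask.shape[0] - win, step):
    for x in range(0, mask.shape[1] - win, step):
        if cv2.countNonZero(mask[y:y + win, x:x + win]) / win**2 > 0.3:
            keep.append((x, y, win, win))
boxes, _ = cv2.groupRectangles(keep * 2, groupThreshold=1, eps=0.5)  # merge overlaps
crops = [panorama[y:y + h, x:x + w] for (x, y, w, h) in boxes]
print(f"{len(crops)} leukocyte candidates detected")
```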

Keywords: color overlapping windows, image stitching, leukocyte detection, white blood cell detection

Procedia PDF Downloads 310
7803 A Comparison of the Adsorption Mechanism of Arsenic on Iron-Modified Nanoclays

Authors: Michael Leo L. Dela Cruz, Khryslyn G. Arano, Eden May B. Dela Pena, Leslie Joy Diaz

Abstract:

Arsenic adsorbents are continuously being researched to ease the detrimental impact of arsenic on human health. A comparative study on the adsorption mechanism of arsenic on iron-modified nanoclays was undertaken. Iron-intercalated montmorillonite (Fe-MMT) and montmorillonite-supported zero-valent iron (ZVI-MMT) were the adsorbents investigated in this study. Fe-MMT was produced through ion exchange by replacing the sodium intercalated ions in montmorillonite with iron (III) ions. The iron (III) in Fe-MMT was later reduced to zero-valent iron, producing ZVI-MMT. The adsorption study was performed using a batch technique. The obtained data were fitted to the intra-particle diffusion, pseudo-first-order, and pseudo-second-order models and the Elovich equation to determine the kinetics of adsorption. The adsorption of arsenic on Fe-MMT followed the intra-particle diffusion model with an intra-particle rate constant of 0.27 mg/g-min^0.5. Arsenic was found to be chemically bound on ZVI-MMT, as suggested by the pseudo-second-order model and the Elovich equation. The derived pseudo-second-order rate constant was 0.0027 g/mg-min, and the initial adsorption rate computed from the Elovich equation was 113 mg/g-min.
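
Rate constants such as those quoted above are typically obtained by fitting standard kinetic forms to the batch data. The sketch below fits the pseudo-second-order and intra-particle diffusion models with SciPy, using made-up uptake data rather than the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical batch-test data: contact time (min) and arsenic uptake q_t (mg/g).
t = np.array([5, 10, 20, 40, 60, 120, 240], dtype=float)
qt = np.array([2.1, 3.4, 4.9, 6.2, 6.9, 7.6, 7.9])

# Pseudo-second-order model: qt = k2*qe^2*t / (1 + k2*qe*t)
def pso(t, qe, k2):
    return k2 * qe**2 * t / (1 + k2 * qe * t)

# Intra-particle diffusion model: qt = kid*sqrt(t) + C
def ipd(t, kid, c):
    return kid * np.sqrt(t) + c

(qe, k2), _ = curve_fit(pso, t, qt, p0=[qt.max(), 0.01])
(kid, c), _ = curve_fit(ipd, t, qt)
print(f"pseudo-second-order: qe = {qe:.2f} mg/g, k2 = {k2:.4f} g/mg-min")
print(f"intra-particle diffusion: kid = {kid:.2f} mg/g-min^0.5")
```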

Keywords: adsorption mechanism, arsenic, montmorillonite, zero valent iron

Procedia PDF Downloads 415
7802 Recycling Service Strategy by Considering Demand-Supply Interaction

Authors: Hui-Chieh Li

Abstract:

The circular economy promotes greater resource productivity and avoids pollution through greater recycling and re-use, which brings benefits for both the environment and the economy. The concept contrasts with a linear economy, which follows a 'take, make, dispose' model of production. A well-designed reverse logistics service strategy can enhance users' willingness to recycle and reduce the related logistics cost as well as carbon emissions. Moreover, recycling brings manufacturers considerable advantages, since it targets components for closed-loop reuse, essentially converting materials and components from worn-out products into inputs for new ones at the right time and place. A crucial factor in optimizing a recycle service strategy is consumer demand; the study therefore considers the relationships between consumer demand for recycling and product characteristics, surplus value and user behavior. This study considers demand-supply interaction, time-dependent recycle demand and the time-dependent surplus value of recycled products, and constructs models of the recycle service strategy for a recyclable waste collector, who is responsible for collecting waste products for the manufacturer. The proposed strategy differs significantly from the conventional uniform service strategy: periods with considerable demand and large surplus product value call for frequent and short service cycles. The study explores how to determine the service cycle frequency and duration and the vehicle type for all service cycles by considering the surplus value of recycled products, time-dependent demand, transportation economies and demand-supply interaction, and it examines the impact of vehicle utilization rate on cost and profit for different vehicle sizes. The binary logit model, an analytical model and mathematical programming methods are applied to the problem, with the objectives of maximizing the distributor's total profit, minimizing the recycler's total logistics cost and maximizing the manufacturer's recycling benefits over the study period. The study relaxes the constant-demand assumption and examines how the service strategy affects consumer demand for waste recycling. The results not only help in understanding how user demand for the recycle service and product surplus value affect the logistics cost and the manufacturer's benefits, but also provide guidance, such as award bonuses and carbon emission regulations, for the government.
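
The abstract names a binary logit model for consumer recycling demand. The sketch below shows one minimal way such a choice probability could be written, with illustrative (not estimated) coefficients linking recycling probability to product surplus value and service-cycle length.

```python
import math

def recycle_probability(surplus_value, service_interval_days,
                        beta0=-1.0, b_value=0.8, b_interval=-0.05):
    """Binary logit choice probability P(recycle) = 1 / (1 + exp(-U)).

    Utility U is assumed linear in the product's surplus value (higher value,
    more willingness to recycle) and in the length of the collection cycle
    (longer waits reduce willingness). All coefficients are illustrative only.
    """
    utility = beta0 + b_value * surplus_value + b_interval * service_interval_days
    return 1.0 / (1.0 + math.exp(-utility))

# Frequent, short service cycles during high-surplus-value periods raise uptake:
print(recycle_probability(surplus_value=3.0, service_interval_days=7))   # ~0.74
print(recycle_probability(surplus_value=3.0, service_interval_days=30))  # ~0.48
```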

Keywords: circular economy, consumer demand, product surplus value, recycle service strategy

Procedia PDF Downloads 392
7801 Study of Strontium Sorption onto Indian Bentonite

Authors: Pankaj Pathak, Susmita Sharma

Abstract:

Incessant industrial growth fulfills the energy demand of present-day society, but at the same time it produces a huge amount of waste, which can be hazardous or non-hazardous in nature. These wastes come from different sources, viz. nuclear power, thermal power and coal mines, and contain different types of contaminants; one emerging contaminant, strontium, is used in the present study. The strontium isotope Sr-90 is radioactive, with a half-life of 28.8 years, and the permissible limit of strontium in drinking water is 1.5 ppm. Concentrations above the permissible limit cause several types of diseases in human beings. Therefore, the safe disposal of strontium into the ground becomes a major challenge for researchers. In this context, bentonite is used as an efficient material to retain strontium in the ground due to its specific physical, chemical and mineralogical properties, which include a high cation exchange capacity and specific surface area. These properties influence the interaction between strontium and bentonite, which is quantified by employing a parameter known as the distribution coefficient. A batch test was conducted, and sorption isotherms were modelled at different interaction times. The pseudo-first-order and pseudo-second-order kinetic models have been used to fit the experimental data, which helps to determine the sorption rate and mechanism.
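
The distribution coefficient mentioned above is commonly computed from the initial and equilibrium concentrations measured in the batch test. The short sketch below shows this calculation with placeholder values, not the study's data.

```python
def distribution_coefficient(c0_mg_l, ce_mg_l, volume_l, mass_g):
    """Kd = [(C0 - Ce) / Ce] * (V / m), in L/g.

    C0 and Ce are the initial and equilibrium strontium concentrations in solution,
    V is the solution volume and m is the bentonite mass used in the batch test.
    """
    return (c0_mg_l - ce_mg_l) / ce_mg_l * (volume_l / mass_g)

# Hypothetical batch test: 50 mg/L Sr, 0.1 L solution, 1 g bentonite, 8 mg/L remaining.
print(f"Kd = {distribution_coefficient(50.0, 8.0, 0.1, 1.0):.3f} L/g")  # 0.525 L/g
```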

Keywords: bentonite, interaction time, sorption, strontium

Procedia PDF Downloads 305
7800 Design Systems and the Need for a Usability Method: Assessing the Fitness of Components and Interaction Patterns in Design Systems Using Atmosphere Methodology

Authors: Patrik Johansson, Selina Mardh

Abstract:

The present study proposes a usability test method, Atmosphere, to assess the fitness of components and interaction patterns of design systems. The method covers the user's perception of the components of the system, the efficiency of the logic of the interaction patterns, perceived ease of use, as well as the user's understanding of the intended outcome of interactions. These aspects are assessed by combining measures of first impression, visual affordance and expectancy. The method was applied to a design system developed for the design of an electronic health record system. The study involved 15 healthcare personnel. It could be concluded that the Atmosphere method provides tangible data that enable human-computer interaction practitioners to analyze and categorize components and patterns based on perceived usability, the success rate of identifying interactive components, and the success rate of understanding the intended outcomes of components and interaction patterns.

Keywords: atomic design, atmosphere methodology, design system, expectancy testing, first impression testing, usability testing, visual affordance testing

Procedia PDF Downloads 180
7799 Efficacy Enhancement of Hydrophobic Antibiotics Employing Rhamnolipid as Biosurfactant

Authors: Abdurrahim A. Elouzi, Abdurrauf M. Gusbi, Ali M. Elgerbi

Abstract:

Antibiotic resistance has become a global public-health problem; thus, it is imperative that new antibiotics continue to be developed. Major problems are being experienced in human medicine from antibiotic-resistant bacteria, and no new chemical class of antibiotics has been introduced into medicine in the past two decades. The current study presents experimental results that evaluate the capability of the biosurfactant rhamnolipid to enhance the efficacy of hydrophobic antibiotics. Serial dilutions of azithromycin and clarithromycin were prepared. A bacterial suspension (approximately 5 × 10⁵ CFU) from an overnight culture in MSM was inoculated into 20 ml sterile test tubes, each containing a serial 10-fold dilution of the test antibiotic(s) in broth with or without 200 mg L⁻¹ rhamnolipid. The tubes were incubated for 24 h with vigorous shaking at 37 °C. Antimicrobial activity against multiple antibiotic-resistant gram-negative pathogens and gram-positive bacteria was assessed using an optical density technique. The results clearly demonstrated that the presence of rhamnolipid significantly improved the efficacy of both antibiotics. We hypothesized that the addition of rhamnolipid at low concentration causes the release of LPS, which results in an increase in cell surface hydrophobicity. This allows increased association of cells with hydrophobic antibiotics, resulting in increased cytotoxicity rates.

Keywords: hydrophobic antibiotics, biosurfactant, rhamnolipid, azithromycin, clarithromycin

Procedia PDF Downloads 516
7798 Re-Reading the Impossibility of Identity: Modeling Gender Pluralism in Curriculum and Instruction

Authors: A. K. O’Loughlin

Abstract:

Identity does not exist in the discrete categories by which it is defined. Kevin Kumashiro uses the phrase 'an impossibility of identity' in Troubling Education (2000), an investigation of the intersections of culture and gender and the impact of erasure on queer POC identity. This underscores the essentiality of an insider or an outsider identity and the appearance of 'contradiction' or impossibility in these identities. The contradictions between us as subjects in our own stories and in the stories of others are often silenced. This silencing of complex, 'contradicting' identity has unmistakable implications in the classroom; the developing student in question is done a serious disservice, from which they may never recover. There is no more important point of contact than the teacher for the willingness to encounter a developing person as they are, not as we already think they are, or 'know' them to be, or think they should be. To decide how to regard them based on our own unilateral identity and its associated exhortations and injunctions is, as Hannah Arendt writes in The Origins of Totalitarianism (1951), to sell off our ability to rise, human-like, to the challenge of investigating things as they are. A re-reading of Kumashiro's impossibility of identity becomes possible through the investigation of pluralism: identities become possible and un-paradoxical through the notion that contradictions are not problems, because an individual is not unilateral but plural. In this paper, we investigate how philosophies of pluralism can inform our understanding of the impossibility of identity in classroom curriculum and pedagogy.

Keywords: identity, gender, culture, pluralism, education, philosophy of education, queer theory, philosophy of mind, adolescent development

Procedia PDF Downloads 299
7797 Use of Logistics for Demand Control in a Commercial Establishment in Rio De Janeiro, Brazil

Authors: Carlos Fontanillas

Abstract:

Brazil is going through a real revolution in the logistics area. It is increasingly common to find articles and news on this topic, as companies become aware that good management of the areas that make up logistics can bring excellent results in reducing costs and increasing productivity. Consequently, companies are placing more emphasis on reducing spending on the storage and transport of their products to ensure competitiveness. The scope of this work is the analysis of the logistics of a restaurant; it presents the best way to manage materials so as to serve the customer while avoiding interruptions of production due to a lack of materials. To this end, the supply chain is analyzed in terms of acquisition costs, maintenance and service demand.
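
The keywords point to the ABC curve. The sketch below shows the textbook ABC classification of purchased materials by cumulative share of annual spending, with hypothetical restaurant items and the usual 80%/95% cut-offs, none of which come from the study.

```python
def abc_classification(items, a_cut=0.80, b_cut=0.95):
    """Classify items by cumulative share of annual spending (ABC curve).

    `items` maps item name -> annual spending; the 80%/95% cut-offs are the
    textbook defaults, not figures from the study.
    """
    total = sum(items.values())
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for name, spend in ranked:
        cumulative += spend / total
        classes[name] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
    return classes

spending = {"beef": 8000, "seafood": 6000, "produce": 4000, "spices": 1500, "napkins": 500}
print(abc_classification(spending))
# beef and seafood dominate spending (class A) and deserve the tightest demand control
```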

Keywords: ABC curve, logistic, productivity, supply chain

Procedia PDF Downloads 314
7796 Integration of Corporate Social Responsibility Criteria in Employee Variable Remuneration Plans

Authors: Jian Wu

Abstract:

For a few years now, some French companies have integrated CSR (corporate social responsibility) criteria into their variable remuneration plans to 'restore a good working atmosphere' and 'preserve the natural environment'. These CSR criteria are based on concerns about environmental protection, social aspects, and corporate governance. In June 2012, a report on this practice was published jointly by ORSE (the French acronym for the Observatory on CSR) and PricewaterhouseCoopers. Facing this initiative from the business world, we need to examine whether it has real economic utility. We adopt a theoretical approach for our study. First, we examine the debate between the 'orthodox' point of view in economics and the CSR school of thought. The classical economic model asserts that in a capitalist economy there exists a certain 'invisible hand' which helps to resolve all problems: when companies seek to maximize their profits, they are also fulfilling, de facto, their duties towards society. As a result, the only social responsibility that firms should have is profit-seeking while respecting the minimum legal requirements. However, the CSR school considers that, as long as the economic system is not perfect, there is no 'invisible hand' which can arrange everything in good order. This means that we cannot count on any 'divine force' to make corporations responsible toward society; something more needs to be done in addition to firms' economic and legal obligations. Then, we rely on several financial theories and empirical evidence to examine the soundness of the foundations of CSR. Three theories developed in corporate governance can be used. Stakeholder theory tells us that corporations owe a duty to all of their stakeholders, including stockholders, employees, clients, suppliers, government, the environment, and society. Social contract theory tells us that there are tacit 'social contracts' between a company and society itself; a firm has to respect these contracts if it does not want to be punished in the form of fines, resource constraints, or a bad reputation. Legitimacy theory tells us that corporations have to 'legitimize' their actions toward society if they want to continue to operate in good conditions. As regards empirical results, we present a literature review on the relationship between the CSR performance and the financial performance of a firm. We note that, due to difficulties in defining these performances, this relationship remains ambiguous despite the numerous research works carried out in the field. Finally, we are curious to know whether the integration of CSR criteria in variable remuneration plans – practiced so far in big companies – should be extended to other ones. After investigation, we note that two groups of firms have the greatest need. The first involves industrial sectors whose activities have a direct impact on the environment, such as petroleum and transport companies. The second involves companies which are under pressure in terms of returns in order to deal with international competition.

Keywords: corporate social responsibility, corporate governance, variable remuneration, stakeholder theory

Procedia PDF Downloads 186
7795 Paraffin/Expanded Perlite Composite as a Novel Form-Stable Phase Change Material for Latent Heat Energy Storage

Authors: Awni Alkhazaleh

Abstract:

Latent heat storage using phase change materials (PCMs) has recently attracted growing attention in renewable energy utilization and building energy efficiency. Paraffin (PA) with a low melting temperature, close to the human comfort range of 24-28 °C, has been considered for use in building applications. A form-stable paraffin/expanded perlite composite (PA-EP) has been prepared by retaining PA in the porous particles of EP. DSC (differential scanning calorimetry) is used to measure the thermal properties of PA in the form-stable composite with and without building materials. TGA (thermogravimetric analysis) shows that the composite is thermally stable. SEM (scanning electron microscopy) demonstrates that PA is uniformly absorbed into the layered structure of the EP particles. The mechanical properties in flexural mode are discussed. The thermal energy storage performance has been evaluated using a small test room (100 mm × 100 mm × 100 mm) with a wall thickness of 10 mm. The flammability of the modified sample has been assessed using a cone calorimeter. The results confirm that the form-stable PA composite can reduce building energy consumption.
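
The storable thermal energy of such a composite can be estimated from its paraffin fraction, latent heat and specific heat. The sketch below is a back-of-the-envelope calculation with assumed values, not the DSC results of the paper.

```python
def stored_energy_kj(mass_kg, pa_fraction, latent_heat_kj_per_kg,
                     cp_kj_per_kg_k, delta_t_k):
    """Sensible plus latent heat stored by a form-stable PA-EP board over a
    temperature swing crossing the paraffin melting range (illustrative values only)."""
    latent = mass_kg * pa_fraction * latent_heat_kj_per_kg   # only the PA melts
    sensible = mass_kg * cp_kj_per_kg_k * delta_t_k          # whole composite heats up
    return latent + sensible

# 10 kg of composite, 60 wt% paraffin, ~150 kJ/kg latent heat, cp ~2 kJ/(kg K), 5 K swing:
print(f"{stored_energy_kj(10, 0.6, 150, 2.0, 5):.0f} kJ")   # 1000 kJ
```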

Keywords: flammability, latent heat storage, paraffin, plasterboard

Procedia PDF Downloads 219
7794 Reliable and Error-Free Transmission through Multimode Polymer Optical Fibers in House Networks

Authors: Tariq Ahamad, Mohammed S. Al-Kahtani, Taisir Eldos

Abstract:

Optical communications technology has made enormous and steady progress for several decades, providing a key resource in our increasingly information-driven society and economy. Much of this progress has been in finding innovative ways to increase the data-carrying capacity of a single optical fiber. In this research article, we explore basic issues of security and reliability for information transfer through the fiber infrastructure. Conspicuously, one potentially enormous source of improvement has been left untapped in these systems: fibers can easily support hundreds of spatial modes, but today's commercial systems (single-mode or multi-mode) make no attempt to use these as parallel channels for independent signals. Bandwidth, performance, reliability, cost efficiency, resiliency, redundancy, and security are some of the demands placed on telecommunications today. Since its initial development, fiber optics has held the advantage in most of these requirements over copper-based and wireless telecommunications solutions. The largest obstacle preventing most businesses from implementing fiber optic systems was cost; with recent advancements in fiber optic technology and the ever-growing demand for more bandwidth, the cost of installing and maintaining fiber optic systems has been reduced dramatically. With so many advantages, including cost efficiency, fiber optic systems will continue to replace copper-based communications. This will also lead to an increase in the expertise and technology needed by intruders to tap into fiber optic networks. As ever, all technologies have been subject to hacking and criminal manipulation, and fiber optics is no exception. Research into fiber optic security vulnerabilities suggests that not everyone responsible for network security is aware of the different methods that intruders use to hack, virtually undetected, into fiber optic cables. With millions of miles of fiber optic cable stretching across the globe and carrying information including, but certainly not limited to, government, military, and personal information such as medical records, banking information, driving records, and credit card information, being aware of fiber optic security vulnerabilities is essential and critical. Many articles and studies still suggest that fiber optics is expensive, impractical and hard to tap; others argue that tapping is not only easily done but also inexpensive. This paper briefly discusses the history of fiber optics, explains the basics of fiber optic technologies, and then discusses the vulnerabilities in fiber optic systems and how they can be better protected. Knowing the security risks and the options available may save a company a lot of embarrassment, time, and, most importantly, money.

Keywords: in-house networks, fiber optics, security risk, money

Procedia PDF Downloads 420
7793 Depth-Averaged Velocity Distribution in Braided Channel Using Calibrating Coefficients

Authors: Spandan Sahu, Amiya Kumar Pati, Kishanjit Kumar Khatua

Abstract:

Rivers are the backbone of human civilization as well as one of the most important components of nature. In this paper, a method for predicting the lateral depth-averaged velocity distribution in a two-flow braided compound channel is proposed. Experiments were conducted to study the boundary shear stress at the tip of the two flow paths. The cross-section of the channel is divided into several panels to study the flow phenomenon in both the main channel and the floodplain. It can be inferred from the study that the flow coefficients are affected by boundary shear stress. In this study, the analytical solution of Shiono and Knight (SKM) for the lateral distributions of depth-averaged velocity and bed shear stress has been adopted. The SKM is based on hydraulic parameters that represent the bed friction factor (f), the lateral eddy viscosity, and the depth-averaged flow. When applying the SKM to the different panels, the equations are solved considering the boundary conditions between panels. The boundary shear stress data obtained from the experiments are compared with the CES software, which is based on a quasi-one-dimensional Reynolds-Averaged Navier-Stokes (RANS) approach.
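
For context, the depth-averaged momentum equation solved panel by panel in the SKM is commonly written as below, where ρ is the water density, g the gravitational acceleration, H the local depth, S0 the bed slope, s the side slope, U_d the depth-averaged velocity, f the friction factor, λ the dimensionless eddy viscosity and Γ a secondary-flow term. This is the commonly cited Shiono-Knight form, reproduced here as background rather than taken from the paper.

```latex
\rho g H S_0
\;-\; \frac{f}{8}\,\rho\,U_d^{2}\sqrt{1+\frac{1}{s^{2}}}
\;+\; \frac{\partial}{\partial y}\!\left[\rho\,\lambda\,H^{2}\Big(\frac{f}{8}\Big)^{\!1/2} U_d\,\frac{\partial U_d}{\partial y}\right]
\;=\; \Gamma
```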

Keywords: boundary shear stress, lateral depth-averaged velocity, two-flow braided compound channel, velocity distribution

Procedia PDF Downloads 129
7792 Predicting Susceptibility to Coronary Artery Disease using Single Nucleotide Polymorphisms with a Large-Scale Data Extraction from PubMed and Validation in an Asian Population Subset

Authors: K. H. Reeta, Bhavana Prasher, Mitali Mukerji, Dhwani Dholakia, Sangeeta Khanna, Archana Vats, Shivam Pandey, Sandeep Seth, Subir Kumar Maulik

Abstract:

Introduction: Research has demonstrated a connection between coronary artery disease (CAD) and genetics. We performed deep literature mining, using both bioinformatics and manual curation, to identify polymorphisms conferring susceptibility to coronary artery disease, and further sought to validate these findings in an Asian population. Methodology: In the first phase, we used an automated pipeline which organizes and presents structured information on SNPs, populations and diseases. The information was obtained by applying Natural Language Processing (NLP) techniques to approximately 28 million PubMed abstracts. To accomplish this, we utilized Python scripts to extract and curate disease-related data, filter out false positives, and categorize them into 24 hierarchical groups using Named Entity Recognition (NER) algorithms. From this extensive search, a total of 466 unique PubMed Identifiers (PMIDs) and 694 Single Nucleotide Polymorphisms (SNPs) related to coronary artery disease (CAD) were identified. To refine the selection, a thorough manual examination of all the studies was carried out: SNPs that demonstrated susceptibility to CAD and exhibited a positive odds ratio (OR) were selected, giving a final pool of 324 SNPs. The next phase involved validating the identified SNPs in DNA samples of 96 CAD patients and 37 healthy controls from an Indian population using a Global Screening Array. Results: Out of the 324 SNPs, only 108 were detected; of these, 4 SNPs showed a significant difference in minor allele frequency between cases and controls. These were rs187238 of the IL-18 gene, rs731236 of the VDR gene, rs11556218 of the IL16 gene and rs5882 of the CETP gene. Prior research has reported associations of these SNPs with various pathways, such as endothelial damage, vitamin D receptor (VDR) polymorphism susceptibility, and reduction of HDL-cholesterol levels, ultimately leading to the development of CAD. Among these, only rs731236 had previously been studied in the Indian population, and only in the context of diabetes and vitamin D deficiency. For the first time, these SNPs were reported to be associated with CAD in the Indian population. Conclusion: This pool of 324 SNPs is a unique resource that can help to uncover risk associations in CAD. Here, we validated it in an Indian population. Further validation in different populations may offer valuable insights and contribute to the development of a screening tool, enabling the implementation of primary prevention strategies targeted at the vulnerable population.
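
A case-control comparison of minor allele frequency of the kind reported above can be run as a simple 2x2 allele-count test. The sketch below uses Fisher's exact test from SciPy with hypothetical counts, since the paper's genotype counts and exact statistical test are not given in the abstract.

```python
from scipy.stats import fisher_exact

def compare_minor_allele(case_minor, case_major, control_minor, control_major):
    """2x2 allele-count test of minor-allele frequency between CAD cases and controls."""
    table = [[case_minor, case_major], [control_minor, control_major]]
    odds_ratio, p_value = fisher_exact(table)
    maf_cases = case_minor / (case_minor + case_major)
    maf_controls = control_minor / (control_minor + control_major)
    return maf_cases, maf_controls, odds_ratio, p_value

# Hypothetical counts for one SNP: 96 cases (192 alleles) and 37 controls (74 alleles).
print(compare_minor_allele(70, 122, 15, 59))
```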

Keywords: coronary artery disease, single nucleotide polymorphism, susceptible SNP, bioinformatics

Procedia PDF Downloads 76
7791 Use of Fruit Beetles, Waxworms Larvae and Tiger Worms in Waste Conditioning for Composting

Authors: Waleed S. Alwaneen

Abstract:

In many countries, cow dung is used as farm manure and for biogas production. Several bacterial strains associated with cow dung, such as Campylobacter, Salmonella sp. and Escherichia coli, cause serious human diseases. The objective of the present study was to investigate the use of insect larvae, including fruit beetles, waxworms and tiger worms, to improve the breakdown of agricultural wastes and reduce their pathogen loads. Fresh cow faeces were collected from a cattle farm and distributed into plastic boxes (100 g/box). Each box was provided with 10 larvae of fruit beetle, waxworms or tiger worms, respectively. There were 3 replicates in each treatment, including the control. Bacteria were isolated weekly from both the control and the larvae-treated cow faeces to determine the bacterial populations. Results revealed that the bacterial load was higher in the cow faeces treated with fruit beetles than in the control, while it was lower in the cow faeces treated with waxworms and tiger worms than in the control. The activity of the fruit beetle larvae liquefied the cow faeces, which provided a more conducive growing medium for bacteria; the higher bacterial load in the faeces treated with fruit beetles might therefore be attributed to this liquefaction.

Keywords: fruit beetle, waxworms, tiger worms, waste conditioning, composting

Procedia PDF Downloads 250
7790 Diagnosis and Management of Obesity Among South Asians: A Paradigm

Authors: Deepa Vasudevan, Thomas Northrup, Angela Stotts, Michelle Klawans

Abstract:

To date, we have conducted three studies on this subject. The initial study documented that modified criteria independently identified higher numbers of overweight/obese South Asian Indians. The second study documented physicians' knowledge of the appropriate diagnosis of obesity among South Asian Indians. The final study was an intervention to evaluate the efficacy of a training module in improving physician diagnosis and counseling of overweight/obese Asian patients.
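
The abstract does not list the modified criteria themselves. As background, the sketch below contrasts the conventional BMI cut-offs with the lower Asian-specific action points (23 and 27.5 kg/m²) often cited from the WHO expert consultation; these thresholds are assumptions for illustration and not necessarily the exact criteria used in the studies.

```python
def bmi_category(weight_kg, height_m, asian_cutoffs=False):
    """Classify BMI with conventional cut-offs (25/30) or the lower Asian-specific
    action points (23/27.5) often cited from the WHO expert consultation (assumed)."""
    bmi = weight_kg / height_m**2
    overweight, obese = (23.0, 27.5) if asian_cutoffs else (25.0, 30.0)
    if bmi >= obese:
        label = "obese"
    elif bmi >= overweight:
        label = "overweight"
    else:
        label = "normal or underweight"
    return round(bmi, 1), label

# The same patient can change category when the modified criteria are applied:
print(bmi_category(68, 1.70))                      # (23.5, 'normal or underweight')
print(bmi_category(68, 1.70, asian_cutoffs=True))  # (23.5, 'overweight')
```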

Keywords: South Asian Indians, obesity, physicians, BMI and waist circumference

Procedia PDF Downloads 407
7789 Content Based Video Retrieval System Using Principal Object Analysis

Authors: Van Thinh Bui, Anh Tuan Tran, Quoc Viet Ngo, The Bao Pham

Abstract:

Video retrieval is the problem of searching for videos or clips whose content is relatively close to an input image or video. Applications of this retrieval include selecting a video in a folder or recognizing a person in security camera footage. However, recent approaches face a challenging problem due to the diversity of video types, frame transitions and camera positions; moreover, selecting an appropriate similarity measure for the problem remains an open question. In order to overcome these obstacles, we propose a content-based video retrieval system consisting of several main steps and resulting in good performance. From an input video, we extract keyframes and principal objects using the Segmentation of Aggregating Superpixels (SAS) algorithm. After that, Speeded Up Robust Features (SURF) are extracted from those principal objects. Then, a bag-of-words model accompanied by SVM classification is applied to obtain the retrieval result. Our system is evaluated on over 300 videos spanning music, history, movies, sports, natural scenes and TV programs. The performance compares promisingly with other approaches.
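
A condensed sketch of the descriptor-to-vocabulary-to-histogram-to-SVM chain described above is given below, using scikit-learn KMeans for the visual codebook; SURF lives in the opencv-contrib nonfree module, so some installs may need another detector, and the keyframe/SAS extraction step is omitted. File names, labels and parameters are placeholders.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

# SURF requires opencv-contrib built with the nonfree module enabled.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

def surf_descriptors(image_path):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, desc = surf.detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 64))

def bow_histogram(desc, kmeans):
    """Quantise descriptors against the visual vocabulary and build a normalised histogram."""
    hist = np.bincount(kmeans.predict(desc), minlength=kmeans.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

# 1) Build the visual vocabulary from descriptors of all training keyframes.
train_paths, train_labels = ["kf_music.png", "kf_sport.png"], [0, 1]   # placeholders
all_desc = np.vstack([surf_descriptors(p) for p in train_paths])
kmeans = KMeans(n_clusters=50, n_init=10).fit(all_desc)

# 2) Encode each keyframe as a bag-of-words histogram and train the SVM.
X = np.array([bow_histogram(surf_descriptors(p), kmeans) for p in train_paths])
clf = SVC(kernel="rbf").fit(X, train_labels)

# 3) A query frame is encoded the same way; the predicted class narrows the search.
print(clf.predict([bow_histogram(surf_descriptors("query.png"), kmeans)]))
```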

Keywords: video retrieval, principal objects, keyframe, segmentation of aggregating superpixels, speeded up robust features, bag-of-words, SVM

Procedia PDF Downloads 302
7788 Decreased Autophagy Contributes to Senescence Induction in HS68 Cells

Authors: Byeal-I Han, Michael Lee

Abstract:

Ageing is associated with an increased risk of diseases such as cancer and neurodegenerative disorders, while increased autophagy delays ageing and extends longevity. In this study, we investigated the role of autophagy in longevity using human foreskin fibroblast HS68 cells, in which a senescence-like growth arrest can be induced. Cellular senescence is manifested by irreversible cell cycle arrest and may contribute to the ageing of organisms. The senescence state was measured by staining for senescence-associated β-galactosidase (SA-β-gal) activity, a sensitive and reliable marker for quantifying senescent cells. We detected a significantly increased percentage of SA-β-gal-positive cells in HS68 cultures at passage 40 (63%) compared with younger cultures at passage 15 (0.5%). As expected, HS68 cells at passage 40 exhibited a much lower proliferation rate than cells at passage 15. The basal levels of LC3 were measured by immunoblotting, comparing LC3-I and LC3-II levels at three age points in serially passaged HS68 cells. The LC3-II/LC3-I ratio at different passage levels, with each band normalised to the corresponding β-actin level, confirmed that cells at passage 34 showed lower conversion of non-autophagic LC3-I to autophagic LC3-II than cells at passage 16. Furthermore, the Cyto-ID autophagy assay also revealed that late-passage cells showed lower autophagy than early-passage cells. Together, our findings suggest that senescence induction might be associated with decreased autophagy.

Keywords: ageing, autophagy, senescence, HS68

Procedia PDF Downloads 255
7787 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network and generate enormous volumes of real-time, high-speed data that help organizations and companies take intelligent decisions. Integrating this enormous data from multiple sources and transferring it to the appropriate client is fundamental to IoT development, and handling this huge quantity of devices along with the huge volume of data is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they go to sleep and wake up periodically or aperiodically, depending on the traffic load, to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion, and if a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of the device from the sink node becomes greater than required, the connection is lost, and other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability to the IoT network and hence produces poor-quality data, and because of this dynamic nature we do not know the actual reason for abnormal data. If data are of poor quality, decisions are likely to be unsound. It is therefore highly important to process data and estimate data quality before using it in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic and statistical methods for analysing stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how it impacts data quality. This research provides a comprehensive review of the impact of the dynamic nature of IoT devices on data quality and presents a data quality model that can deal with this challenge and produce good-quality data. The model is developed for sensors monitoring water quality, using DBSCAN clustering and weather sensors. An extensive study was carried out on the relationship between the data of weather sensors and of sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis is presented describing the correlation between the independent data streams of the two sets of sensors. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness, identifies patterns of missing values, and checks the accuracy of the data with the help of cluster positions. Finally, statistical analysis is performed on the clusters formed by DBSCAN, and consistency is evaluated through the coefficient of variation (CoV).
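
A compact sketch of two of the model's dimensions, outlier detection via DBSCAN and consistency via the coefficient of variation, is shown below on a synthetic joint stream of weather and water-quality readings; the eps/min_samples values and the data are placeholders, not the study's settings.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Synthetic joint stream: columns are air temperature (weather sensor) and water
# temperature (water-quality sensor); the two are correlated in normal operation.
air = rng.normal(20, 2, 200)
water = 0.8 * air + rng.normal(0, 0.5, 200)
water[50] = 40.0                      # inject one faulty reading

readings = np.column_stack([air, water])
labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(readings)

outliers = np.where(labels == -1)[0]  # DBSCAN marks noise points with label -1
print("flagged outlier indices:", outliers)

# Consistency of each retained cluster via the coefficient of variation (CoV = std/mean).
for k in set(labels) - {-1}:
    cluster = readings[labels == k]
    cov = cluster.std(axis=0) / cluster.mean(axis=0)
    print(f"cluster {k}: CoV per sensor = {np.round(cov, 3)}")
```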

Keywords: clustering, data quality, DBSCAN, and Internet of things (IoT)

Procedia PDF Downloads 139
7786 Process Modeling in an Aeronautics Context

Authors: Sophie Lemoussu, Jean-Charles Chaudemar, Robertus A. Vingerhoeds

Abstract:

Many innovative projects exist in the field of aeronautics, each addressing specific areas such as reducing weight, increasing autonomy, or reducing CO2 emissions. In many cases, such innovative developments are carried out by very small enterprises (VSEs) or small and medium-sized enterprises (SMEs). A good example concerns airships, which are being studied as a real alternative for passenger and cargo transportation. Today, no international regulations propose a precise and sufficiently detailed framework for the development and certification of airships. The absence of such a regulatory framework requires very close contact with regulatory instances. However, VSEs/SMEs do not always have sufficient resources and internal knowledge to handle this complexity and to discuss these issues. This poses an additional challenge for those VSEs/SMEs, in particular those that have system integration responsibilities and must provide all the necessary evidence to demonstrate their ability to design, produce, and operate airships with the expected level of safety and reliability. The main objective of this research is to provide a methodological framework enabling VSEs/SMEs with limited resources to organize the development of airships while taking into account the constraints of safety, cost, time and performance. This paper contributes to this problem by proposing a Model-Based Systems Engineering approach. Through a comprehensive process modeling approach applied to the development processes, the regulatory constraints, existing best practices, etc., a good picture can be obtained of the process landscape that may influence the development of airships. To this effect, not only is the necessary regulatory information taken on board, but other international standards and norms on systems engineering and project management are also modeled and taken into account. In a next step, the model can be used to analyse the specific situation for given developments, derive critical paths for the development, identify possible conflicts between the norms, standards, and regulatory expectations, or identify those areas where not enough information is available. Once critical paths are known, optimization approaches and decision support techniques can be applied to better support VSEs/SMEs in their innovative developments. This paper reports on the adopted modeling approach, the retained modeling languages, and how they all fit together.

Keywords: aeronautics, certification, process modeling, project management, regulation, SME, systems engineering, VSE

Procedia PDF Downloads 161
7785 Impact of Intelligent Transportation System on Planning, Operation and Safety of Urban Corridor

Authors: Sourabh Jain, S. S. Jain

Abstract:

Intelligent transportation systems (ITS) apply technologies to develop user-friendly transportation systems and to extend the safety and efficiency of urban transportation in developing countries. These systems involve vehicles, drivers, passengers, road operators and managers of transport services, all interacting with each other and with the surroundings to boost the security and capacity of road systems. The goal of urban corridor management using ITS in road transport is to achieve improvements in mobility, safety, and the productivity of the transportation system within the available facilities, through the integrated application of advanced monitoring, communications, computing, display, and control technologies, both in the vehicle and on the road. Intelligent transportation systems are a product of the revolution in information and communications technologies that is the hallmark of the digital age; the basic ITS technology is oriented along three main directions: communications, information, and integration. Information acquisition (collection), processing, integration, and sorting are the basic activities of ITS. This paper presents the work carried out to interpret and evaluate the performance of a 27.4 km long study corridor with eight intersections and four flyovers, consisting of six-lane as well as eight-lane divided road sections. Two categories of data were collected: traffic data (traffic volume, spot speed, delay) and road characteristics data (number of lanes, lane width, bus stops, mid-block sections, intersections, flyovers). The instruments used for collecting the data were a video camera, stop watch, radar gun, and mobile GPS (GPS Tracker Lite). From the analysis, the performance interpretation included the identification of peak and off-peak hours, congestion and level of service (LOS) at mid-block sections, and delay, followed by plotting the speed contours. The paper proposes urban corridor management strategies, based on sensors integrated into both vehicles and roads, that are efficiently executable, cost-effective, and familiar to road users. They will be useful in reducing congestion, fuel consumption, and pollution so as to provide comfort, safety, and efficiency to users.
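
Delay and level of service, as used above, are commonly linked through the HCM control-delay bands for signalized intersections. The sketch below uses the widely cited thresholds (LOS A at 10 s or less through LOS F above 80 s) as an assumption for illustration, not as values taken from the paper.

```python
def los_from_control_delay(delay_s):
    """Map average control delay (s/veh) at a signalized intersection to a level of
    service, using the commonly cited HCM bands (assumed, not from the study)."""
    bands = [(10, "A"), (20, "B"), (35, "C"), (55, "D"), (80, "E")]
    for limit, grade in bands:
        if delay_s <= limit:
            return grade
    return "F"

# Peak-hour vs off-peak delay at one of the eight study intersections (made-up values):
print(los_from_control_delay(72.4))  # 'E' -- near-saturated peak flow
print(los_from_control_delay(18.0))  # 'B' -- free-flowing off-peak
```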

Keywords: ITS strategies, congestion, planning, mobility, safety

Procedia PDF Downloads 179
7784 Evaluating the Feasibility of Chemical Dermal Exposure Assessment Model

Authors: P. S. Hsi, Y. F. Wang, Y. F. Ho, P. C. Hung

Abstract:

The aim of the present study was to explore dermal exposure assessment models for chemicals that have been developed abroad and to evaluate the feasibility of a chemical dermal exposure assessment model for the manufacturing industry in Taiwan. We applied and analyzed six semi-quantitative risk management tools: UK – Control of Substances Hazardous to Health (COSHH), Europe – Risk Assessment of Occupational Dermal Exposure (RISKOFDERM), Netherlands – Dose-Related Effect Assessment Model (DREAM), Netherlands – Stoffenmanager (STOFFEN), Nicaragua – Dermal Exposure Ranking Method (DERM), and USA/Canada – Public Health Engineering Department (PHED). Five types of manufacturing industry were selected for evaluation. Monte Carlo simulation was used to analyze the sensitivity of each factor, and the correlation between the assessment results of each semi-quantitative model and the exposure factors used in the model was analyzed to identify the important evaluation indicators of the dermal exposure assessment models. To assess the effectiveness of the semi-quantitative assessment models, this study also produced quantitative dermal exposure estimates using a prediction model and verified the correlation via Pearson's test. Results show that COSHH was unable to discriminate among its decision factors because the results for all industries fell into the same risk level. In the DERM model, the transmission process, the exposed area, and the clothing protection factor are all positively correlated. In the STOFFEN model, the fugitive emission, the operation, the near-field and far-field concentrations, and the operating time and frequency have a positive correlation. There is a positive correlation between skin exposure, relative working time, and the working environment in the DREAM model. In the RISKOFDERM model, the actual exposure situation and exposure time have a positive correlation. We also found high correlations for the DERM and RISKOFDERM models, with correlation coefficients of 0.92 and 0.93 (p < 0.05), respectively. The STOFFEN and DREAM models showed poor correlation, with coefficients of 0.24 and 0.29 (p > 0.05), respectively. According to the results, both the DERM and RISKOFDERM models are suitable for use in the selected manufacturing industries. However, considering the small sample size evaluated in this study, more categories of industries should be evaluated to reduce the uncertainty and enhance the applicability of the results in the future.
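
The Pearson checks between the semi-quantitative scores and the quantitative exposure estimates can be reproduced in a few lines. The sketch below uses SciPy with made-up scores for five plants rather than the study's data.

```python
from scipy.stats import pearsonr

# Hypothetical scores for five plants: semi-quantitative model output vs. the
# quantitative dermal-exposure estimate from the prediction model (arbitrary units).
derm_scores = [12, 25, 31, 44, 58]
quantitative = [0.8, 1.9, 2.4, 3.6, 4.9]

r, p_value = pearsonr(derm_scores, quantitative)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A high r with p < 0.05 (as reported for DERM and RISKOFDERM) supports using the
# semi-quantitative tool as a screening proxy for the quantitative estimate.
```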

Keywords: dermal exposure, risk management, quantitative estimation, feasibility evaluation

Procedia PDF Downloads 169
7783 Empirical Evaluation of Game Components Based on Learning Theory: A Preliminary Study

Authors: Seoi Lee, Dongjoo Chin, Heewon Kim

Abstract:

Gamification refers to a technique that applies game elements to non-game contexts, such as education and exercise, to make people more engaged in these behaviors. The purpose of this study was to identify the elements of gamification that are effective in changing human behavior. To accomplish this purpose, a survey based on learning theory was developed, especially for assessing the antecedents and consequences of behaviors, and 8 popular and 8 unpopular games were selected for comparison. A total of 407 adult males and females were recruited via a crowdsourcing Internet marketplace and completed the survey, which consisted of 19 questions on antecedents and 14 questions on consequences. Results showed no significant differences between popular and unpopular games on the consequence questions. For the antecedent questions, popular games were superior to unpopular games in character customization, play type selection, a sense of belonging, patch update cycle, and influence or dominance. This study is significant in that it reveals elements of gamification grounded in learning theory. Future studies need to empirically validate whether these factors affect behavioral change.

Keywords: gamification, learning theory, antecedent, consequence, behavior change, behaviorism

Procedia PDF Downloads 223
7782 Functional Poly(Hedral Oligomeric Silsesquioxane) Nano-Spacer to Boost Quantum Resistive Vapour Sensors’ Sensitivity and Selectivity

Authors: Jean-Francois Feller

Abstract:

The analysis of the volatolome emitted by the human body with a sensor array (e-nose) is a promising method for clinical applications, producing an olfactive fingerprint characteristic of a person's health state. However, the amounts of volatile organic compounds (VOCs) to detect, which lie in the range of parts per billion (ppb), and their diversity (several hundred) justify developing ever more sensitive and selective vapour sensors to improve the discrimination ability of the e-nose. Quantum resistive vapour sensors (vQRS) made with nanostructured conductive polymer nanocomposite transducers have shown great versatility in both their fabrication and their operation for detecting volatiles of interest such as cancer biomarkers. However, it has been shown that their chemo-resistive response is highly dependent on the quality of the inter-particular junctions in the percolated architecture. The present work investigates the effectiveness of poly(hedral oligomeric silsesquioxane) acting as a nanospacer to amplify the disconnectability of the conducting network and thus maximize the vQRS's sensitivity to VOCs.

Keywords: volatolome, quantum resistive vapour sensor, nanostructured conductive polymer nanocomposites, olfactive diagnosis

Procedia PDF Downloads 22
7781 The Role of Dialogue in Shared Leadership and Team Innovative Behavior Relationship

Authors: Ander Pomposo

Abstract:

Purpose: The aim of this study was to investigate the impact that dialogue has on the relationship between shared leadership and innovative behavior, and the importance of dialogue in innovation. This study aims to contribute to the literature by providing theorists and researchers with a better understanding of how to move forward in the study of moderator variables in the relationship between shared leadership and team outcomes such as innovation. Methodology: A systematic review of the literature, originally adopted from the medical sciences but also used in management and leadership studies, was conducted to synthesize research in a systematic, transparent and reproducible manner. A final sample of 48 empirical studies was scientifically synthesized. Findings: Shared leadership gives a better solution to team management challenges and goes beyond the classical, hierarchical, or vertical leadership models based on the individual leader approach. One of the outcomes that emerges from shared leadership is team innovative behavior. To intensify the relationship between shared leadership and team innovative behavior, and to understand when it is more effective, the moderating effects of other variables in this relationship should be examined. This synthesis of the empirical studies revealed that dialogue is a moderator variable that has an impact on the relationship between shared leadership and team innovative behavior when leadership is understood as a relational process. Dialogue is an activity between at least two speech partners trying to fulfill a collective goal and is a way of living open to people and ideas through interaction. Dialogue is productive when team members engage relationally with one another. When this happens, participants are more likely to take responsibility for the tasks they are involved in and for the relationships they have with others. In this relational engagement, participants are likely to establish high-quality connections with a high degree of generativity. This study suggests that organizations should facilitate dialogue among team members in shared leadership, which has a positive impact on innovation and offers a more adaptive framework for the leadership needed in teams working on complex tasks. These results uncover the necessity of more research on the role that dialogue plays in contributing to important organizational outcomes such as innovation. Case studies describing both best practices and obstacles of dialogue in team innovative behavior are necessary to gain a more detailed insight into the field. It will be interesting to see how all these fields of research evolve and are implemented in dialogue practices in the organizations that use team-based structures to deal with uncertainty, fast-changing environments, globalization and increasingly complex work.

Keywords: dialogue, innovation, leadership, shared leadership, team innovative behavior

Procedia PDF Downloads 182
7780 Study of the Mega–Landslide at the Community of Ropoto, Central Greece, and of the Design of Mitigation and Early Warning System Using the Fiber Bragg Grating Technology

Authors: Michael Bellas, George Voulgaridis

Abstract:

This paper refers to the well-known mega-landslide that occurred at the community of Ropoto, belonging to the Municipality of Trikala in the central part of Greece. The landslide affected the debris as well as the colluvium mantle of the flysch, and constitutes a special case study in engineering geology and geotechnical engineering, not only because of the size of the affected domain (approximately 750 m long) but also because of the geostructure's global behavior. Due to the landslide, the whole community's infrastructure collapsed massively and human lives were put in danger. After a complete simulation of the coupled seepage-deformation phenomenon caused by extreme rainfall, and by closely examining the slope's global behavior, both the mitigation of the landslide and an advanced surveillance method using fiber-optic Fiber Bragg Grating sensors were studied, in order to retain the geostructure and to monitor its health through an early warning system that would serve as a complete safety net for both the community's infrastructure and the lives of its inhabitants.
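
For background on the proposed sensing method, the Fiber Bragg Grating principle ties the Bragg wavelength shift to strain and temperature through the standard relation below, where p_e is the effective photo-elastic coefficient, α the thermal expansion coefficient and ξ the thermo-optic coefficient of the fiber. This is reproduced as general context, not taken from the paper.

```latex
\frac{\Delta\lambda_B}{\lambda_B} \;=\; (1 - p_e)\,\varepsilon \;+\; (\alpha + \xi)\,\Delta T
```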

Keywords: landslide, remediation measures, the finite element method (FEM), Fiber Bragg Grating (FBG) sensing method

Procedia PDF Downloads 329