Search results for: starting points
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3433

2293 Identification of Soft Faults in Branched Wire Networks by Distributed Reflectometry and Multi-Objective Genetic Algorithm

Authors: Soumaya Sallem, Marc Olivas

Abstract:

This contribution presents a method for detecting, locating, and characterizing soft faults in a complex wired network. The proposed method is based on multi-carrier reflectometry (MCTDR, Multi-Carrier Time Domain Reflectometry) combined with a multi-objective genetic algorithm. In order to ensure complete network coverage and eliminate diagnosis ambiguities, the MCTDR test signal is injected at several points of the network, and the data are merged between the different reflectometers (sensors) distributed over the network. An adapted multi-objective genetic algorithm is used to merge the data in order to obtain more accurate fault location and characterization. The performance of the proposed method is evaluated from numerical and experimental results.
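
For illustration only, the sketch below shows the data-fusion idea in a heavily simplified setting: a single straight cable with a reflectometer at each end, each producing a hypothetical noisy distance-to-fault estimate, and a weighted-sum genetic search standing in for the authors' multi-objective genetic algorithm. The cable length, fault position, and noise level are assumptions, not values from the paper.

# Minimal sketch (assumptions: a single straight cable of known length, two
# reflectometers at its ends, each providing a noisy distance-to-fault estimate;
# the real method works on branched networks and merges full MCTDR responses).
import random

CABLE_LENGTH = 100.0            # metres (hypothetical)
true_fault = 37.5               # metres from sensor A (hypothetical)

# Noisy distance estimates as seen from each end of the cable
meas_a = true_fault + random.gauss(0.0, 0.8)
meas_b = (CABLE_LENGTH - true_fault) + random.gauss(0.0, 0.8)

def objectives(x):
    """Per-sensor mismatch between a candidate fault position x and the measurements."""
    return abs(x - meas_a), abs((CABLE_LENGTH - x) - meas_b)

def fitness(x, w=(0.5, 0.5)):
    """Weighted-sum scalarization of the two objectives (a common simplification
    of a true multi-objective GA such as NSGA-II)."""
    f1, f2 = objectives(x)
    return w[0] * f1 + w[1] * f2

def genetic_search(pop_size=40, generations=60, mutation_sigma=2.0):
    pop = [random.uniform(0.0, CABLE_LENGTH) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                      # arithmetic crossover
            child += random.gauss(0.0, mutation_sigma) # Gaussian mutation
            children.append(min(max(child, 0.0), CABLE_LENGTH))
        pop = parents + children
    return min(pop, key=fitness)

estimate = genetic_search()
print(f"estimated fault position: {estimate:.2f} m (true: {true_fault} m)")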

Keywords: wired network, reflectometry, network distributed diagnosis, multi-objective genetic algorithm

Procedia PDF Downloads 175
2292 The Role of Metallic Mordant in Natural Dyeing Process: Experimental and Quantum Study on Color Fastness

Authors: Bo-Gaun Chen, Chiung-Hui Huang, Mei-Ching Chiang, Kuo-Hsing Lee, Chia-Chen Ho, Chin-Ping Huang, Chin-Heng Tien

Abstract:

It is known that the natural dyeing of cloth results in moderate color but poor color fastness. This study points out the correlation between the macroscopic color fastness of a natural dye on cotton fiber and the microscopic binding energy of the dye molecule to the cellulose. With an added metallic mordant, the newly formed coordination bond bridges the dye to the fiber surface and thus affects the color fastness as well as the color appearance. A density functional theory (DFT) calculation is therefore used to explore the most probable mechanism during the dyeing process. Finally, the experimental results reflect the strong effect of three different metal ions on naturally dyed cloth.

Keywords: binding energy, color fastness, density functional theory (DFT), natural dyeing, metallic mordant

Procedia PDF Downloads 532
2291 The Investigation of Cracking on the Shell of Dryers (tag No. 2DR-1745 and DR-1402) in Shahid Tondguyan Petrochemical Company (STPC)

Authors: Ali Haghiri

Abstract:

This research investigates the cause of stress corrosion cracking on the shells of dryer equipment (2DR-1745 and DR-1402) in Shahid Tondguyan Petrochemical Company (STPC). These dryers dry terephthalic acid powder in the CTA2 and CTA1 units. After passing through the RVF equipment, the wet cake, with a moisture content of about 14% and a temperature of 90°C, is converted into a dry cake with a moisture content of less than 0.1% and a final temperature of about 140°C, and is then sent to the Final Silo (FS-1820). After the operations department reported the observation of acid leakage under the primary insulation, it was decided that this issue must be investigated at the first opportunity. So, after the shutdown of the units on 2012/10/20 (2DR-1745) and 2021/11/24 (DR-1402) and after washing the dryer walls, the insulation around the walls was opened and cracking and leakage were found at several points.

Keywords: stress corrosion cracking, residual stress, austenitic stainless steel, Br- ion

Procedia PDF Downloads 141
2290 Economic Analysis of an Integrated Anaerobic Digestion and Ozonolysis System

Authors: Tshilenge Kabongo, John Kabuba

Abstract:

Distillery wastewater has become a major issue in the sanitation sector. One of the solutions for dealing with this sewage is to install a wastewater treatment plant, and an economic analysis is fundamentally required to establish its viability. In an integrated anaerobic digestion and advanced oxidation (AD-AOP) process for the treatment of distillery wastewater (DWW), anaerobic digestion achieved biochemical oxygen demand (BOD) and chemical oxygen demand (COD) removals of 95% and 75%, respectively, and methane production of 0.292 L/g COD removed at an organic loading rate of 15 kg COD/m3/d. However, a considerable amount of biorecalcitrant compounds still existed in the anaerobically treated effluent, contributing to a residual COD of 4.5 g/L and an intense dark brown color. To remove the biorecalcitrant color and COD, ozonation, which is an AOP, was introduced as a post-treatment method to AD. Ozonation is a highly competitive treatment technique that can be easily applied to remove biorecalcitrant compounds, including color and turbidity. In the ozonation process, carried out for an hour, more than 80% of the color was removed at an ozone dose of 45 mg O3/L/min (corresponding to 1.8 g O3/g COD). Thus, integrating AD with the AOP can be effective for organic load and color reduction during the treatment of DWW. The deliverable established the best configuration of the AD-AOP system, in which DWW is first subjected to AD followed by AOP post-treatment. However, to establish the feasibility of industrial application of the integrated system, it is necessary to carry out an economic analysis. This may serve as the starting point for estimating the wastewater treatment plant construction cost and its operation and maintenance costs.

Keywords: distillery wastewater, economic analysis, integrated anaerobic digestion, ozonolysis, treatment

Procedia PDF Downloads 112
2289 Demographic Dividend and Creation of Human and Knowledge Capital in Liberal India: An Endogenous Growth Process

Authors: Arjun K., Arumugam Sankaran, Sanjay Kumar, Mousumi Das

Abstract:

The paper analyses the existence of an endogenous growth scenario emanating from the demographic dividend in India during the liberalization period starting from 1980. The demographic dividend creates fertile ground for the cultivation of human and knowledge capital, contributing to technological progress, which can be measured using total factor productivity. The relationship among total factor productivity and human and knowledge capital is examined in an open endogenous framework for the period 1980-2016. Control variables such as foreign direct investment, trade openness, and energy consumption are also employed. The data are sourced from the Reserve Bank of India, the World Bank, the International Energy Agency, and the National Science and Technology Management Information System. To understand the dynamic association among the variables, the ARDL bounds approach to cointegration, followed by the Toda-Yamamoto causality test, is used. The results reveal a short-run and long-run relationship among the variables, supported by the existence of causality. This calls for an integrated policy to build and augment human capital and research and development activities to sustain and accelerate growth and development in the nation.
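
As a simplified stand-in for the causality step, the sketch below runs a plain Granger-causality test inside a VAR rather than the full Toda-Yamamoto procedure (which fits a lag-augmented VAR and applies a Wald test on the original lags). The series are synthetic placeholders, not the authors' data, and the lag order is chosen only for illustration.

# Simplified sketch: Granger causality within a VAR on placeholder series.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 37  # annual observations, 1980-2016
human_capital = np.cumsum(rng.normal(0.02, 0.01, n))           # placeholder index
tfp = 0.5 * np.roll(human_capital, 1) + rng.normal(0, 0.01, n)  # placeholder index
df = pd.DataFrame({"tfp": tfp, "human_capital": human_capital}).diff().dropna()

model = VAR(df)
res = model.fit(maxlags=3, ic="aic")                         # lag order chosen by AIC
test = res.test_causality("tfp", ["human_capital"], kind="wald")
print(test.summary())   # H0: human_capital does not Granger-cause tfp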

Keywords: demographic dividend, young population, open endogenous growth models, human and knowledge capital

Procedia PDF Downloads 130
2288 Efficient Internal Generator Based on Random Selection of an Elliptic Curve

Authors: Mustapha Benssalah, Mustapha Djeddou, Karim Drouiche

Abstract:

The random number generation (RNG) presents a significant importance for the security and the privacy of numerous applications, such as RFID technology and smart cards. Since, the quality of the generated bit sequences is paramount that a weak internal generator for example, can directly cause the entire application to be insecure, and thus it makes no sense to employ strong algorithms for the application. In this paper, we propose a new pseudo random number generator (PRNG), suitable for cryptosystems ECC-based, constructed by randomly selecting points from several elliptic curves randomly selected. The main contribution of this work is the increasing of the generator internal states by extending the set of its output realizations to several curves auto-selected. The quality and the statistical characteristics of the proposed PRNG are validated using the Chi-square goodness of fit test and the empirical Special Publication 800-22 statistical test suite issued by NIST.
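
As an illustration of the general idea only (not the authors' construction), the toy sketch below implements point arithmetic on two small, hypothetical elliptic curves over prime fields and emits output bits from the x-coordinates of successive scalar multiples, hopping between the curves; a real generator would use standardized curves, proper seeding, and a sound extraction rule.

# Toy sketch of an elliptic-curve based PRNG: two tiny curves y^2 = x^3 + a*x + b (mod p),
# output bits taken from x-coordinates of successive multiples of a base point, alternating
# between curves. Curve parameters are illustrative assumptions and far too small to be secure.
def ec_add(P, Q, a, p):
    """Add two points on y^2 = x^3 + a*x + b over GF(p); None is the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def ec_mul(k, P, a, p):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

# Two toy curves (p, a, base point); b is implicit in the chosen base points (hypothetical values).
CURVES = [
    (9739, 497, (1804, 5368)),
    (7919, 3,   (2, 3668)),
]

def prng_bits(seed, n_bits):
    """Emit n_bits by alternating curves and extracting the low bit of x([seed+i]G)."""
    bits = []
    i = 1
    while len(bits) < n_bits:
        p, a, G = CURVES[i % len(CURVES)]
        P = ec_mul(seed + i, G, a, p)
        if P is not None:
            bits.append(P[0] & 1)
        i += 1
    return bits

print("".join(map(str, prng_bits(seed=12345, n_bits=32))))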

Keywords: PRNG, security, cryptosystem, ECC

Procedia PDF Downloads 428
2287 Localization Mobile Beacon Using RSSI

Authors: Sallama Resen, Celal Öztürk

Abstract:

Distance estimation between two nodes has a wide scope of surveillance and tracking applications. This paper suggests Bluetooth Low Energy (BLE) technology as a medium for transmitting and receiving signals in small indoor areas. As an example, BLE communication technology is used in the child-safety domain. A local network is designed to detect a child's position in an indoor school area, consisting of Mobile Beacons (MB), Access Points (AP), and Smartphones (SP), where the MBs are fitted in children's shoes as wearable sensors. This paper presents a technique that can detect the mobile beacons' positions and help find children's locations within a dynamic environment. By means of Bluetooth beacons attached to a child's shoes, the distance between the MB and the teacher's SP is estimated with an accuracy of less than one meter. The simulation results show that high accuracy of the position coordinates is achieved for multiple mobile beacons in different environments.
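
A common way to turn RSSI readings into distances is the log-distance path-loss model; the sketch below applies it and then trilaterates a 2D position from three receivers. The transmit power, path-loss exponent, beacon coordinates, and RSSI readings are illustrative assumptions, not parameters reported in the paper.

# Sketch: RSSI -> distance via the log-distance path-loss model, then a least-squares
# trilateration from three fixed receivers. All numeric parameters are assumptions.
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_n=2.0):
    """d = 10 ** ((TxPower - RSSI) / (10 * n)), distance in metres.
    tx_power_dbm is the RSSI expected at 1 m; n is the environment-dependent exponent."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_n))

def trilaterate(anchors, distances):
    """Linearized least-squares 2D position estimate from >= 3 anchors."""
    (x1, y1), d1 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # (x, y)

anchors = [(0.0, 0.0), (6.0, 0.0), (0.0, 8.0)]   # hypothetical AP/SP positions (m)
rssi = [-65.0, -72.0, -70.0]                     # hypothetical readings (dBm)
dists = [rssi_to_distance(r) for r in rssi]
print("estimated distances (m):", [round(d, 2) for d in dists])
print("estimated position (m):", trilaterate(anchors, dists))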

Keywords: bluetooth low energy, child safety, mobile beacons, received signal strength

Procedia PDF Downloads 328
2286 Homeopathic Approach in a Dog with Idiopathic Epilepsy - Case Report

Authors: Barbosa M. L. S., von Ancken A. C. B., Coelho C. P.

Abstract:

In order to improve the treatment of epileptic dogs, this case report aims to describe the use of the homeopathic medicine Cicuta virosa for the treatment of seizures in dogs that already use allopathy to control them. As each patient presents symptoms individually, the choice of homeopathic medication must also be individualized. The patient, treated in the municipality of Ribeirão Pires, São Paulo, Brazil, was an animal of the canine species, female, 7 years old, mixed breed (SRD), with a two-year history of generalized tonic-clonic seizures at a variable frequency of 1-2 seizures per day. With no identifiable etiology, the patient used phenobarbital daily, and the dose of the medication was increased according to the frequency of seizures. The serum concentration of phenobarbital, measured in a blood sample within 12 hours of its administration, was within the reference range. The patient experienced weight gain and intermittent sedation. The choice of the homeopathic medicine Cicuta virosa 6 cH, prepared according to the Brazilian Homeopathic Pharmacopoeia, was due to its characteristic action on the nervous system, especially in epileptic animals that present seizures, extremely violent spasmodic contractions of the muscles of the whole body starting from the head and mouth, with rigidity and opisthotonos, extreme agitation, and multiple contortions. The animal was submitted to treatment with 2 globules orally twice a day for 30 days. The treatment resulted in a clinical cure, as there were no more seizures, and was thus effective in controlling this symptom.

Keywords: homeopathy, cicuta virosa, epilepsy, veterinary medicine

Procedia PDF Downloads 92
2285 External Validation of Established Pre-Operative Scoring Systems in Predicting Response to Microvascular Decompression for Trigeminal Neuralgia

Authors: Kantha Siddhanth Gujjari, Shaani Singhal, Robert Andrew Danks, Adrian Praeger

Abstract:

Background: Trigeminal neuralgia (TN) is a heterogeneous pain syndrome characterised by short paroxysms of lancinating facial pain in the distribution of the trigeminal nerve, often triggered by usually innocuous stimuli. TN has a low prevalence of less than 0.1%, of which 80% to 90% is caused by compression of the trigeminal nerve by an adjacent artery or vein. The root entry zone of the trigeminal nerve is most sensitive to neurovascular conflict (NVC), causing dysmyelination. Whilst microvascular decompression (MVD) is an effective treatment for TN with NVC, not all patients achieve long-term pain relief. Pre-operative scoring systems by Panczykowski and Hardaway have been proposed but have not been externally validated. These pre-operative scoring systems are composite scores calculated according to the subtype of TN, the presence and degree of neurovascular conflict, and the response to medical treatments. There is discordance between neurosurgeons and radiologists in the assessment of NVC identified on pre-operative magnetic resonance imaging (MRI). To our best knowledge, the prognostic impact for MVD of this difference of interpretation has not previously been investigated in the form of a composite scoring system such as those suggested by Panczykowski and Hardaway. Aims: This study aims to identify prognostic factors and externally validate the proposed scoring systems by Panczykowski and Hardaway for TN. A secondary aim is to investigate the prognostic difference between a neurosurgeon's interpretation of NVC on MRI and a radiologist's. Methods: This retrospective cohort study included 95 patients who underwent de novo MVD in a single neurosurgical unit in Melbourne. Data were recorded from patients' hospital records and the neurosurgeon's correspondence from perioperative clinic reviews. Patient demographics, type of TN, distribution of TN, response to carbamazepine, and the neurosurgeon's and radiologist's interpretations of NVC on MRI were clearly described prospectively and preoperatively in the correspondence. Scoring systems published by Panczykowski et al. and Hardaway et al. were used to determine composite scores, which were compared with the recurrence of TN recorded during follow-up over 1 year. Categorical data were analysed using Pearson chi-square testing. Independent numerical and nominal data were analysed with logistic regression. Results: Logistic regression showed that a Panczykowski composite score of greater than 3 points was associated with a higher likelihood of pain-free outcome 1 year post-MVD, with an OR of 1.81 (95%CI 1.41-2.61, p=0.032). The composite score using the neurosurgeon's impression of NVC had an OR of 2.96 (95%CI 2.28-3.31, p=0.048). A Hardaway composite score of greater than 2 points was associated with a higher likelihood of pain-free outcome 1 year post-MVD, with an OR of 3.41 (95%CI 2.58-4.37, p=0.028). The composite score using the neurosurgeon's impression of NVC had an OR of 3.96 (95%CI 3.01-4.65, p=0.042). Conclusion: Composite scores developed by Panczykowski and Hardaway were validated for the prediction of response to MVD in TN. A composite score based on the neurosurgeon's interpretation of NVC on MRI had a greater correlation with pain-free outcomes 1 year post-MVD than one based on the radiologist's.
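
For readers unfamiliar with how the reported odds ratios arise, the sketch below fits a logistic regression on synthetic placeholder data (not the study cohort) and converts the coefficient for a dichotomised composite score into an odds ratio with its 95% confidence interval; the effect size and sample are assumptions for illustration only.

# Sketch: odds ratio and 95% CI from a logistic regression on simulated data that merely
# mimics the structure of the study (binary composite-score indicator vs. pain-free outcome).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 95
high_score = rng.integers(0, 2, n)                        # composite score above threshold (0/1)
logit_p = -0.3 + 0.8 * high_score                          # placeholder effect size
pain_free = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))    # simulated outcome at 1 year

df = pd.DataFrame({"pain_free": pain_free, "high_score": high_score})
res = smf.logit("pain_free ~ high_score", data=df).fit(disp=False)

odds_ratio = np.exp(res.params["high_score"])
ci_low, ci_high = np.exp(res.conf_int().loc["high_score"])
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}, p = {res.pvalues['high_score']:.3f})")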

Keywords: de novo microvascular decompression, neurovascular conflict, prognosis, trigeminal neuralgia

Procedia PDF Downloads 58
2284 Instrumentation for Engine Start Cycle Characterization at Cold Weather High Altitude Condition

Authors: Amit Kumar Gupta, Rohit Vashistha, G. P. Ravishankar, Mahesh P. Padwale

Abstract:

A cold-soaked gas turbine engine has known starting problems in high-altitude and low-temperature conditions. The high altitude results in lower ambient temperature, pressure, and density. Soaking at low temperature leads to higher oil viscosity, increasing the engine starter system torque requirement. Also, a low-temperature soak results in a cold compressor rotor and casing. Since the thermal mass of the rotor is higher than that of the casing, the casing expands faster, thereby increasing the blade-casing tip clearance. The low-pressure flow over the compressor blades, coupled with the secondary flow through the compressor tip clearance during start, results in stall inception. The present study discusses the engine instrumentation required for capturing the stall inception event. The engine fan exit and combustion chamber were instrumented with dynamic pressure probes to capture the pressure characteristics, and a clamp-on current meter on the primary igniter cable was used to capture the ignition event during the start cycle. The experiment was carried out at 10,500 ft pressure altitude and -15°C ambient temperature. High-pressure compressor stall events were recorded during the starts.

Keywords: compressor inlet, dynamic pressure probe, engine start cycle, flight test instrumentation

Procedia PDF Downloads 301
2283 Precision Assessment of the Orthometric Heights Determination in the Northern Part of Libya

Authors: Jamal A. Gledan, Akrm H. Algnin

Abstract:

The Global Positioning System (GPS) satellite-based technology has been utilized extensively in the last few years in a wide range of Geomatics and Geographic Information Systems (GIS) applications. One of the main challenges of dealing with GPS-based heights is converting them into Mean Sea Level (MSL) heights, which are used in surveying and mapping. In this research work, height differences at 50 points in the northern part of Libya were determined using both ordinary levelling (in which the geoid is the reference datum) and GPS techniques (in which the ellipsoid is the reference datum). In addition, this study utilized the EGM2008 model to obtain the undulation values between the ellipsoidal and orthometric heights. From these values, together with the ellipsoidal heights obtained from GPS observations, the orthometric heights can be computed. This research presents a suitable and economical alternative to the expensive traditional levelling technique, particularly for topographic mapping.
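
The conversion underlying this approach is the standard relation H ≈ h − N, where h is the GPS ellipsoidal height, N the geoid undulation (here taken from EGM2008), and H the orthometric (MSL) height. The sketch below applies it to hypothetical values, since the paper's point data are not reproduced here.

# Sketch of the basic height conversion H = h - N used with GPS observations and a geoid model.
# The sample heights and undulations below are hypothetical, not the 50 Libyan points.
def orthometric_height(ellipsoidal_h, geoid_undulation_n):
    """Orthometric (MSL) height from ellipsoidal height and geoid undulation, in metres."""
    return ellipsoidal_h - geoid_undulation_n

points = [
    {"name": "P1", "h": 215.42, "N": 32.10},   # hypothetical values (m)
    {"name": "P2", "h": 187.05, "N": 31.87},
]
for p in points:
    print(p["name"], f"H = {orthometric_height(p['h'], p['N']):.2f} m")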

Keywords: geoid undulation, GPS, ordinary and geodetic levelling, orthometric height

Procedia PDF Downloads 419
2282 Determination of Steel Cleanliness of Non-Grain Oriented Electrical Steels

Authors: Emre Alan, Zafer Cetin

Abstract:

Electrical steels are widely used as magnetic core materials in many electrical applications such as transformers, electric motors, and generators. The core loss property of these magnetic materials refers to the dissipation of electrical energy during magnetization under service conditions. Therefore, in order to minimize the magnetic core loss, certain precautions are taken by steel producers; steel cleanliness is one of the major points among them. To obtain lower core loss values, increasing the content of certain elements in the chemical composition, such as silicon, is a must. Therefore, the impurity levels of these alloys are key to producing cleaner steel. In this study, the effects of the impurity levels of different FeSi alloying materials on steel cleanliness are investigated. One of the important element contents in FeSi alloying materials is calcium. An SEM investigation is carried out in order to determine whether the Ca content in the FeSi alloy is sufficient for proper inclusion modification or whether an additional Ca treatment is required.

Keywords: electrical steels, FeSi alloy, impurities, steel cleanliness

Procedia PDF Downloads 318
2281 Fire Safety Engineering of Wood Dust Layer or Cloud

Authors: Marzena Półka, Bożena Kukfisz

Abstract:

This paper presents an analysis of dust explosion hazards in the process industries. It includes selected testing methods for dust explosibility and presents two of them according to the experimental standards used by the Department of Combustion and Fire Theory at the Main School of Fire Service in Warsaw. The article presents values of the maximum acceptable surface temperature (MAST) of machines operating in the presence of a dust cloud and of a chosen dust layer with thicknesses of 5 and 12.5 mm. The comparative analysis points to the conclusion that the value of the minimum ignition temperature of the layer (MITL) and the minimum ignition temperature of the dust cloud (MTCD) depend on the granularity of the substance. Increasing the thickness of the dust layer reduces the minimum ignition temperature of the dust layer. At the same time, increasing the thickness of the dust extends the flameless combustion and delays ignition.

Keywords: fire safety engineering, industrial hazards, minimum ignition temperature, wood dust

Procedia PDF Downloads 297
2280 Temperature Control Improvement of Membrane Reactor

Authors: Pornsiri Kaewpradit, Chalisa Pourneaw

Abstract:

Temperature control improvement of a membrane reactor with an exothermic and reversible esterification reaction is studied in this work. It is well known that a batch membrane reactor requires different control strategies from a continuous one, due to the fact that it is operated dynamically. Because of the effect of the operating temperature, a suitable control scheme has to be designed based on a reliable predictive model in order to achieve a desired objective. In this study, an optimization framework was first formulated to determine an optimal temperature trajectory that maximizes the desired product. In the model predictive control scheme, a set of predictive models was initially developed corresponding to the possible operating points of the system. The multiple predictive control moves are then calculated on-line using the developed model corresponding to the current operating point. The simulation results clearly show that the temperature control is improved compared with the performance obtained by a conventional predictive controller. Further robustness tests were also investigated in this study.
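
To make the receding-horizon idea concrete, the sketch below runs a minimal model predictive controller on a deliberately simple, hypothetical first-order temperature model (not the reactor model in the paper): at each step it optimizes a short sequence of heating moves against a setpoint and applies only the first move. All parameter values are assumptions.

# Minimal receding-horizon (MPC) sketch on a hypothetical first-order temperature model
#   T[k+1] = a*T[k] + b*u[k]
# At every step a short control sequence is optimized and only the first move is applied.
import numpy as np
from scipy.optimize import minimize

a, b = 0.95, 0.5             # assumed plant parameters
horizon, n_steps = 8, 40
setpoint = 75.0              # desired temperature (assumed)
u_max = 10.0                 # actuator limit (assumed)

def predict(T0, u_seq):
    T, traj = T0, []
    for u in u_seq:
        T = a * T + b * u
        traj.append(T)
    return np.array(traj)

def cost(u_seq, T0):
    traj = predict(T0, u_seq)
    return np.sum((traj - setpoint) ** 2) + 0.01 * np.sum(np.diff(u_seq) ** 2)

T = 25.0                      # initial temperature
u_prev = np.zeros(horizon)
for k in range(n_steps):
    res = minimize(cost, u_prev, args=(T,),
                   bounds=[(0.0, u_max)] * horizon, method="L-BFGS-B")
    u_apply = res.x[0]        # receding horizon: apply only the first move
    T = a * T + b * u_apply
    u_prev = np.roll(res.x, -1)
    if k % 10 == 0:
        print(f"step {k:2d}: u = {u_apply:5.2f}, T = {T:6.2f}")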

Keywords: model predictive control, batch reactor, temperature control, membrane reactor

Procedia PDF Downloads 449
2279 A Statistical Study on Young UAE Driver’s Behavior towards Road Safety

Authors: Sadia Afroza, Rakiba Rouf

Abstract:

Road safety and associated behaviors have received significant attention in recent years, reflecting general public concern. This paper portrays a statistical scenario of young drivers in the UAE, with emphasis on various points of concern in young drivers' behavior and license issuance. Although many factors contribute to road accidents, statistically it is evident that age plays a major role. Despite the strict road safety laws enforced by the UAE government, there is a staggering correlation between road accidents and young drivers in the UAE. However, private organizations like BMW and RoadSafetyUAE have extended their support by conducting surveys on drivers' behavior with the aim of ensuring road safety. Various strategies, such as road safety law enforcement, license issuance, the adoption of new technologies like safety cameras, and raising awareness, can be implemented to improve road safety among young drivers.

Keywords: driving behavior, Graduated Driver Licensing System (GLDS), road safety, UAE drivers, young drivers

Procedia PDF Downloads 237
2278 Computational Team Dynamics and Interaction Patterns in New Product Development Teams

Authors: Shankaran Sitarama

Abstract:

New Product Development (NPD) is invariably a team effort and involves effective teamwork. An NPD team has members from different disciplines coming together and working through the different phases, all the way from the conceptual design phase to production and product roll-out. Creativity and innovation are some of the key factors of successful NPD. Team members going through the different phases of NPD interact and work closely, yet challenge each other during the design phases to brainstorm ideas and later converge to work together. These two traits require the teams to exercise divergent and convergent thinking simultaneously. There needs to be a good balance. The team dynamics invariably result in conflicts among team members. While some amount of conflict (ideational conflict) is desirable in NPD teams to be creative as a group, relational conflicts (or discords among members) can be detrimental to teamwork. Team communication truly reflects these tensions and team dynamics. In this research, team communication (emails) between the members of NPD teams is considered for analysis. The email communication is processed through a latent semantic analysis (LSA) algorithm to analyze the content of communication, and a semantic similarity analysis is used to arrive at a social network graph that depicts the communication amongst team members based on the content of communication. The amount of communication (content, not frequency of communication) defines the interaction strength between the members. A social network adjacency matrix is thus obtained for the team. Standard social network analysis techniques based on the Adjacency Matrix (AM) and the Dichotomized Adjacency Matrix (DAM), derived using network density, yield network graphs and network metrics like centrality. The social network graphs are then rendered for visual representation using a Metric Multi-Dimensional Scaling (MMDS) algorithm for node placement, and arcs connecting the nodes (representing team members) are drawn. The distance between the nodes in the placement represents the tie strength between the members: stronger tie strengths render nodes closer. The overall visual representation of the social network graph provides a clear picture of the team's interactions. This research reveals four distinct patterns of team interaction that are clearly identifiable in the visual representation of the social network graph and have a clearly defined computational scheme. The four computational patterns of team interaction defined are the Central Member Pattern (CMP), the Subgroup and Aloof member Pattern (SAP), the Isolate Member Pattern (IMP), and the Pendant Member Pattern (PMP). Each of these patterns has a team-dynamics implication in terms of the conflict level in the team. For instance, the isolate member pattern clearly points to a near breakdown in communication with the member and hence a possible high conflict level, whereas the subgroup or aloof member pattern points to a non-uniform information flow in the team and some moderate level of conflict. These pattern classifications of teams are then compared and correlated to the real level of conflict in the teams, as indicated by the team members through an elaborate self-evaluation, team reflection, and feedback form, and the results show a good correlation.
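
The described pipeline (email content → LSA → pairwise semantic similarity → adjacency matrix → dichotomization → network metrics) can be sketched as below. The toy email snippets, the LSA dimensionality, and the density-style threshold are illustrative assumptions, and the paper's MMDS layout step is omitted in favour of a simple centrality printout.

# Sketch of the pipeline: LSA on per-member email text, cosine similarity as interaction
# strength, dichotomization by a simple threshold, and a centrality metric.
import numpy as np
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

emails = {                      # hypothetical aggregated mail content per member
    "A": "design review of the concept sketches and tooling options",
    "B": "tooling options and supplier costs for the prototype build",
    "C": "marketing launch plan and customer segmentation study",
    "D": "prototype build schedule, design review actions and test plan",
}
members = list(emails)
X = TfidfVectorizer(stop_words="english").fit_transform(emails.values())
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)   # LSA projection
AM = cosine_similarity(lsa)                                           # adjacency (tie strength)
np.fill_diagonal(AM, 0.0)

threshold = AM[AM > 0].mean()               # simple density-based cut-off (assumption)
DAM = (AM >= threshold).astype(int)         # dichotomized adjacency matrix

G = nx.from_numpy_array(DAM)
G = nx.relabel_nodes(G, dict(enumerate(members)))
print("degree centrality:", nx.degree_centrality(G))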

Keywords: team dynamics, team communication, team interactions, social network analysis (SNA), new product development, latent semantic analysis (LSA), NPD teams

Procedia PDF Downloads 51
2277 Discovering New Organic Materials through Computational Methods

Authors: Lucas Viani, Benedetta Mennucci, Soo Young Park, Johannes Gierschner

Abstract:

Organic semiconductors have attracted the attention of the scientific community in the past decades due to their unique physicochemical properties, allowing new designs and alternative device fabrication methods. Until today, organic electronic devices have been largely based on conjugated polymers, mainly due to their easy processability. In recent years, due to moderate ET and CT efficiencies and the ill-defined nature of polymeric systems, the focus has been shifting to small conjugated molecules with a well-defined chemical structure, easier control of intermolecular packing, and enhanced CT and ET properties. This has led to the synthesis of new small molecules, followed by the growth of their crystalline structure and ultimately by device preparation. This workflow is commonly followed without clear knowledge of the ET and CT properties of the macroscopic systems, which may lead to financial and time losses, since not all materials will deliver the properties and efficiencies demanded by current standards. In this work, we present a theoretical workflow designed to predict the key ET properties of these new materials prior to synthesis, thus speeding up the discovery of new promising materials. It is based on quantum mechanical, hybrid, and classical methodologies, starting from a single-molecule structure, finishing with the prediction of its packing structure, and predicting properties of interest such as static and averaged excitonic couplings and the exciton diffusion length.

Keywords: organic semiconductor, organic crystals, energy transport, excitonic couplings

Procedia PDF Downloads 238
2276 The Economic Impact of State Paid Family Leave and Medical Acts on Working Families with Old and Disabled Adults

Authors: Ngoc Dao

Abstract:

State Paid Leave (PFL) programs complement the federal Family and Medical Leave Act (FMLA) by offering workers time off to take care of their newborns or sick family members, with supplemental income and further job protection. To date, four states (California, New Jersey, Rhode Island, and New York) have implemented paid leave policies. This study adds further understanding of how state PFL policies help working families with elderly parents improve their work balance by examining the effects of paid leave policies on labor outcomes. Early findings suggest that state paid leave policies reduced the likelihood of exiting the labor market by 1.6 percentage points, with larger effects for paid leave policies that include a job protection feature. In addition, the results imply that job protection in paid leave policies matters in helping employed caregivers stay attached to the labor market.

Keywords: family paid leave, working caregivers, employment, social welfare

Procedia PDF Downloads 109
2275 The Effectiveness of Anti-Smoking Campaign towards Young Adults (A Case Study in Bandar Sunway Institution)

Authors: Intan Abida Abu Bakar

Abstract:

This paper investigates the effectiveness of anti-smoking campaigns towards youth in a Bandar Sunway institution. Reports from the Ministry of Health, Malaysia, and the national newspapers in the country reveal that the campaigns were not effective enough to curb smoking in Malaysia. In the past, from 2004 to 2014, the Malaysian Health Ministry was determined to curb the smoking issue that was arising in the country, especially among youths. The “Tak Nak” anti-smoking campaign was launched and broadcast on all forms of media in Malaysia. The campaigns aim to educate and create awareness to encourage people to quit smoking, besides discouraging non-smokers from starting to smoke. The main objective of this research is to investigate and study the concept, storyline, and appeal of the “Tak Nak Merokok” advertisement campaigns from 2004 to 2014. Data from questionnaires and focus group discussions indicate that advertisements containing fear and emotional appeals, with a good concept and storyline, are more appealing and effective than those using humour and informational-rational appeals. This research could serve as a guideline for advertisers who want to come up with creative anti-smoking campaigns in Malaysia. In the future, the focus groups can be expanded, and more feedback and reviews could help marketers and advertisers determine the most suitable advertisements to tackle the smoking issue.

Keywords: effectiveness, anti-smoking campaign, young adults, smoking

Procedia PDF Downloads 237
2274 A Saltwater Battery Inspired by the Membrane Potential Found in Biological Cells

Authors: Ross Lee, Pritpal Singh, Andrew Jester

Abstract:

As the world transitions to a more sustainable energy economy, the deployment of energy storage technologies is expected to increase in order to develop a more resilient grid system. However, current technologies are associated with various environmental and safety issues throughout their entire lifecycle; therefore, new battery technology is necessary for grid applications to curtail these risks. Biological cells, such as human neurons and the electrolytes in the electric eel, can serve as a more sustainable design template for a new bio-inspired (i.e., biomimetic) battery. Within biological cells, an electrochemical gradient across the cell membrane forms the membrane potential, which serves as the driving force for ion transport into and out of the cell, akin to the charging/discharging of a battery cell. This work serves as the first step in developing such a biomimetic battery cell, starting with the fabrication and characterization of ion-selective membranes to facilitate ion transport through the cell. Performance characteristics (e.g., cell voltage, power density, specific energy, round-trip efficiency) for the cell under investigation are compared to those of incumbent battery technologies and biological cells to assess the readiness level of this emerging technology. Using a Na⁺-form Nafion-117 membrane, the cell in this work successfully demonstrated behavior similar to that of human neurons; these findings will inform how cell components can be re-engineered to enhance device performance.

Keywords: battery, biomimetic, electrolytes, human neurons, ion-selective membranes, membrane potential

Procedia PDF Downloads 96
2273 Requirements Management in Agile

Authors: Ravneet Kaur

Abstract:

The concept of Agile Requirements Engineering and Management is not new. However, the struggle to figure out how a traditional Requirements Management Process fits within an Agile framework remains complex. This paper describes a process that can merge an organization's traditional Requirements Management Process nicely into the Agile Software Development Process. This process provides traceability of the Product Backlog to the external documents on one hand and to User Stories on the other. It also gives sufficient evidence that the system will deliver the right functionality with good quality, in the form of various statistics and reports. In a nutshell, by overlaying a process on top of Agile without disturbing the agility, we are able to get synergistic benefits in terms of productivity, profitability, reporting, and end-to-end visibility for all stakeholders. The framework can be used for just-in-time requirements definition or to build a repository of requirements for future use. The goal is to make sure that the business (specifically, the product owner) can clearly articulate what needs to be built and define what is of high quality. To accomplish this, the requirements cycle follows a Scrum-like process that mirrors the development cycle but stays two to three steps ahead. The goal is to create a process by which requirements can be thoroughly vetted, organized, and communicated in a manner that is iterative, timely, and quality-focused. Agile is quickly becoming the most popular way of developing software because it fosters continuous improvement, time-boxed development cycles, and quicker delivery of value to end users. That value will be driven to a large extent by the quality and clarity of the requirements that feed the software development process. An agile, lean, and timely approach to requirements as the starting point will help to ensure that the process is optimized.

Keywords: requirements management, Agile

Procedia PDF Downloads 352
2272 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal

Abstract:

Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated with similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times, and their durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the detection of the waves ridden and their duration can be accurately determined using the smartphone GPS and IMU.
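
The first approach (GPS-only velocity thresholding) can be sketched as follows: compute speed from consecutive GPS fixes with the haversine distance, then mark a wave ride wherever the speed stays above a threshold. The sample track, sampling rate, speed threshold, and minimum ride duration below are assumptions, not the paper's settings.

# Sketch of GPS-based wave-ride detection: speed from consecutive fixes (haversine),
# then contiguous runs above a speed threshold are counted as rides.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 fixes."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def detect_rides(fixes, dt=1.0, speed_threshold=2.0, min_duration=3.0):
    """fixes: list of (lat, lon) sampled every dt seconds. Returns (start_s, duration_s) tuples."""
    speeds = [haversine_m(*fixes[i], *fixes[i + 1]) / dt for i in range(len(fixes) - 1)]
    rides, start = [], None
    for i, v in enumerate(speeds + [0.0]):          # trailing 0 closes an open ride
        if v >= speed_threshold and start is None:
            start = i
        elif v < speed_threshold and start is not None:
            duration = (i - start) * dt
            if duration >= min_duration:
                rides.append((start * dt, duration))
            start = None
    return rides

# Hypothetical 1 Hz track built from per-sample northward speeds (m/s):
# slow paddling (~1 m/s), a ride (~5 m/s), slow again (~1 m/s).
speeds_profile = [1.0] * 10 + [5.0] * 8 + [1.0] * 10
lat, lon = -20.0, -40.0
track = [(lat, lon)]
for v in speeds_profile:
    lat += v / 111320.0            # metres converted to degrees of latitude
    track.append((lat, lon))
print("rides (start_s, duration_s):", detect_rides(track))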

Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance

Procedia PDF Downloads 386
2271 A Genetic Based Algorithm to Generate Random Simple Polygons Using a New Polygon Merge Algorithm

Authors: Ali Nourollah, Mohsen Movahedinejad

Abstract:

In this paper, a new algorithm to generate random simple polygons from a given set of points in a two-dimensional plane is designed. The proposed algorithm uses a genetic algorithm to generate polygons with few vertices. A new merge algorithm is presented which converts any two polygons into a simple polygon. This algorithm first changes the two polygons into a polygonal chain, and then the polygonal chain is converted into a simple polygon. The process of converting a polygonal chain into a simple polygon is based on the removal of intersecting edges. The merge algorithm has a time complexity of O((r+s)*l), where r and s are the sizes of the merged polygons and l is the number of intersecting edges removed from the polygonal chain. It will be shown that 1 < l < r+s. The experimental results show that the proposed algorithm is able to generate a great number of different simple polygons and performs better in comparison to celebrated algorithms such as space partitioning and steady growth.
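
For context, the sketch below shows a much simpler baseline for producing a simple polygon from a random point set: sorting the points by angle around their centroid, which yields a star-shaped and hence simple polygon for points in general position. It is not the genetic merge-based algorithm of the paper, only the kind of naive generator such algorithms are usually compared against.

# Baseline sketch: a simple polygon from random points by sorting them around the centroid.
# Assumes points in general position (no duplicate angles); this is NOT the paper's
# GA + merge algorithm.
import math
import random

def random_simple_polygon(n_points=10, seed=42):
    random.seed(seed)
    pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(n_points)]
    cx = sum(x for x, _ in pts) / n_points
    cy = sum(y for _, y in pts) / n_points
    # Connecting the points in angular order around the centroid keeps non-adjacent
    # edges in disjoint angular wedges, so the polygon does not self-intersect
    # (for points in general position).
    return sorted(pts, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

polygon = random_simple_polygon()
for x, y in polygon:
    print(f"({x:6.2f}, {y:6.2f})")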

Keywords: divide and conquer, genetic algorithm, merge polygons, random simple polygon generation

Procedia PDF Downloads 515
2270 A Study of Welfare State and Indian Democracy by Exploration of Social Welfare Programmes in India

Authors: Kuldeep Singh

Abstract:

The present paper is an attempt to trace the changes in the welfare state in Indian democracy from its starting point till now and aims to critically analyse the social-welfare programmes in India with respect to the welfare state. After gaining independence from the British, India became a welfare state aiming at the upliftment of its citizens. Indian democracy is considered the largest among democratic countries; nevertheless, only after forty-five years of independence did the Panchayati Raj Institution become one of the branches of democratic decentralization in India, through the 73rd and 74th Constitutional Amendments in 1992. Unfortunately, the desired purpose of introducing the Panchayati Raj Institution has not been achieved after all these delayed efforts. The basic problem in achieving a welfare state in India in the true sense is unawareness and non-implementation of these social-welfare programmes. Presently, the Indian government is focusing only on the economic growth of the country and is lacking on the social front. The doctrinal method of research is used in this research paper. In the concluding remarks, the researcher partly favours the government for introducing welfare programmes, as there is an abundance of welfare schemes and programmes, but the majority face implementation problems. Finally, the researcher suggests that these programmes and schemes should be qualitative in nature, that power should be given to effective machinery to check their proper implementation, and that citizens should be made aware of their rights, so that a welfare state can be achieved.

Keywords: democratic decentralization, Indian democracy, Panchayati Raj institution, social-welfare programmes, welfare state

Procedia PDF Downloads 144
2269 Nonlinear Evolution on Graphs

Authors: Benniche Omar

Abstract:

We are concerned with abstract fully nonlinear differential equations of the form y'(t)=Ay(t)+f(t,y(t)), where A is an m-dissipative operator (possibly multi-valued) defined on a subset D(A) of a Banach space X with values in X, and f is a given function defined on I×X with values in X. We consider a graph K in I×X. We recall that K is said to be viable with respect to the above abstract differential equation if, for each initial datum in K, there exists at least one trajectory starting from that initial datum and remaining in K at least for a short time. The viability problem has been studied by many authors using various techniques and frameworks. If K is closed, it is shown that a tangency condition, which is mainly linked to the dynamics, is crucial for viability. In the case when X is infinite dimensional, compactness and convexity assumptions are needed. In this paper, we are concerned with the notion of near viability for a given graph K with respect to y'(t)=Ay(t)+f(t,y(t)). Roughly speaking, the graph K is said to be near viable with respect to y'(t)=Ay(t)+f(t,y(t)) if, for each initial datum in K, there exists at least one trajectory remaining arbitrarily close to K at least for a short time. It is interesting to note that near viability is equivalent to an appropriate tangency condition under mild assumptions on the dynamics. Adding natural convexity and compactness assumptions on the dynamics, we may recover (exact) viability. Here we investigate near viability for a graph K in I×X with respect to y'(t)=Ay(t)+f(t,y(t)), where A and f are as above. We emphasize that the t-dependence of the perturbation f leads us to introduce a new tangency concept. On the basis of tangency conditions expressed in terms of that tangency concept, we formulate criteria for K to be near viable with respect to y'(t)=Ay(t)+f(t,y(t)). As an application, an abstract null-controllability theorem is given.

Keywords: abstract differential equation, graph, tangency condition, viability

Procedia PDF Downloads 126
2268 Discretization of Cuckoo Optimization Algorithm for Solving Quadratic Assignment Problems

Authors: Elham Kazemi

Abstract:

The Quadratic Assignment Problem (QAP) is one of the combinatorial optimization problems, researched in many companies, that concerns allocating a set of facilities to a set of locations. The issue of particular importance in this process is the cost of the allocation, and the aim of the problem is to minimize this group of costs. Since QAPs are NP-hard problems, they cannot be solved by exact solution methods. The Cuckoo Optimization Algorithm is a meta-heuristic method with a high capability of finding globally optimal points. It is an algorithm that was originally devised to search a continuous space. The Quadratic Assignment Problem, however, is solved in a discrete space; thus, the standard arithmetic operators of the Cuckoo Optimization Algorithm need to be redefined on the discrete space in order to apply the algorithm to this discrete search space. This paper presents a way of discretizing the Cuckoo Optimization Algorithm for solving the Quadratic Assignment Problem.
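
To fix ideas, the sketch below defines the QAP objective for a permutation (facility to location assignment) and a simple random-swap move, the kind of discrete operator a discretized search would need in place of continuous arithmetic. The flow and distance matrices are small hypothetical examples, and the search shown is plain local search, not the discrete Cuckoo Optimization Algorithm itself.

# Sketch: QAP cost of a permutation and a swap-based local search on hypothetical matrices.
import random

flow = [                      # flow[i][j]: flow between facilities i and j (assumed)
    [0, 3, 1, 2],
    [3, 0, 4, 1],
    [1, 4, 0, 5],
    [2, 1, 5, 0],
]
dist = [                      # dist[a][b]: distance between locations a and b (assumed)
    [0, 2, 3, 1],
    [2, 0, 1, 4],
    [3, 1, 0, 2],
    [1, 4, 2, 0],
]

def qap_cost(perm):
    """perm[i] = location assigned to facility i."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def swap_local_search(iters=2000, seed=0):
    random.seed(seed)
    n = len(flow)
    perm = list(range(n))
    random.shuffle(perm)
    best, best_cost = perm[:], qap_cost(perm)
    for _ in range(iters):
        i, j = random.sample(range(n), 2)
        perm[i], perm[j] = perm[j], perm[i]        # discrete "move": swap two assignments
        c = qap_cost(perm)
        if c < best_cost:
            best, best_cost = perm[:], c
        else:
            perm[i], perm[j] = perm[j], perm[i]    # revert a worsening move
    return best, best_cost

print(swap_local_search())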

Keywords: Quadratic Assignment Problem (QAP), Discrete Cuckoo Optimization Algorithm (DCOA), meta-heuristic algorithms, optimization algorithms

Procedia PDF Downloads 492
2267 Identification of Functional T Cell Receptors Reactive to Tumor Antigens from the T Cell Repertoire of Healthy Donors

Authors: Isaac Quiros-Fernandez, Angel Cid-Arregui

Abstract:

Tumor-reactive T cell receptors (TCRs) are the subject of intense investigation since they offer great potential in adoptive cell therapies against cancer. However, the identification of tumor-specific TCRs has proven challenging, for instance, due to the limited expansion capacity of tumor-infiltrating T cells (TILs) and the extremely low frequencies of tumor-reactive T cells in the repertoire of patients and healthy donors. We have developed an approach for rapid identification and characterization of neoepitope-reactive TCRs from the T cell repertoire of healthy donors. CD8 T cells isolated from multiple donors are subjected to a first sorting step after staining with HLA multimers carrying the peptide of interest. The isolated cells are expanded for two weeks, after which a second sorting is performed using the same peptide-HLA multimers. The cells isolated in this way are then processed for single-cell sequencing of their TCR alpha and beta chains. Newly identified TCRs are cloned into appropriate expression vectors for functional analysis on Jurkat, NK92, and primary CD8 T cells and on tumor cells expressing the appropriate antigen. We have identified TCRs that specifically bind HLA-A2 presenting epitopes of tumor antigens and are capable of inducing TCR-mediated cell activation and cytotoxicity in target cancer cell lines. This method allows the identification of tumor-reactive TCRs in about two to three weeks, starting from peripheral blood samples of readily available healthy donors.

Keywords: cancer, TCR, tumor antigens, immunotherapy

Procedia PDF Downloads 49
2266 Evolving Knowledge Extraction from Online Resources

Authors: Zhibo Xiao, Tharini Nayanika de Silva, Kezhi Mao

Abstract:

In this paper, we present an evolving knowledge extraction system named AKEOS (Automatic Knowledge Extraction from Online Sources). AKEOS consists of two modules: a one-time learning module and an evolving learning module. The one-time learning module takes in a user input query and automatically harvests knowledge from online unstructured resources in an unsupervised way. The output of the one-time learning is a structured vector representing the harvested knowledge. The evolving learning module automatically schedules and performs repeated one-time learning to extract the newest information and track the development of an event. In addition, the evolving learning module summarizes the knowledge learned at different time points to produce a final knowledge vector about the event. With the evolving learning, we are able to visualize the key information of an event, discover trends, and track the development of the event.

Keywords: evolving learning, knowledge extraction, knowledge graph, text mining

Procedia PDF Downloads 442
2265 Survival Chances and Costs after Heart Attacks: An Instrumental Variable Approach

Authors: Alice Sanwald, Thomas Schober

Abstract:

We analyze the mortality and follow-up costs of heart attack patients using administrative data from Austria (2002-2011). As treatment intensity in a hospital largely depends on whether it has a catheterization laboratory, we focus on the effects of patients' initial admission to these specialized hospitals. To account for the non-random selection of patients into hospitals, we exploit individuals' place of residence as a source of exogenous variation in an instrumental variable framework. We find that initial admission to specialized hospitals increases patients' survival chances substantially. The effect on 3-year mortality is -9.5 percentage points. A separation of the sample into subgroups shows the strongest effects in relative terms for patients below the age of 65. We do not find significant effects on long-term inpatient costs and find only marginal increases in outpatient costs.
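
The instrumental-variable logic (residence-based instrument for the endogenous admission decision) can be sketched as a two-stage least squares estimate. The simulated variables below are placeholders standing in for the Austrian administrative data, the true effect is set to -0.095 only to echo the magnitude discussed in the abstract, and this simplified version does not correct the second-stage standard errors for the generated regressor (a proper IV estimator would).

# Sketch of two-stage least squares: instrument Z (residence near a hospital with a
# catheterization lab) -> endogenous treatment D (admission to such a hospital)
# -> outcome Y (3-year mortality, linear-probability style). Data are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 5000
z = rng.integers(0, 2, n).astype(float)          # instrument: lives near specialized hospital
u = rng.normal(size=n)                           # unobserved severity (confounder)
d = (0.1 + 0.5 * z + 0.3 * u + rng.normal(size=n) > 0.5).astype(float)   # admission decision
y = 0.30 - 0.095 * d + 0.2 * u + rng.normal(scale=0.1, size=n)           # simulated mortality

# First stage: regress treatment on instrument, keep fitted values.
d_hat = sm.OLS(d, sm.add_constant(z)).fit().fittedvalues
# Second stage: regress outcome on fitted treatment.
second = sm.OLS(y, sm.add_constant(d_hat)).fit()
print("2SLS effect of specialized admission on mortality:", round(second.params[1], 3))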

Keywords: acute myocardial infarction, mortality, costs, instrumental variables, heart attack

Procedia PDF Downloads 414
2264 Loss Quantification Archaeological Sites in Watershed Due to the Use and Occupation of Land

Authors: Elissandro Voigt Beier, Cristiano Poleto

Abstract:

The main objective of this research is to assess the loss of material culture (archaeological fragments), through quantification, in rural areas economically exploited by mechanized seasonal and permanent cropping, in a hydrographic subsystem of the Camaquã River in the state of Rio Grande do Sul, Brazil. The study area consists of different micro-basins of varying size, ranging from 1,000 m² to 10,000 m², all with a large number of occurrences and outcrop locations of archaeological material and a high density of finds in an intensively farmed environment. The first stage of the research aimed to identify the dispersion of archaeological material through a field survey, plotting points with the Global Positioning System (GPS) within each river basin; a concise bibliography on the topic in the region was used to support the theoretical understanding of the former landscape and of the occupation preferences of ancient peoples, relating the settlements to the practices observed in the field. The mapping was followed by cartographic development for the region through the production of land-elevation cartographic products; these products were created to contribute to the understanding of the distribution of the materials, the definition and extent of the dispersed material, and, as a result of human activities, the reworking of in situ material by mechanization. It was also necessary to prepare density maps of the materials found, linking natural environments conducive to ancient occupation with the current human occupation. The third stage of the project consists of the systematic collection of archaeological material without alteration of or interference with the subsurface of the indigenous settlements; the material was then prepared and treated in the laboratory to remove excess soil, followed by cleaning according to the previously described methodology, measurement, and quantification. Approximately 15,000 archaeological fragments belonging to different periods of the ancient history of the region were identified, all collected outside of their environmental and historical context and considerably changed and modified. The material was identified and catalogued considering features such as object weight, size, and type of material (lithic, ceramic, bone, historical porcelain) and its association with ancient history, disregarding attributes such as the individual lithology of the object and its functionality. As preliminary results, we can point out the displacement of materials by heavy mechanization and the consequent soil disturbance processes, which generate transport of archaeological materials. Therefore, as a next step, an estimate of the potential losses will be sought through a mathematical model. It is expected that this process will lead to a reliable, high-accuracy model that can be applied to archaeological sites of lower density without incurring significant error.

Keywords: degradation of heritage, quantification in archaeology, watershed, use and occupation of land

Procedia PDF Downloads 255