Search results for: MDR ICF core sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3132

2202 Identification of Nonlinear Systems Using Radial Basis Function Neural Network

Authors: C. Pislaru, A. Shebani

Abstract:

This paper uses a radial basis function neural network (RBFNN) for the identification of nonlinear systems. Five nonlinear systems are used to examine the performance of the RBFNN in modelling nonlinear systems: a dual tank system, a single tank system, a DC motor system, and two academic models. A feed-forward architecture is considered in this work for modelling the nonlinear dynamic models. The K-means clustering algorithm is used to select the centers of the radial basis function network because it is reliable, offers fast convergence, and can handle large data sets. The least-mean-square method is used to adjust the weights of the output layer, and the Euclidean distance between centers is used to set the widths of the Gaussian functions.
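
The pipeline the abstract describes (K-means to place the RBF centers, a Gaussian hidden layer, and least-mean-square updates of the output weights) can be sketched in pure Python on a toy identification task; the data, the single-width heuristic, and all parameter values below are illustrative assumptions, not the authors' setup:

```python
import math
import random

def kmeans(data, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: pick k centres for the RBF hidden layer."""
    rng = random.Random(seed)
    centres = rng.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda i: abs(x - centres[i]))
            clusters[nearest].append(x)
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return centres

def rbf_features(x, centres, width):
    """Gaussian hidden-layer activations."""
    return [math.exp(-(x - c) ** 2 / (2 * width ** 2)) for c in centres]

def train_lms(xs, ys, centres, width, lr=0.1, epochs=200):
    """Least-mean-square updates of the linear output-layer weights."""
    w = [0.0] * len(centres)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            phi = rbf_features(x, centres, width)
            err = y - sum(wi * p for wi, p in zip(w, phi))
            w = [wi + lr * err * p for wi, p in zip(w, phi)]
    return w

# Toy identification task: learn y = x^2 on [0, 1] from samples.
xs = [i / 20 for i in range(21)]
ys = [x ** 2 for x in xs]
centres = kmeans(xs, k=5)
width = (max(centres) - min(centres)) / 2  # crude width heuristic from centre spread
w = train_lms(xs, ys, centres, width)
pred = sum(wi * p for wi, p in zip(w, rbf_features(0.5, centres, width)))
```

On this toy data the trained network recovers the underlying map closely; a real system-identification run would feed measured input-output pairs instead.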

Keywords: system identification, nonlinear systems, neural networks, radial basis function, K-means clustering algorithm

Procedia PDF Downloads 469
2201 Soliton Interaction in Multi-Core Optical Fiber: Application to WDM System

Authors: S. Arun Prakash, V. Malathi, M. S. Mani Rajan

Abstract:

The analytical bright two-soliton solution of the 3-coupled nonlinear Schrödinger equations with variable coefficients in a birefringent optical fiber is obtained by the Darboux transformation method. With a view to the design of ultra-fast optical devices, soliton interaction and control in birefringent fiber are investigated. A Lax pair is constructed for the N-coupled NLS system through the AKNS method. Using the two-soliton solution, we demonstrate different interaction behaviors of solitons in birefringent fiber depending on the choice of control parameters. Our results show that interactions of optical solitons have specific applications such as the construction of logic gates, optical computing, soliton switching, and soliton amplification in wavelength division multiplexing (WDM) systems.
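
For reference, a 3-coupled variable-coefficient NLS system of the kind studied here is commonly written in a form like the following; the abstract does not give the authors' exact coefficient functions, so the dispersion β(z), nonlinearity γ(z), and gain/loss g(z) below are generic placeholders:

```latex
i\,\frac{\partial q_j}{\partial z}
  + \frac{\beta(z)}{2}\,\frac{\partial^2 q_j}{\partial t^2}
  + \gamma(z)\left(\sum_{k=1}^{3}\lvert q_k\rvert^{2}\right) q_j
  = i\,g(z)\,q_j,
  \qquad j = 1,2,3.
```

The Darboux transformation then dresses a seed solution of this system using the associated Lax pair to generate the one- and two-soliton solutions.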

Keywords: optical soliton, soliton interaction, soliton switching, WDM

Procedia PDF Downloads 503
2200 Optimization of Three-Layer Corrugated Metal Gasket by Using Finite Element Method

Authors: I Made Gatot Karohika, Shigeyuki Haruyama, Ken Kaminishi

Abstract:

In this study, we propose a three-layer metal gasket with Al, Cu, and SUS304 as the layer materials. The finite element method was employed to develop the simulation solution and the design of experiments (DOE). The Taguchi method was used to analyze the effect of each design parameter and to predict the optimal design of a new 25A-size three-layer corrugated metal gasket. The L18 orthogonal array of the Taguchi method was applied to design the experiment matrix for eight factors with three levels. Based on the elastic and plastic modes, the optimum design is a gasket with a SUS304 core metal, an aluminum surface layer, p1 = 4.5 mm, p2 = 4.5 mm, p3 = 4 mm, Tg = 1.2 mm, R = 3.5 mm, h = 0.4 mm, and Ts = 0.3 mm.

Keywords: contact width, contact stress, layer, metal gasket, corrugated, simulation

Procedia PDF Downloads 314
2199 The Interaction and Relations Between Civil and Military Logistics

Authors: Cumhur Cansever, Selcuk Er

Abstract:

There is increasing cooperation and interaction between military logistic systems and civil organizations operating in today's market. While civilian logistics has a different scope and functions, military logistics tries to import applications that have been conducted successfully by the private sector. At this point, the determination of the optimal point of integration and interaction between civilian and military logistics has emerged as a key issue. In this study, the mutual effects between military and civilian logistics and their most common integration areas (Supply Chain Management (SCM), Integrated Logistics Support (ILS), and outsourcing) are examined with risk analysis and core competency evaluation methods to determine the optimum point of integration.

Keywords: core competency, integrated logistics support, outsourcing, supply chain management

Procedia PDF Downloads 525
2198 Seafloor and Sea Surface Modelling in the East Coast Region of North America

Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk

Abstract:

Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides high accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unmapped, as there are many gaps to be explored between ship survey tracks. Moreover, such measurements are very expensive and time-consuming. An alternative is the raster bathymetric models shared by the General Bathymetric Chart of the Oceans. The offered products are a compilation of different sets of data, raw or processed. Measurements of gravity anomalies also serve as indirect data for the development of bathymetric models. Some forms of seafloor relief (e.g., seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, sea surface height and marine gravity anomalies can be estimated, and from the anomalies it is possible to infer the structure of the seabed. The main goal of the work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America, a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms, model densification, and the creation of grid models. The input data are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis.
Visualization of the obtained results was carried out with Geographic Information System tools. The result is an extension of the state of knowledge of the quality and usefulness of the data used for seabed and sea surface modeling, and of the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.); its changes, together with knowledge of the topography of the ocean floor, inform us indirectly about the volume of the entire ocean. The true shape of the ocean surface is further varied by phenomena such as tides, differences in atmospheric pressure, wind systems, thermal expansion of water, and phases of ocean circulation. The results suggest that, at a given location, the greater the depth, the weaker the trend of sea level change. Studies show that combining data sets from different sources, with different accuracies, can affect the quality of sea surface and seafloor topography models.

Keywords: seafloor, sea surface height, bathymetry, satellite altimetry

Procedia PDF Downloads 78
2197 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models

Authors: Tony Mann

Abstract:

This paper sets out the case for involving and engaging employees, workers, stakeholders, and staff in any significant change being considered by the senior executives of an organization. It establishes the rationale, the approach, the methodology of engagement, and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people produces better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular, the Process Iceberg® Organizational Change model, that has proven instrumental in developing effective change. Its use is demonstrated in its various forms, explaining why so much change fails to address the key elements and how change can be managed more productively. 'Participation' in change is too often seen as negative, expensive, and unwieldy. The paper aims to show that another model, UIA = O + E, can offset the difficulties and, in fact, produce much more positive and effective change.

Keywords: facilitation, stakeholders, buy-in, digital workshops

Procedia PDF Downloads 107
2196 Trait of Sales Professionals

Authors: Yuichi Morita, Yoshiteru Nakamori

Abstract:

In the car dealer business in Japan, sales professionals are a key factor in a company's success. We hypothesize that if a corporation knows the traits of sales professionals in its business field, it will be easier for it to secure and nurture sales persons effectively. Lean human resources management will ensure business success and good performance, especially for small and medium corporations. The goal of the paper is to determine the traits of sales professionals for small- and medium-size car dealers, using the chi-square test and the variable precision rough set model. The results illustrate that experience of job change, learning ability, and product knowledge are important, whereas academic background, building a career through internal transfer, leadership experience, and self-development are not important for becoming a sales professional. We also show that sales professionals' traits are persistence, humility, improvisation, and passion for the business.

Keywords: traits of sales professionals, variable precision rough set theory, sales professionals

Procedia PDF Downloads 381
2195 Effectiveness Factor for Non-Catalytic Gas-Solid Pyrolysis Reaction for Biomass Pellet Under Power Law Kinetics

Authors: Haseen Siddiqui, Sanjay M. Mahajani

Abstract:

Various important reactions in the chemical and metallurgical industries fall into the category of gas-solid reactions, which can be categorized as catalytic and non-catalytic. In gas-solid reaction systems, heat and mass transfer limitations have an appreciable influence on the rate of reaction, and overlooking such effects while collecting reaction rate data can have serious consequences for reactor design. Pyrolysis, which involves the production of gases through the interaction of heat and a solid substance, falls in this category. Pyrolysis is also an important step in the gasification process; gasification reactivity is majorly influenced by the pyrolysis process, which produces the char fed to the gasification process. Therefore, in the present study, a non-isothermal transient 1-D model is developed for a single biomass pellet to investigate the effect of heat and mass transfer limitations on the rate of the pyrolysis reaction. The obtained set of partial differential equations is first discretized using the method of lines to obtain a set of ordinary differential equations with respect to time. These equations are then solved using the MATLAB ODE solver ode15s. The model is capable of incorporating structural changes, porosity variation, variation in various thermal properties, and various pellet shapes. The model is used to analyze the effectiveness factor for different values of the Lewis number and the heat of reaction (G factor). The Lewis number captures the effect of the thermal conductivity of the solid pellet: the higher the Lewis number, the higher the thermal conductivity of the solid. The effectiveness factor was found to decrease with decreasing Lewis number, because smaller Lewis numbers retard the rate of heat transfer inside the pellet, leading to a lower rate of pyrolysis reaction. The G factor captures the effect of the heat of reaction.
Since the pyrolysis reaction is endothermic in nature, the G factor takes negative values; the more negative the value, the more endothermic the pyrolysis reaction. The effectiveness factor was found to decrease with more negative values of the G factor. This behavior can be attributed to the fact that a more negative G factor results in more energy consumption by the reaction, owing to a larger temperature gradient inside the pellet. Further, analytical expressions are derived for the gas and solid concentrations and the effectiveness factor for two limiting cases of the general model: the homogeneous model and the unreacted shrinking core model.
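
The method-of-lines step can be illustrated on a much simpler stand-in problem: an isothermal first-order reaction-diffusion equation on a slab, discretised in space with the time integration done by explicit Euler. The authors solve the full non-isothermal model with MATLAB's ode15s; the geometry, parameters, and integrator below are simplifying assumptions:

```python
def pyrolyse_slab(n=21, D=1.0, k=1.0, dt=1e-4, steps=2000):
    """Method of lines for a stand-in problem u_t = D*u_xx - k*u on [0, 1]:
    space is discretised into n nodes, turning the PDE into n ODEs in time,
    stepped here with explicit Euler (stable since D*dt/dx^2 << 0.5).
    Boundary nodes are held at the surface concentration u = 1."""
    dx = 1.0 / (n - 1)
    u = [0.0] * n
    u[0] = u[-1] = 1.0
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, n - 1):
            diff = D * (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2
            nxt[i] = u[i] + dt * (diff - k * u[i])
        u = nxt
    return u

u = pyrolyse_slab()
# Crude effectiveness factor: volume-averaged rate over surface rate
# (the rate is first order, so rates are proportional to u).
eta = sum(u) / len(u)
```

Transport resistance keeps the interior concentration below the surface value, so the effectiveness factor comes out below one, mirroring the qualitative trends discussed in the abstract.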

Keywords: effectiveness factor, G-factor, homogeneous model, lewis number, non-catalytic, shrinking core model

Procedia PDF Downloads 136
2194 Mechanical Properties of Kenaf Reinforced Composite with Different Fiber Orientation

Authors: Y. C. Ching, K. H. Chong

Abstract:

The increase in environmental awareness has led to growing interest in the development of materials with eco-friendly attributes. In this study, a 3-ply sandwich of kenaf fiber reinforced unsaturated polyester with various fiber orientations was developed, and the effect of fiber orientation on the mechanical and thermal stability properties of the polyester was studied. Unsaturated polyester face sheets and a kenaf fiber core were fabricated with a combination of hand lay-up and cold compression methods. Parameters such as tensile, flexural, and impact strength, melting point, and crystallization point were measured and compared across fiber orientations. The failure mechanisms and property changes associated with the directional change of fiber in the polyester composite are discussed.

Keywords: kenaf fiber, polyester, tensile, thermal stability

Procedia PDF Downloads 356
2193 Ensemble-Based SVM Classification Approach for miRNA Prediction

Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam

Abstract:

In this paper, an ensemble-based Support Vector Machine (SVM) classification approach for miRNA prediction is proposed. Three problems commonly associated with previous approaches are alleviated: imposing assumptions on the secondary structure of pre-miRNA, the imbalance between the number of laboratory-validated miRNAs and pseudo-hairpins, and the use of training data sets that do not cover the variety of samples across species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo, and Mirident, weighted by their variant features, without any structural assumptions. An additional SVM layer is used to aggregate the final output. The proposed approach is trained and tested with balanced data sets, and its results outperform those of the three base classifiers. The following metric values are achieved: 88.88% F-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, 87.2% sensitivity, and an area under the ROC curve of 0.91.
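
The two-layer aggregation idea (base classifiers whose outputs feed a second combining layer) can be sketched with stand-in components: the three base classifiers below are simulated noisy predictors rather than Triplet-SVM, Virgo, and Mirident, and the combining SVM layer is replaced by a simple linear perceptron, so everything here is an illustrative assumption rather than the authors' pipeline:

```python
import random

def perceptron(features, labels, epochs=20, lr=0.1):
    """Train the second-layer combiner (a linear stand-in for the
    aggregating SVM) on the base classifiers' outputs."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            pred = 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1
            if pred != y:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

rng = random.Random(1)
truth = [rng.choice([-1, 1]) for _ in range(200)]

def noisy(y, flip_prob):
    """A simulated base classifier: the true label, flipped with some probability."""
    return -y if rng.random() < flip_prob else y

# Stand-ins for the three base classifiers (error rates are made up).
X = [[noisy(y, 0.1), noisy(y, 0.3), noisy(y, 0.2)] for y in truth]
w, b = perceptron(X, truth)
acc = sum(
    (1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1) == y
    for x, y in zip(X, truth)
) / len(truth)
```

The learned combiner weights unreliable base outputs down, which is the same motivation given for the extra SVM layer in the abstract.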

Keywords: MiRNAs, SVM classification, ensemble algorithm, assumption problem, imbalance data

Procedia PDF Downloads 347
2192 Preference Aggregation and Mechanism Design in the Smart Grid

Authors: Zaid Jamal Saeed Almahmoud

Abstract:

The Smart Grid is the vision of a future power system that combines advanced monitoring and communication technologies to provide energy in a smart, efficient, and user-friendly manner. This proposal considers a demand response model in the Smart Grid based on utility maximization. Given a set of consumers with conflicting preferences in terms of consumption and a utility company that aims to minimize the peak demand and match demand to supply, we study the problem of aggregating these preferences while modelling the problem as a game. We also investigate whether an equilibrium can be reached that maximizes the social benefit. Based on such an equilibrium, we propose a dynamic pricing heuristic that computes the equilibrium and sets the prices accordingly. The developed approach was analysed theoretically and evaluated experimentally using real appliance data. The results show that our proposed approach achieves a substantial reduction in overall energy consumption.
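
As a toy illustration of the peak-minimizing goal (not the authors' game-theoretic mechanism or pricing heuristic), a greedy rule that assigns each flexible load to the currently least-loaded time slot already flattens the aggregate demand profile; the loads and slot count below are made up:

```python
def schedule(flexible_loads, n_slots):
    """Greedy peak-shaving heuristic: assign each flexible load
    (largest first) to the currently least-loaded time slot."""
    totals = [0.0] * n_slots
    for load in sorted(flexible_loads, reverse=True):
        slot = min(range(n_slots), key=lambda s: totals[s])
        totals[slot] += load
    return totals

# Made-up appliance loads (kWh) spread over three pricing slots.
totals = schedule([3, 3, 2, 2, 1, 1], 3)
peak = max(totals)
```

A dynamic pricing scheme pursues the same flattening indirectly, by making the currently loaded slots more expensive so that rational consumers shift demand themselves.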

Keywords: heuristics, smart grid, aggregation, mechanism design, equilibrium

Procedia PDF Downloads 110
2191 Time of Death Determination in Medicolegal Death Investigations

Authors: Michelle Rippy

Abstract:

Medicolegal death investigation is historically a field that does not receive much research attention or advancement, as all of its subjects are deceased. Public health threats, drug epidemics, and contagious diseases are typically recognized in decedents first, so thorough and accurate death investigations can assist epidemiological research and prevention programs. One vital component of medicolegal death investigation is determining the decedent's time of death. An accurate time of death can assist in corroborating alibis, determining the sequence of death in multiple-casualty circumstances, and providing vital facts in civil cases. Popular television portrays an unrealistic forensic ability to give the time of death to the minute for someone found deceased with no witnesses present. In reality, the time of death of an unattended decedent can generally only be narrowed to a 4-6 hour window. In the mid- to late-20th century, taking liver temperatures was an invasive procedure used by death investigators to determine the decedent's core temperature, which was entered into an equation to determine an approximate time of death. Due to many inconsistencies in the placement of the thermometer and other variables, the accuracy of liver temperatures was dispelled, and this once commonplace practice lost scientific support. Currently, medicolegal death investigators rely on three major post-mortem changes at a death scene. Many factors enter the subjective determination of the time of death, including the cooling of the decedent, stiffness of the muscles, internal settling of blood, clothing, ambient temperature, disease, and recent exercise. Current research is utilizing non-invasive hospital-grade tympanic thermometers to measure the temperature in each of the decedent's ears. This tool can be used at the scene and, in conjunction with scene indicators, may provide a more accurate time of death.
The research is significant to investigations because it can bring accuracy to a historically inaccurate area, considerably improving criminal and civil death investigations. The goal of the research is to provide a scientific basis for timing unwitnessed deaths, instead of the art that the determination currently is. The research is in progress, with expected completion in December 2018. There are currently 15 completed case studies with vital information including the ambient temperature; the decedent's height, weight, sex, and age; layers of clothing; found position; whether medical intervention occurred; and whether the death was witnessed. These data will be analyzed across the variables studied and will be available for presentation in January 2019.

Keywords: algor mortis, forensic pathology, investigations, medicolegal, time of death, tympanic

Procedia PDF Downloads 118
2190 Construction Time - Cost Trade-Off Analysis Using Fuzzy Set Theory

Authors: V. S. S. Kumar, B. Vikram, G. C. S. Reddy

Abstract:

Time and cost are the two critical objectives of construction project management; they are not independent but intricately related. Trade-offs between project duration and cost are extensively discussed during project scheduling because of their practical relevance. Generally, when the project duration is compressed, the project calls for more labor and more productive equipment, which increases the cost. Thus, construction time-cost optimization is defined as a process to identify suitable construction activities for speeding up so as to attain the best possible savings in both time and cost. Because of the hidden trade-off relationship between project time and cost, it may be difficult to predict whether the total cost will increase or decrease as a result of compressing the schedule. Different combinations of duration and cost for the activities associated with the project determine the best set in the time-cost optimization. Therefore, contractors need to select the best combination of time and cost for performing each activity, which together determine the project duration and cost. In this paper, fuzzy set theory is used to model the uncertainties in the project environment for time-cost trade-off analysis.
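
One common way to carry such uncertainty is to model each activity's duration (or cost) as a triangular fuzzy number and propagate it with fuzzy arithmetic; the sketch below uses made-up activity values and centroid defuzzification to show the mechanics, not the specific membership functions used in the paper:

```python
class TriFuzzy:
    """Triangular fuzzy number (pessimistic, most likely, optimistic)."""
    def __init__(self, low, mode, high):
        self.low, self.mode, self.high = low, mode, high

    def __add__(self, other):
        # Fuzzy addition of triangular numbers is component-wise.
        return TriFuzzy(self.low + other.low,
                        self.mode + other.mode,
                        self.high + other.high)

    def defuzzify(self):
        # Centroid of a triangular membership function.
        return (self.low + self.mode + self.high) / 3

# Hypothetical two-activity chain: durations in days, normal vs crashed pace.
normal = TriFuzzy(10, 12, 15) + TriFuzzy(8, 9, 11)
crashed = TriFuzzy(7, 8, 10) + TriFuzzy(6, 7, 8)
days_saved = normal.defuzzify() - crashed.defuzzify()
```

A time-cost trade-off analysis would pair each crashed duration with a fuzzy crash cost and search for the combination with the best defuzzified total.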

Keywords: fuzzy sets, uncertainty, qualitative factors, decision making

Procedia PDF Downloads 649
2189 Geochemical Controls of Salinity in a Typical Acid Mine Drainage Neutralized Groundwater System

Authors: Modreck Gomo

Abstract:

Although dolomite and calcite carbonates can neutralize acid mine drainage (AMD) and prevent the leaching of metals, salinity remains a huge problem. This study presents a conceptual discussion of the geochemical controls of salinity in a typical calcite- and dolomite-neutralized AMD groundwater system, and then presents field evidence to support the conceptual discussion. 1020 field data sets from a groundwater system reported to be under circumneutral conditions, owing to the neutralization effect of calcite and dolomite, are analysed using correlation analysis and bivariate plots. The field evidence indicates that sulphate, calcium, and magnesium are strongly and positively correlated with Total Dissolved Solids (TDS), which is used as a measure of salinity. In this hydrogeochemical system, the dissolution of sulphate, calcium, and magnesium from the AMD neutralization process contributed 50%, 10%, and 5% of the salinity, respectively.
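
The correlation step can be reproduced with a plain Pearson coefficient; the paired sulphate/TDS values below are fabricated for illustration and are not the study's 1020 field records:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Fabricated paired measurements: sulphate (mg/L) vs TDS (mg/L).
sulphate = [120, 250, 400, 520, 700]
tds = [300, 610, 950, 1200, 1650]
r = pearson(sulphate, tds)
```

A value of r near 1 on a bivariate plot of sulphate against TDS is the kind of evidence the abstract cites for sulphate being a dominant control on salinity.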

Keywords: acid mine drainage, carbonates, neutralization, salinity

Procedia PDF Downloads 140
2188 The Columbine Shooting in German Media Coverage: A Point of No Return

Authors: Melanie Verhovnik

Abstract:

School shootings are a well-known phenomenon in Germany, where 14 have occurred to date. The first case happened half a year after the Columbine shooting of April 20, 1999 in the United States, which was at the time the most serious school shooting to have occurred anywhere in the world. The German media gave only scant attention to the subject of school shootings prior to Columbine, even though there were numerous instances throughout the world and several serious instances in the United States during the 1990s. A mixed-method design of qualitative and quantitative content analysis was employed to identify the main features and characteristics of the core German media's coverage of Columbine.

Keywords: Columbine, media coverage, qualitative, quantitative content analysis, school shooting

Procedia PDF Downloads 309
2187 An Introduction to Critical Chain Project Management Methodology

Authors: Ranjini Ramanath, Nanjunda P. Swamy

Abstract:

Construction has existed in our lives since time immemorial. However, unlike those in any other industry, construction projects have their own unique challenges: project type, purpose and end use of the project, geographical conditions, logistic arrangements, largely unorganized manpower, the requirement of diverse skill sets, etc. These unique characteristics bring their own level of risk and uncertainty to the project, which causes the project to deviate from its planned objectives of time, cost, quality, etc. Over the years, there have been significant developments in the way construction projects are conceptualized, planned, and managed. With the rapid increase in population and the rising rate of urbanization, there is a growing demand for infrastructure development, and projects must be delivered on time and efficiently. In an age where 'time is money', the implementation of new project management techniques is required to lead projects to success. This paper proposes a different approach to project management which, if applied to construction projects, can help accomplish the project objectives faster.

Keywords: critical chain project management methodology, critical chain, project management, construction management

Procedia PDF Downloads 421
2186 Hospital Evacuation: Best Practice Recommendations

Authors: Ronald Blough

Abstract:

Hospitals, clinics, and medical facilities are the core of the health services sector, providing 24/7 medical care to those in need. Any disruption of these important medical services highlights the vulnerabilities in the medical system. An internal or external event can create a catastrophic incident that paralyzes medical services, causing the facility to shift into emergency operations, possibly including evacuation. The hospital administrator and government officials must decide in a very short amount of time whether to shelter in place or evacuate. This presentation identifies best-practice recommendations regarding the hospital evacuation decision and response, analyzing previous hospital evacuations, to encourage hospitals in the region to review or develop their own emergency evacuation plans.

Keywords: disaster preparedness, hospital evacuation, shelter-in-place, incident containment, health services vulnerability, hospital resources

Procedia PDF Downloads 366
2185 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure the characteristics of water droplets during dropwise condensation on pillared surfaces. The main difficulty in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies a corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometric and intensity properties. Information on droplet evolution over time, including mean radius and drop count per unit area, is then extracted from the binary images. The developed image processing algorithm is verified against manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.
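
Downstream of segmentation, extracting the drop count amounts to labelling connected bright regions in the binary image. A minimal threshold-plus-flood-fill sketch on a made-up grayscale grid (not the watershed-based method itself) looks like this:

```python
def count_droplets(img, threshold):
    """Binarise a grayscale grid and count connected bright regions
    (droplets) using 4-connected flood fill."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]

    def flood(r, c):
        stack = [(r, c)]
        while stack:
            i, j = stack.pop()
            if 0 <= i < h and 0 <= j < w and not seen[i][j] and img[i][j] >= threshold:
                seen[i][j] = True
                stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]

    n = 0
    for r in range(h):
        for c in range(w):
            if img[r][c] >= threshold and not seen[r][c]:
                n += 1
                flood(r, c)
    return n

# Tiny synthetic grayscale image with three bright blobs.
img = [
    [0, 0, 200, 200, 0, 0],
    [0, 0, 200, 0, 0, 180],
    [0, 0, 0, 0, 0, 180],
    [150, 0, 0, 0, 0, 0],
]
```

In the actual problem, simple thresholding alone cannot distinguish droplets from pillars, which is why the authors segment by size group and combine geometric and intensity properties.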

Keywords: dropwise condensation, textured surface, image processing, watershed

Procedia PDF Downloads 221
2184 Implication of E-Robot Kit in Kuwait’s Robotics Technology Learning and Innovation

Authors: Murtaza Hassan Sheikh, Ahmed A. A. AlSaleh, Naser H. N. Jasem

Abstract:

Kuwait has not yet made its mark in the world of technology and research; therefore, efforts have been made to fill this gap. Since robotics covers a wide variety of fields and fosters innovation, efforts have been made to promote its education. Despite these efforts, robotics education in Kuwait is still on hold. This paper discusses the issues and obstacles in the implementation of robotics education in Kuwait and how the robotics kit 'E-Robot' is making an impact on Kuwait's future education and innovation. Problems such as an emphasis on robotics competitions rather than education, the complexity of robot programming, and the lack of an organized open-source platform are being addressed by the introduction of the E-Robot kit in Kuwait. Owing to its success since 2012, a total of 15 schools have adopted the kit as a core subject, with 200 teaching it as an extracurricular activity.

Keywords: robotics education, Kuwait's education, e-robot kit, research and development, innovation and creativity

Procedia PDF Downloads 414
2183 Analysis of Risk-Based Disaster Planning in Local Communities

Authors: R. A. Temah, L. A. Nkengla-Asi

Abstract:

Planning for future disasters sets the stage for a variety of activities that may trigger multiple recurring operations and expose the community to opportunities to minimize risks. Local communities increasingly embrace the necessity of planning based on local risks, but are also significantly challenged to plan for and respond to disasters effectively. This research examines a basic risk-based disaster planning model and compares it with advanced risk-based planning, which introduces the identification and alignment of a variety of local capabilities, within and outside the local community, that can be pivotal in managing local risks and cascading effects prior to a disaster. A critical review shows that the identification and alignment of capabilities can potentially enhance risk-based disaster planning. A tailored holistic approach to risk-based disaster planning is pivotal to enhancing collective action and reducing the collective cost of disasters.

Keywords: capabilities, disaster planning, hazards, local community, risk-based

Procedia PDF Downloads 204
2182 Controlling the Release of Cyt C and L- Dopa from pNIPAM-AAc Nanogel Based Systems

Authors: Sulalit Bandyopadhyay, Muhammad Awais Ashfaq Alvi, Anuvansh Sharma, Wilhelm R. Glomm

Abstract:

The release of drugs from nanogels and nanogel-based systems can occur under the influence of external stimuli such as temperature, pH, and magnetic fields. pNIPAm-AAc nanogels respond to the combined action of both temperature and pH, the former being mostly determined by hydrophilic-to-hydrophobic transitions above the volume phase transition temperature (VPTT), while the latter is controlled by the degree of protonation of the carboxylic acid groups. These nanogel-based systems are promising candidates in the field of drug delivery. Combining nanogels with magneto-plasmonic nanoparticles (NPs) introduces imaging and targeting modalities along with stimuli-response in one hybrid system, thereby incorporating multifunctionality. Fe@Au core-shell NPs possess an optical signature in the visible spectrum owing to the localized surface plasmon resonance (LSPR) of the Au shell, and superparamagnetic properties stemming from the Fe core. Although several synthesis methods exist to control the size and physico-chemical properties of pNIPAm-AAc nanogels, there is no comprehensive study of the effect of incorporating one or more layers of NPs into these nanogels. In addition, effective determination of the VPTT of the nanogels is a challenge, which complicates their use in biological applications. Here, we have modified the swelling-collapse properties of pNIPAm-AAc nanogels by combining them with Fe@Au NPs using different solution-based methods. The hydrophilic-hydrophobic transition of the nanogels above the VPTT has been confirmed to be reversible. Further, an analytical method has been developed to deduce the average VPTT, which is found to be 37.3°C for the nanogels and 39.3°C for the nanogel-coated Fe@Au NPs. An opposite swelling-collapse behaviour is observed for the latter, where the Fe@Au NPs act as bridge molecules pulling the gelling units together.
Thereafter, Cyt C, a model protein drug, and L-Dopa, a drug used in the clinical treatment of Parkinson's disease, were loaded separately into the nanogels and the nanogel-coated Fe@Au NPs using a modified breathing-in mechanism. This gave high loading and encapsulation efficiencies (L-Dopa: ~9% and 70 µg/mg of nanogels; Cyt C: ~30% and 10 µg/mg of nanogels). The release kinetics of L-Dopa, monitored using UV-vis spectrophotometry, was observed to be rather slow (over several hours), with the highest release occurring under a combination of high temperature (above the VPTT) and acidic conditions. However, the release of L-Dopa from the nanogel-coated Fe@Au NPs was the fastest, accounting for release of almost 87% of the initially loaded drug in ~30 hours. The chemical structure of the drug, the drug incorporation method, the location of the drug, and the presence of Fe@Au NPs largely alter the drug release mechanism and kinetics of these nanogels and nanogel-coated Fe@Au NPs.

Keywords: controlled release, nanogels, volume phase transition temperature, l-dopa

Procedia PDF Downloads 330
2181 Comparative Analysis of Strategies: Samsung vs. Xiaomi

Authors: Jae-Soo Do, Kyoung-Seok Kim

Abstract:

The crisis theory concerning Samsung Electronics has become a hot topic. Due to its performance deterioration, Samsung Electronics' shares have lost their driving power. Considering public opinion about the bad rumors circulating within the company, it is quite probable that the company is currently facing a crisis. Which company, then, has challenged the stronghold of Samsung Electronics? At the core of the crisis is Xiaomi, which snatched first place in market share from Samsung Electronics in the Chinese market. Established by eight co-founders in June 2010, Xiaomi has shown miraculous growth as a smart device manufacturer, taking first place in the Chinese market and coming in fifth worldwide just four years after its establishment. How did Xiaomi achieve enough growth to overtake Samsung so quickly? To answer this, we have conducted a comparative analysis of the competitive strategies of Samsung and Xiaomi.

Keywords: Samsung, Xiaomi, industrial attractiveness, VIRO

Procedia PDF Downloads 395
2180 Incorporating Information Gain in Regular Expressions Based Classifiers

Authors: Rosa L. Figueroa, Christopher A. Flores, Qing Zeng-Treitler

Abstract:

A regular expression (regex) consists of a sequence of characters that describes a text pattern. In clinical research, regular expressions are usually created manually by programmers together with domain experts. Lately, there have been several efforts to investigate how to generate them automatically. This article presents a text classification algorithm based on regexes. The algorithm, named REX, was designed and implemented as a simplified method to automatically create regexes for classifying Spanish text. In order to classify ambiguous cases, such as when multiple labels are assigned to a testing example, REX includes an information gain method. Two sets of data were used to evaluate the algorithm’s effectiveness in clinical text classification tasks. The results indicate that the regular-expression-based classifier proposed in this work performs statistically better, in terms of accuracy and F-measure, than Support Vector Machine and Naïve Bayes on both datasets.
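The abstract does not give REX's exact formulation, but standard information gain, measured over a regex's match/non-match outcome against the class labels, can be sketched as follows (the example documents and labels are hypothetical):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, feature_values):
    """Information gain of a feature (e.g. 'regex matched') w.r.t. the labels:
    IG = H(labels) - H(labels | feature)."""
    n = len(labels)
    conditional = 0.0
    for v in set(feature_values):
        subset = [l for l, f in zip(labels, feature_values) if f == v]
        conditional += len(subset) / n * entropy(subset)
    return entropy(labels) - conditional

# Hypothetical case: a candidate regex matches exactly the positive documents,
# so it carries the full 1 bit of information about this binary labeling.
labels  = ["pos", "pos", "neg", "neg"]
matched = [True,  True,  False, False]
print(information_gain(labels, matched))  # 1.0
```

A regex whose matches are uninformative about the labels scores an information gain near zero, so ranking candidate regexes (or candidate labels for an ambiguous example) by this quantity is one plausible way a method like REX could break ties.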

Keywords: information gain, regular expressions, smith-waterman algorithm, text classification

Procedia PDF Downloads 319
2179 Study the Effect of Rubbery Phase on Morphology Development of PP/PA6/(EPDM:EPDM-g-MA) Ternary Blends

Authors: B. Afsari, M. Hassanpour, M. Shabani

Abstract:

This study aimed to investigate the phase morphology of ternary blends comprising PP, PA6, and a blend of EPDM and EPDM-g-MA in a 70/15/15 ratio. Varying ratios of EPDM to EPDM-g-MA were examined. As the proportion of EPDM-g-MA increased, an interlayer phase formed between the dispersed PA6 domains and the PP matrix. This resulted in the development of a core-shell encapsulation morphology within the blends. The concentration of the EPDM-g-MA component is inversely correlated with the average size of PA6 particles. Additionally, blends containing higher proportions of the EPDM-g-MA rubbery phase exhibited an aggregated structure of the modifier particles. Notably, as the concentration of EPDM-g-MA increased from 0% to 15% in the blend, there was a consistent monotonic reduction in the size of PA6 particles.

Keywords: phase morphology, rubbery phase, rubber functionality, ternary blends

Procedia PDF Downloads 86
2178 A Minimum Spanning Tree-Based Method for Initializing the K-Means Clustering Algorithm

Authors: J. Yang, Y. Ma, X. Zhang, S. Li, Y. Zhang

Abstract:

The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the algorithm often converges to local minima because it is sensitive to the initial cluster centers. In this paper, an algorithm for selecting initial cluster centers on the basis of a minimum spanning tree (MST) is presented. The sets of vertices in the MST with the same degree are each regarded as a whole, which is used to find the skeleton data points. Furthermore, a distance measure between the skeleton data points that takes both degree and Euclidean distance into consideration is presented. Finally, the MST-based initialization method for the k-means algorithm is presented, and its time complexity is analyzed. The presented algorithm is tested on five data sets from the UCI Machine Learning Repository. The experimental results illustrate the effectiveness of the presented algorithm compared to three existing initialization methods.
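The abstract does not detail the degree-based skeleton procedure, but the general family of MST-based seeding can be illustrated with a simplified variant: build the MST, cut the k-1 longest edges, and seed each resulting component with its mean. This is a sketch of the idea, not the authors' exact algorithm:

```python
import math

def prim_mst(points):
    """Prim's algorithm on the complete graph; returns MST edges (dist, i, j)."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    in_tree, edges = {0}, []
    best = {i: (dist(0, i), 0) for i in range(1, n)}  # cheapest link into tree
    while len(in_tree) < n:
        j = min(best, key=lambda i: best[i][0])
        d, src = best.pop(j)
        edges.append((d, src, j))
        in_tree.add(j)
        for i in best:  # relax remaining vertices against the new tree vertex
            if dist(j, i) < best[i][0]:
                best[i] = (dist(j, i), j)
    return edges

def mst_initial_centers(points, k):
    """Cut the k-1 longest MST edges; seed each component with its mean."""
    kept = sorted(prim_mst(points))[:len(points) - k]  # keep the short edges
    comp = {i: {i} for i in range(len(points))}
    for _, i, j in kept:                                # naive component merge
        merged = comp[i] | comp[j]
        for m in merged:
            comp[m] = merged
    components = {frozenset(s) for s in comp.values()}
    return [tuple(sum(c) / len(c) for c in zip(*(points[i] for i in s)))
            for s in components]

# Two well-separated groups: the single long MST edge between them is cut,
# so the seeds land near the true cluster centers.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(sorted(mst_initial_centers(pts, 2)))
```

Seeding k-means this way avoids the classic failure mode of two random seeds landing in the same dense region; the paper's contribution layers a degree-aware skeleton and distance measure on top of this basic MST machinery.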

Keywords: degree, initial cluster center, k-means, minimum spanning tree

Procedia PDF Downloads 409
2177 Images of Spiritism in Brazilian Catholic Discourse (1889-1937)

Authors: Frantisek Kalenda

Abstract:

With the ultimate triumph of the republican movement in Brazil in 1889 and the adoption of a constitution promoting religious freedom, the formerly dominant Roman Catholic Church entered a long period of struggle to recover its lost position, fighting both the liberal and secular character of the new regime and rising competition on the “market of faith”. Spiritism in its originally Brazilian form proved to be one of its key adversaries during the First (1889-1930) and Second Republic (1930-1937), provoking a significant attempt within the official Church to discredit and destroy the movement. This paper explores this effort through the Catholic portrayal of Spiritism in its official media, focusing on the creation of stereotypes and on both the theological and “scientific” arguments used against it. Its core is based on the analysis of primary sources, mainly the influential A Ordem and Mensageiro da Fé.

Keywords: Catholic Church, media, other, spiritism, stereotype

Procedia PDF Downloads 272
2176 Metal-Organic Frameworks-Based Materials for Volatile Organic Compounds Sensing Applications: Strategies to Improve Sensing Performances

Authors: Claudio Clemente, Valentina Gargiulo, Alessio Occhicone, Giovanni Piero Pepe, Giovanni Ausanio, Michela Alfè

Abstract:

Volatile organic compound (VOC) emissions represent a serious risk to human health and the integrity of ecosystems, especially at high concentrations. For this reason, it is very important to continuously monitor environmental quality and to develop fast and reliable portable sensors that allow on-site analysis. Chemiresistors have become promising candidates for VOC sensing owing to their ease of fabrication, the variety of suitable sensitive materials, and the simplicity of their sensing data. A chemoresistive gas sensor is a transducer that measures the concentration of an analyte in the gas phase: its change in resistance is proportional to the amount of analyte present. The selection of the sensitive material, which interacts with the target analyte, is very important for sensor performance. The most widely used VOC detection materials are metal oxides (MOx), owing to their rapid recovery, high sensitivity to various gas molecules, and easy fabrication. Their sensing performance can still be improved in terms of operating temperature, selectivity, and detection limit. Metal-organic frameworks (MOFs) have attracted much attention in the field of gas sensing due to their high porosity, high surface area, tunable morphologies, and structural variety. MOFs are generated by the self-assembly of multidentate organic ligands connecting adjacent multivalent metal nodes via strong coordination interactions, producing stable and highly ordered crystalline porous materials with well-designed structures. However, most MOFs intrinsically exhibit low electrical conductivity. To improve this property, MOFs can be combined with organic and inorganic materials in a hybrid fashion to produce composites, or can be transformed into more stable structures. MOFs can indeed be employed as precursors of metal oxides with well-designed architectures via calcination.
The MOF-derived MOx partially preserve the original structure, with a high surface area and intrinsically open pores that act as trapping centers for gas molecules, and show higher electrical conductivity. Core-shell heterostructures, in which the surface of a metal oxide core is completely coated by a MOF shell, forming a junction at the core-shell heterointerface, can also be synthesized. Nanocomposites in which MOF structures are intercalated with graphene-related materials can also be produced; their conductivity increases thanks to the high electron mobility of the carbon materials. As MOF structures, zinc-based MOFs belonging to the ZIF family were selected in this work. Several Zn-based materials based on and/or derived from MOFs were produced, structurally characterized, and arranged in a chemiresistive architecture, also exploring the potential of different sensing-layer deposition approaches based on PLD (pulsed laser deposition) and, in the case of thermally labile materials, MAPLE (matrix-assisted pulsed laser evaporation) to enhance adhesion to the support. The sensors were tested in a humidity-controlled chamber that allows the concentration of ethanol, a typical analyte chosen among the VOCs for a first survey, to be varied. The effect of heating the chemiresistor to improve sensing performance was also explored. Future research will focus on exploring new manufacturing processes for MOF-based gas sensors with the aim of improving sensitivity and selectivity and reducing operating temperatures.
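The chemoresistive transduction principle described above reduces to a relative resistance change between baseline and analyte exposure. A minimal sketch with illustrative (hypothetical) resistance values, not measurements from this work:

```python
def sensor_response(r_baseline, r_analyte):
    """Relative resistance change of a chemiresistor on analyte exposure (%)."""
    return 100.0 * abs(r_analyte - r_baseline) / r_baseline

# Hypothetical readings for an n-type MOx film exposed to ethanol vapour:
# the resistance drops as the reducing gas injects electrons into the film.
r_air, r_ethanol = 120e3, 80e3   # ohms (illustrative values only)
print(round(sensor_response(r_air, r_ethanol), 1))  # 33.3 (% response)
```

Plotting this response against known analyte concentrations yields the calibration curve from which an unknown concentration is read back, which is why the proportionality between resistance change and analyte amount is central to the sensor design.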

Keywords: chemiresistors, gas sensors, graphene related materials, laser deposition, MAPLE, metal-organic frameworks, metal oxides, nanocomposites, sensing performance, transduction mechanism, volatile organic compounds

Procedia PDF Downloads 60
2175 DNA Multiplier: A Design Architecture of a Multiplier Circuit Using DNA Molecules

Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Nitish Biswas, Sarreha Tasmin Rikta, Nuzmul Hossain Nahid

Abstract:

Nanomedicine and bioengineering use biological systems that can perform computing operations. In a biocomputational circuit, different types of biomolecules and DNA (deoxyribonucleic acid) are used as active components. DNA computing offers parallel processing and a large storage capacity, which distinguishes it from other computing systems. In most processors, the multiplier is treated as a core hardware block, and multiplication is one of the most time-consuming tasks. In this paper, cost-effective DNA multipliers are designed using algorithms of molecular DNA operations and compared with conventional designs. The speed and storage capacity of a DNA multiplier are also much higher than those of a traditional silicon-based multiplier.

Keywords: biological systems, DNA multiplier, large storage, parallel processing

Procedia PDF Downloads 212
2174 Interoperable Platform for Internet of Things at Home Applications

Authors: Fabiano Amorim Vaz, Camila Gonzaga de Araujo

Abstract:

With the growing number of personal devices such as smartphones, tablets, and smart watches, in addition to recent devices designed for the IoT, the residential environment has the potential to generate important information about our daily lives. This work is therefore focused on presenting and evaluating a system that integrates all these technologies in the context of a smart house. To achieve this, we define an architecture capable of supporting the volume of data generated and consumed in a residence and, most importantly, the variety this data presents. We organize it in a dedicated cloud containing information about robots, recreational vehicles, and weather, in addition to data from the house itself, such as lighting, energy, and security. The proposed architecture can be extrapolated to various scenarios and applications. Building on the core of this work, we can define new functionality for residences by integrating them with additional resources.

Keywords: cloud computing, IoT, robotics, smart house

Procedia PDF Downloads 381
2173 4D Monitoring of Subsurface Conditions in Concrete Infrastructure Prior to Failure Using Ground Penetrating Radar

Authors: Lee Tasker, Ali Karrech, Jeffrey Shragge, Matthew Josh

Abstract:

Monitoring for the deterioration of concrete infrastructure is an important assessment tool for engineers, yet such monitoring can be difficult: if a failure crack, or fluid seepage through such a crack, is observed at the surface, the source location of the deterioration is often unknown. Geophysical methods are used to assist engineers in assessing the subsurface condition of materials. Techniques such as Ground Penetrating Radar (GPR) provide information on the location of buried infrastructure such as pipes and conduits, the positions of reinforcements within concrete blocks, and regions of voids/cavities behind tunnel lining. This experiment underlines the application of GPR as an infrastructure-monitoring tool to highlight and monitor regions of possible deterioration within a concrete test wall due to an increase in the generation of fractures; in particular, during a period of applied load up to and including structural failure. A three-point load was applied to a concrete test wall of dimensions 1700 x 600 x 300 mm³ in increments of 10 kN until the wall structurally failed at 107.6 kN. At each increment, the load was held constant and the wall was scanned using GPR along profile lines across the wall surface. The measured radar amplitude responses of the GPR profiles at each applied load interval were reconstructed into depth-slice grids and presented at fixed depth-slice intervals. Corresponding depth-slices were subtracted between data sets to compare the radar amplitude responses and monitor for changes. At lower values of applied load (0-60 kN), few changes were observed in the differences of radar amplitude response between data sets.
At higher values of applied load (i.e., 100 kN), closer to structural failure, larger differences in radar amplitude response between data sets were highlighted in the GPR data, with up to a 300% increase in radar amplitude response at some locations between the 0 kN and 100 kN datasets. Distinct regions were observed in the 100 kN difference dataset (i.e., 100 kN minus 0 kN) close to the location of the final failure crack. The key regions observed were a conical feature located between approximately 3.0-12.0 cm depth from the surface and a vertical linear feature located approximately 12.1-21.0 cm depth from the surface. These regions have been interpreted as locations exhibiting an increased change in pore space due to increased mechanical loading, an increased volume of micro-cracks, or the development of a larger macro-crack. The experiment showed that GPR is a useful geophysical monitoring tool for highlighting and monitoring regions of large change in radar amplitude response that may be associated with significant internal structural change (e.g., crack development). GPR is a non-destructive technique that is fast to deploy in a production setting. GPR can assist with reducing risk and costs in future infrastructure maintenance programs by highlighting and monitoring locations within the structure that exhibit large changes in radar amplitude over calendar time.
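The depth-slice differencing described above amounts to a per-cell percent change in amplitude between a reference (0 kN) slice and a loaded slice. A minimal sketch with hypothetical amplitude grids; the 300% threshold mirrors the magnitude of the changes reported, not a value prescribed by the study:

```python
def amplitude_change(slice_ref, slice_load):
    """Per-cell percent change in radar amplitude between two corresponding
    depth-slice grids, relative to the reference (0 kN) slice."""
    return [[100.0 * (a_load - a_ref) / a_ref
             for a_ref, a_load in zip(row_ref, row_load)]
            for row_ref, row_load in zip(slice_ref, slice_load)]

def flag_anomalies(change_grid, threshold_pct=300.0):
    """Grid cells whose amplitude grew by at least `threshold_pct` percent;
    such cells may indicate internal change, e.g. crack development."""
    return [(i, j) for i, row in enumerate(change_grid)
            for j, change in enumerate(row) if change >= threshold_pct]

# Hypothetical 0 kN and 100 kN depth-slice amplitude grids (arbitrary units);
# one cell shows a >300% increase, analogous to the reported hotspots.
ref  = [[10.0, 10.0], [10.0, 10.0]]
load = [[11.0, 45.0], [10.0, 12.0]]
print(flag_anomalies(amplitude_change(ref, load)))  # [(0, 1)]
```

Repeating this subtraction for every load increment against the 0 kN baseline yields the 4D (three spatial dimensions plus load/time) picture of where amplitude anomalies grow as failure approaches.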

Keywords: 4D GPR, engineering geophysics, ground penetrating radar, infrastructure monitoring

Procedia PDF Downloads 178