Search results for: Principle Component Analysis

29098 Development of Colorimetric Based Microfluidic Platform for Quantification of Fluid Contaminants

Authors: Sangeeta Palekar, Mahima Rana, Jayu Kalambe

Abstract:

In this paper, a microfluidic-based platform for the quantification of contaminants in water is proposed. The proposed system uses microfluidic channels with an embedded sensing environment for the detection of contaminants in water. Microfluidics-based platforms represent a clear avenue of innovation for fluid analysis, with diverse applications offering low cost and simplicity of fabrication. A polydimethylsiloxane (PDMS)-based microfluidic channel is fabricated using a soft lithography technique. Vertical and horizontal connections for fluid dispensing into the microfluidic channel are explored. The principle of colorimetry, which incorporates the use of Griess reagent for the detection of nitrite, has been adopted. Nitrite has high water solubility and retention, so it tends to persist in groundwater and endangers aquatic life as well as human health; it is therefore taken as the case study in this work. The developed platform also compares two detection methodologies: photodetectors that measure absorbance and image sensors that measure color change for the quantification of contaminants such as nitrite in water. The use of image processing techniques offers operational flexibility, as the same system can identify other contaminants present in water with only minor software changes.
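As a rough illustration of the colorimetric quantification step, the sketch below converts Griess-reaction absorbance readings into a nitrite concentration using a linear Beer-Lambert calibration; the calibration standards, function names, and sample values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the authors' code): converting absorbance readings
# from the Griess reaction into nitrite concentration via a linear
# Beer-Lambert calibration fitted to known standards.

def fit_calibration(standard_conc_mg_l, standard_absorbance):
    """Least-squares fit of absorbance = slope * concentration + intercept."""
    slope, intercept = np.polyfit(standard_conc_mg_l, standard_absorbance, 1)
    return slope, intercept

def absorbance_to_concentration(absorbance, slope, intercept):
    """Invert the calibration to estimate nitrite concentration (mg/L)."""
    return (np.asarray(absorbance) - intercept) / slope

# Hypothetical calibration standards and a sample reading
standards_mg_l = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
standards_abs = np.array([0.01, 0.12, 0.23, 0.45, 0.89])
slope, intercept = fit_calibration(standards_mg_l, standards_abs)
print(absorbance_to_concentration(0.30, slope, intercept))  # ~1.3 mg/L
```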

Keywords: colorimetric, fluid contaminants, nitrite detection, microfluidics

Procedia PDF Downloads 199
29097 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice

Authors: Diana Reckien

Abstract:

Vulnerability assessments are increasingly used to support policy-making in complex environments, such as urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of these indices across an area of interest. Vulnerability studies have several advantages: they are good communication tools, they can inform a wider public debate about environmental issues, and they can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also face a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies, and no single framework or methodology has proven to serve best in particular environments; indicators vary greatly with the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting drivers of vulnerability in space. There is therefore an urgent need to develop the methodology of vulnerability studies further towards a common framework, which is one aim of this paper. We introduce a social vulnerability approach that is relatively well developed, compared with bio-physical or sectoral vulnerability approaches, in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area. The second approach includes variable reduction, mostly Principal Component Analysis (PCA), which reduces a set of interrelated variables to a smaller number of less correlated components that are likewise added to form a composite index. We test these two approaches to constructing indices for the area of New York City, as well as two different metrics of input variables, and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise yields particularly different results in the outer regions and parts of the boroughs, such as outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that current adaptation policy and practice in NY may need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e., the densely populated areas of Manhattan, central Brooklyn, central Queens, and the southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage in case of hazards in those areas. This is conceivable, for example, during large heat waves, which would affect the inner and poorer parts of the city more than the outer urban areas. In light of the recent planning practice of NY, one needs to question and discuss who in NY makes adaptation policy for whom; the presented analysis points towards an underrepresentation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in current adaptation practice in New York City.
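To make the two index-construction approaches concrete, here is a minimal sketch that builds both an additive index and a PCA-based composite index from hypothetical standardized tract-level indicators; the variable names, weights, and number of retained components are assumptions for illustration, not the study's actual inputs.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical census-tract indicators (not the study's data)
df = pd.DataFrame({
    "pct_poverty":    np.random.rand(100),
    "pct_elderly":    np.random.rand(100),
    "pct_minority":   np.random.rand(100),
    "pct_no_vehicle": np.random.rand(100),
})
z = StandardScaler().fit_transform(df)

# 1) Additive approach: weight each standardized variable and sum.
weights = np.array([0.4, 0.2, 0.2, 0.2])   # assumed expert-assigned weights
additive_index = z @ weights

# 2) PCA approach: reduce the correlated variables to components, then sum
#    the retained component scores into a composite index.
pca = PCA(n_components=2)
scores = pca.fit_transform(z)
pca_index = scores.sum(axis=1)

# How strongly the two composite indices agree across units
print(np.corrcoef(additive_index, pca_index)[0, 1])
```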

Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity

Procedia PDF Downloads 395
29096 The Suffering Other and the Deserving Self; When Humanitarianism Intersects with Individualism and Neo-Liberalism

Authors: Irene Bruna Seu

Abstract:

This paper draws on a three-year research project investigating everyday moral reasoning in relation to donations and prosocial behaviour in the humanitarian context. The analysis focuses on the principle of deservingness, by which members of the public decide whom to help and under which conditions, and illustrates how speakers engage in ideological dilemmas. The paper focuses on the theme ‘Something for nothing’ to examine how the position of ‘deserving’ and the speaker’s rights and duties in relation to victims of humanitarian crises are negotiated. Discursive analyses of this dilemmatic storyline of deservingness illuminate the cultural and ideological resources buttressing this construction. They also illustrate how humanitarianism intersects and clashes with other ideologies and value systems. The presentation focuses on the role of individualism underpinned by neo-liberal ideology. The data suggest that neo-liberal ideology, which endorses self-gratification and materialistic and individualistic ethics, plays an important role in decisions regarding humanitarian helping. The paper argues for the need for psychological research to engage more actively with the dilemmatic nature of moral reasoning in the humanitarian context and to contextualize decisions about giving and helping within the socio-cultural and ideological landscape in which helpers operate.

Keywords: humanitarianism, individualism, ideological dilemmas, discourse, neo-liberalism, prosocial behaviour

Procedia PDF Downloads 213
29095 Corpus-Based Analysis on the Translatability of Conceptual Vagueness in Traditional Chinese Medicine Classics Huang Di Nei Jing

Authors: Yan Yue

Abstract:

Huang Di Nei Jing (HDNJ) is one of the most significant traditional Chinese medicine (TCM) classics, laying the foundation of TCM theory and practice. It is also an important work for studying the ancient civilization and medical history of China. The language of HDNJ is highly concise and vague, and notably challenging to translate. This paper investigates the translatability of one particular type of vagueness in HDNJ: conceptual vagueness that carries Chinese philosophical and cultural connotations. The corpus tool Sketch Engine is used to provide potential online contexts and word behaviours. Two selected English translations of HDNJ, one by a TCM practitioner and one by a non-practitioner, are used to examine the frequency and distribution of linguistic features of the translations. It was found that the hypothesis about the universals of translated language (explicitation, normalisation) holds in one translation, but at the cost of some original contextual connotations. Transliteration is purposefully used in the second translation to retain the original flavour, which is argued to violate the principle of relevance in communication because it yields few contextual effects and demands more processing effort from the reader. The translatability of conceptual vagueness in HDNJ is constrained by the source-language context and the reader’s cognitive environment.
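As a toy illustration of the kind of frequency-and-distribution comparison described here, the sketch below counts how often a transliterated term versus a paraphrase appears in two invented translation fragments; the texts and the term chosen are illustrative assumptions and are not drawn from the HDNJ corpora.

```python
from collections import Counter

# Hypothetical fragments of two translations of the same passage
translation_a = "the qi flows through the channels and the vital energy rises"
translation_b = "the vital energy moves in the vessels and the vital energy rises"

def token_frequencies(text):
    """Lower-case whitespace tokenisation followed by a frequency count."""
    return Counter(text.lower().split())

freq_a = token_frequencies(translation_a)
freq_b = token_frequencies(translation_b)

# Relative frequency (per 1,000 tokens) of a transliteration vs. a paraphrase
for label, freq in (("A", freq_a), ("B", freq_b)):
    total = sum(freq.values())
    print(label, 1000 * freq["qi"] / total, 1000 * freq["energy"] / total)
```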

Keywords: corpus-based translation, translatability, TCM classics, vague language

Procedia PDF Downloads 377
29094 Issues and Challenges in Social Work Field Education: The Field Coordinator's Perspective

Authors: Tracy B.E. Omorogiuwa

Abstract:

Understanding the role of social work in improving societal well-being cannot be separated from the place of field education, which is an integral aspect of social work education. Field learning provides students with knowledge and opportunities to experience solving issues in the field and gives them a sense of the practice situation. Although it is a crucial component of the social work curriculum, field education faces questions about its purpose and significance in society, and the issues and challenges surrounding it strongly shape learning outcomes. The aim of this paper is to provide insight into the specific ways in which field education has been conceived, realized, and valued in society. Emphasis is placed on the significance of field instruction, the link with classroom learning, and the structure of field experience in social work education. Drawing on documented analysis and experience, this study intends to contribute to the development of the social work curriculum by analyzing the patterns, issues, and challenges confronting social work field education at the University of Benin, Nigeria.

Keywords: challenges, curriculum, field education, social work education

Procedia PDF Downloads 299
29093 Prefabricated Integral Design of Building Services

Authors: Mina Mortazavi

Abstract:

The common approach in the construction industry, for restraint requirements in existing structures or new constructions, is to have Non-Structural Components (NSCs) assembled and installed on-site by different MEP subcontractors. This leads to a lack of coordination, higher costs, longer construction times, and complications due to inaccurate building information modelling (BIM) systems. Introducing NSCs into a consistent BIM system from the beginning of the design process and considering their seismic loads in the analysis and design process can improve coordination and reduce costs and time. One solution is to use prefabricated mounts with attached MEP services delivered as an integral module. This eliminates the majority of coordination complications and reduces design and installation costs and time. A more advanced approach is to install as many NSCs as possible in the same prefabricated module, which gives the structural engineer the opportunity to consider the component weights and locations involved in the analysis and design of the prefabricated support. This efficient approach eliminates coordination and access issues, leading to enhanced quality control. This research focuses on the existing literature on modular sub-assemblies that are integrated with architectural and structural components. Modular MEP systems take advantage of the precision provided by BIM tools to meet exact requirements and achieve a buildable design every time. Modular installations that include MEP systems provide efficient solutions for the installation of MEP services and components.

Keywords: building services, modularisation, prefabrication, integral building design

Procedia PDF Downloads 72
29092 Food Effects and Food Choices: Aligning the Two for Better Health

Authors: John Monro, Suman Mishra

Abstract:

Choosing foods for health benefits requires information that accurately represents the relative effectiveness of foods with respect to specific health end points, or with respect to responses leading to health outcomes. At present, consumers must rely on nutrient composition data and on health claims to guide them to healthy food choices. Nutrient information may be of limited usefulness because it does not reflect the effect of food structure and food component interactions, that is, whole-food effects. Health claims demand stringent criteria that exclude most foods, even though most foods have properties through which they may contribute to positive health outcomes in a diet. In this presentation, we show how the functional efficacy of foods may be expressed in the same format as nutrients, in weight units, as virtual food components that allow a nutrition information panel to show not only what a food is, but also what it does. Two body responses linked to well-being are considered, glycaemic response and colonic bulk, in order to illustrate the concept. We show how the nutrient information on available carbohydrates and the dietary fibre values obtained by food analysis methods fails to convey the glycaemic potency or the colonic bulking potential of foods, because of shortcomings in the methods and approach taken to food analysis. It is concluded that a category of food values that represent the functional efficacy of foods is required to accurately guide food choices for health.

Keywords: dietary fibre, glycaemic response, food values, food effects, health

Procedia PDF Downloads 502
29091 Wear Resistance and Mechanical Performance of Ultra-High Molecular Weight Polyethylene Influenced by Temperature Change

Authors: Juan Carlos Baena, Zhongxiao Peng

Abstract:

Ultra-high molecular weight polyethylene (UHMWPE) is extensively used in industrial and biomedical fields. The slippery nature of UHMWPE makes this material suitable for surface-bearing applications; however, the operational conditions limit the lubrication efficiency, inducing boundary and mixed lubrication in the tribological system. The lack of lubrication in a tribological system intensifies friction, contact stress and, consequently, operating temperature. As the temperature increases, the material’s mechanical properties are affected, and the lifespan of the component is reduced. How the mechanical properties and wear performance of UHMWPE change when the temperature is increased has not been clearly established. Understanding the wear and mechanical performance of UHMWPE at different temperatures is important for predicting and further improving the lifespan of these components. This study evaluates the effects of temperature variation in the range of 20 °C to 60 °C on the hardness and wear resistance of UHMWPE. A reduction in hardness and wear resistance was observed with increasing temperature. The wear rate increased by 94.8% when the temperature changed from 20 °C to 50 °C. Although hardness is regarded as an indicator of a material’s wear resistance, this study found that wear resistance decreased more rapidly than hardness with increasing temperature, evidencing low stability of this material over a short temperature interval. The reduction in hardness was reflected in the plastic deformation and abrasion intensity, resulting in a significant increase in wear rate.
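A quick back-of-the-envelope check of the reported temperature sensitivity, using an assumed baseline wear rate (only the 94.8% relative increase comes from the abstract):

```python
# Illustrative numbers: the baseline wear rate is assumed, the relative
# increase of 94.8% between 20 °C and 50 °C is taken from the abstract.
wear_rate_20C = 1.0e-6            # assumed wear rate at 20 °C (mm^3 / N·m)
relative_increase = 0.948         # 94.8% increase reported over 20 °C -> 50 °C
wear_rate_50C = wear_rate_20C * (1 + relative_increase)
print(f"Wear rate at 50 °C ≈ {wear_rate_50C:.2e} mm^3/N·m "
      f"({100 * (wear_rate_50C / wear_rate_20C - 1):.1f}% higher)")
```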

Keywords: hardness, surface bearing, tribological system, UHMWPE, wear

Procedia PDF Downloads 271
29090 Interactive Glare Visualization Model for an Architectural Space

Authors: Florina Dutt, Subhajit Das, Matthew Swartz

Abstract:

Lighting design and its impact on indoor comfort conditions are an integral part of good interior design. The impact of lighting in an interior space is manifold, involving many sub-components such as glare, color, tone, luminance, control, energy efficiency, and flexibility. While other components have been researched and discussed many times, this paper discusses research undertaken to understand the glare component from artificial lighting sources in an indoor space. The paper then presents a parametric model that conveys real-time glare levels in an interior space to the designer or architect. Our end users are architects, and for them it is of utmost importance to know what impression the proposed lighting arrangement and furniture layout will have on indoor comfort quality. This especially involves those furniture elements (or surfaces) that strongly reflect light around the space. Essentially, the designer needs to know the ramifications of discomfort glare at an early stage of the design cycle, when changes to the proposed design can still be made and different solution routes considered for the client. Unfortunately, most existing lighting analysis tools perform rigorous computation and analysis on the back end, which makes it challenging for the designer to analyze and understand glare from interior lighting quickly. Moreover, many of them do not focus on the glare aspect of artificial light. That is why, in this paper, we explain a novel approach to approximating interior glare data. In addition, we visualize these data in a color-coded format that expresses the implications of the proposed interior design layout. We focus on making the analysis process fluid and computationally fast, enabling full user interaction with the capability to vary ranges of user inputs, adding more degrees of freedom for the user. We test our proposed parametric model on a case study, a computer lab space in our college facility.

Keywords: computational geometry, glare impact in interior space, info visualization, parametric lighting analysis

Procedia PDF Downloads 350
29089 Revolutionary Solutions for Modeling and Visualization of Complex Software Systems

Authors: Jay Xiong, Li Lin

Abstract:

Existing software modeling and visualization approaches using UML are outdated. They are outcomes of reductionism and the superposition principle, which hold that the whole of a system is the sum of its parts, so that all tasks of software modeling and visualization are performed linearly, partially, and locally. This paper introduces revolutionary solutions for the modeling and visualization of complex software systems, which make complex software systems much easier to understand, test, and maintain. The solutions are based on complexity science, offering holistic, automatic, dynamic, virtual, and executable approaches about a thousand times more efficient than the traditional ones.

Keywords: complex systems, software maintenance, software modeling, software visualization

Procedia PDF Downloads 401
29088 Impacts of Artificial Intelligence on the Doctor-Patient Relationship: Ethical Principles, Informed Consent and Medical Obligation

Authors: Rafaella Nogaroli

Abstract:

Hypothetical cases involving AI algorithms that support clinical decisions are presented in order to discuss the importance of doctors respecting AI ethical principles. Regarding the principles of transparency and explanation, there is an impact on the new model of patient consent and on the understanding of qualified information. Moreover, human control of the technology (AI as a tool) should guide the physician's activity; otherwise, the physician breaches the patient's legitimate expectation of a specific result, with the consequent transformation of the nature of the medical obligation.

Keywords: medical law, artificial intelligence, ethical principles, patient's informed consent, medical obligations

Procedia PDF Downloads 102
29087 Pressure-Controlled Dynamic Equations of the PFC Model: A Mathematical Formulation

Authors: Jatupon Em-Udom, Nirand Pisutha-Arnond

Abstract:

The phase-field-crystal (PFC) approach is a density-functional-type material model with atomic resolution on a diffusive timescale. Spatially, the model incorporates the periodic nature of crystal lattices and can naturally exhibit elasticity, plasticity and crystal defects such as grain boundaries and dislocations. Temporally, the model operates on a diffusive timescale, which bypasses the need to resolve prohibitively small atomic-vibration time steps. The PFC model has been used to study many material phenomena such as grain growth, elastic and plastic deformations and solid-solid phase transformations. In this study, the pressure-controlled dynamic equations for the PFC model were developed to simulate a single-component system under externally applied pressure; these coupled equations are important for studies of deformable systems such as those under constant pressure. The formulation is based on non-equilibrium thermodynamics and the thermodynamics of crystalline solids. To obtain the equations, the entropy variation around the equilibrium point was derived. The resulting driving forces and fluxes around equilibrium were then obtained and rewritten as conventional thermodynamic quantities. These dynamic equations differ from recently proposed equations; the equations in this study should provide a more rigorous description of the system dynamics under externally applied pressure.

Keywords: driving forces and flux, evolution equation, non-equilibrium thermodynamics, Onsager’s reciprocal relation, phase-field-crystal model, thermodynamics of single-component solid

Procedia PDF Downloads 305
29086 Research on the Ecological Impact Evaluation Index System of Transportation Construction Projects

Authors: Yu Chen, Xiaoguang Yang, Lin Lin

Abstract:

Traffic engineering construction is an important form of infrastructure for economic and social development. During construction and operation, the ability to evaluate a project's environmental impact correctly is crucial to the rational operation of existing transportation projects, the sound development of new transportation construction, and the adoption of appropriate measures to carry out environmental protection work scientifically. Most existing research on ecological and environmental impact assessment is limited to individual aspects of the environment rather than an overall evaluation of the environmental system; in terms of conclusions, there are many qualitative analyses at the technical and policy levels but a lack of quantitative results and of quantitative, operable evaluation models. In this paper, a comprehensive analysis of the ecological and environmental impacts of transportation construction projects is conducted, and factors such as data accessibility and the reliability of calculation results are considered in order to extract indicators that reflect the essence and characteristics of these impacts. Qualitative evaluation indicators were screened using the expert review method and measured using the fuzzy statistics method, while quantitative indicators were screened using principal component analysis and measured through both literature review and calculation. An environmental impact evaluation index system with a general objective layer, a sub-objective layer and an indicator layer was established, dividing the environmental impact of a transportation construction project into two periods: the construction period and the operation period. On the basis of the evaluation index system, the index weights are determined using the analytic hierarchy process (AHP), and the individual indicators to be evaluated are made dimensionless, eliminating the influence of their original scales and meanings. Finally, the paper applies these results to actual engineering practice to verify the correctness and operability of the evaluation method.
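A minimal sketch of the aggregation step described here, assuming min-max normalisation to remove indicator units and a weighted sum with AHP-style weights; the indicator names, values and weights below are invented for illustration and are not the paper's index system.

```python
import numpy as np

# Hypothetical impact indicators for three projects (units differ per column)
indicators = np.array([
    [12.0, 0.35, 48.0],   # project A: noise (dB over limit), NOx (t/yr), land take (ha)
    [ 7.0, 0.50, 30.0],   # project B
    [15.0, 0.20, 55.0],   # project C
])
weights = np.array([0.3, 0.4, 0.3])   # assumed AHP-derived weights, summing to 1

def min_max_normalise(x):
    """Scale each indicator column to [0, 1], removing its original units."""
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))

# Composite score per project: higher = larger relative impact here
scores = min_max_normalise(indicators) @ weights
print(scores)
```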

Keywords: transportation construction projects, ecological and environmental impact, analysis and evaluation, indicator evaluation system

Procedia PDF Downloads 105
29085 Evaluating Psychosocial Influence of Dental Aesthetics: A Cross-Sectional Study

Authors: Mahjabeen Akbar

Abstract:

Dental aesthetics and its associated psychosocial influence have a significant impact on individuals. Correcting malocclusions is a key motivating factor for the majority of patients; however, psychosocial factors have rarely been incorporated in evaluating malocclusions. It is therefore necessary to study the psychosocial influence of malocclusion in patients. The study aimed to determine the psychosocial influence of dental aesthetics in dental students using the Psychosocial Impact of Dental Aesthetics Questionnaire and the self-rated Aesthetic Component of the Index of Orthodontic Treatment Need (IOTN). This was a quantitative study with a cross-sectional design. One hundred and twenty dental students (71 females and 49 males; mean age 24.5) were selected via purposive sampling from July to August 2019. Dental students with no previous orthodontic treatment were asked to fill out the Psychosocial Impact of Dental Aesthetics Questionnaire. Variables including self-confidence/insecurity, social influence, psychological influence and self-perception of the need for orthodontic treatment were evaluated through a series of statements, while dental aesthetics were evaluated using the IOTN Aesthetic Component. To determine significance, the Kruskal-Wallis test was used. The results show that all four variables measuring psychosocial impact had significant correlations with perceived malocclusion, with p-values of less than 0.01. The results indicate a strong psychological and social influence of altered dental aesthetics on an individual. Moreover, the relationship between IOTN-AC grading and the psychosocial well-being of an individual is confirmed, indicating that the perception of altered dental aesthetics is as important a factor in treatment need as the degree of malocclusion.
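For readers unfamiliar with the test, here is a minimal sketch of a Kruskal-Wallis comparison of psychosocial scores across self-rated aesthetic grades; the groups and score values are hypothetical, not the study's data.

```python
from scipy.stats import kruskal

# Hypothetical psychosocial scores grouped by self-rated IOTN-AC grade
grade_1_2 = [78, 82, 75, 80, 77]    # little or no perceived malocclusion
grade_3_4 = [70, 68, 72, 65, 71]
grade_5_plus = [58, 60, 55, 62, 57]

statistic, p_value = kruskal(grade_1_2, grade_3_4, grade_5_plus)
print(f"H = {statistic:.2f}, p = {p_value:.4f}")
if p_value < 0.01:
    print("Psychosocial scores differ significantly across aesthetic grades.")
```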

Keywords: dental aesthetics, malocclusion, psychosocial influence, dental students

Procedia PDF Downloads 152
29084 Generative Syntaxes: Macro-Heterophony and the Form of ‘Synchrony’

Authors: Luminiţa Duţică, Gheorghe Duţică

Abstract:

One of the most powerful language innovations in twentieth-century music was heterophony, a hypostasis of vertical syntax that entered the sphere of interest of many composers, such as George Enescu, Pierre Boulez, Mauricio Kagel, György Ligeti and others. Heterophonic syntax has a history of growth, that is, a succession of different concepts and writing techniques. The trajectory along which this phenomenon settled does not necessarily follow chronology: there are highly complex primary stages and advanced stages that return to simple forms of writing. In folklore, plurimelodic simultaneities are free or random and originate from the (unintentional) differences or ‘deviations’ from the state of unison, through a variety of ornaments, melismas, imitations, elongations and abbreviations, all within a flexible, non-periodic and immeasurable rhythmic framework proper to parlando-rubato rhythm. Within the general framework of multivocal organization, heterophonic syntax in its elaborate (academic) version imposed itself relatively late compared with polyphony and homophony. The explanation is simple if we consider the causal relationship between the elements of the sound vocabulary, in this case modalism, and the typologies of vertical organization appropriate to it. Therefore, extending the ‘classic’ pathway of writing typologies (monody, polyphony, homophony), heterophony, applied equally to structures of modal, serial or synthetic vocabulary, necessarily claims a macrotemporal form of its own, in the sense of the analogies enshrined by the evolution of musical styles and languages: polyphony→fugue, homophony→sonata. Concerned with the prospect of building a new musical ontology, the composer Ştefan Niculescu experimented, along with the mathematical organization of heterophony according to his own original methods, with the possibility of extrapolating this phenomenon to the macrostructural plane, arriving in this way at the unique form of ‘synchrony’. Founded on the coincidentia oppositorum principle (involving the ‘one-multiple’ binomial), the sound architecture imagined by Ştefan Niculescu consists of one (temporal) model/algorithm articulating two sound states: 1. the monovocal state (principle of identity) and 2. the multivocal state (principle of difference). In this context, heterophony becomes an (auto)generative mechanism with macrotemporal amplitude, a strategy the composer developed throughout virtually his entire output (see the works Ison I, Ison II, Unisonos I, Unisonos II, Duplum, Triplum, Psalmus, and Héterophonies pour Montreux (homages to Enescu and Bartók), etc.). For the present demonstration, we selected one of the most illuminating works of Ştefan Niculescu, Symphony II, Opus dacicum, in which the form of (heterophony-)synchrony acquires monumental-symphonic features, representing an emblematic case of the level of complexity achieved by this type of vertical syntax in twentieth-century music.

Keywords: heterophony, modalism, serialism, synchrony, syntax

Procedia PDF Downloads 345
29083 Monitorization of Junction Temperature Using a Thermal-Test-Device

Authors: B. Arzhanov, A. Correia, P. Delgado, J. Meireles

Abstract:

Due to the higher power loss levels in electronic components, the thermal design of the PCBs (printed circuit boards) of an assembled device becomes one of the most important quality factors in electronics. Some of the leading causes of microelectronic component failure are high temperatures, leakage and thermo-mechanical stress, so the reliability of microelectronic packages is a major concern. This article presents an experimental approach to measuring the junction temperature of exposed-pad packages. The implemented solution, currently at a prototype stage, uses a temperature-sensitive parameter (TSP) to measure temperature directly on the die, validating the numerical results provided by Mechanical APDL (ANSYS Parametric Design Language) under the same conditions. The physical device under test is composed of a Thermal Test Chip (TTC-1002) assembled in a QFN cavity and soldered to a test board according to JEDEC standards. Monitoring the voltage drop across a forward-biased diode is an indirect but accurate method of obtaining the junction temperature of the QFN component over an applied power range of 0.3 W to 1.5 W. The temperature distributions on the PCB test board and the QFN cavity surface were monitored by an infrared thermal camera (Goby-384) controlled, and its images processed, by the Xeneth software. The article provides a set-up to monitor the junction temperature of ICs in real time, namely devices with an exposed-pad package (i.e., QFN). It presents the PCB layout parameters that the designer should use to improve thermal performance and evaluates the impact of voids in the solder interface on the device junction temperature.
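To illustrate the TSP principle, the sketch below maps a measured diode forward voltage back to junction temperature through a linear calibration fitted at known temperatures; the calibration points and the roughly -2 mV/°C slope are typical assumed values, not the TTC-1002 calibration data.

```python
import numpy as np

# Assumed calibration: forward voltage at a fixed sense current, measured at
# known die temperatures (illustrative values only).
calib_temps_C = np.array([25.0, 50.0, 75.0, 100.0])
calib_vf_V    = np.array([0.650, 0.600, 0.550, 0.500])

# Linear fit T = slope * Vf + intercept (Vf falls as temperature rises)
slope, intercept = np.polyfit(calib_vf_V, calib_temps_C, 1)

def junction_temperature(vf_measured):
    """Estimate junction temperature (°C) from the measured forward voltage."""
    return slope * vf_measured + intercept

print(junction_temperature(0.572))   # ~64 °C with this assumed calibration
```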

Keywords: quad flat no-lead packages, exposed pads, junction temperature, thermal management and measurements

Procedia PDF Downloads 286
29082 Contact-Impact Analysis of Continuum Compliant Athletic Systems

Authors: Theddeus Tochukwu Akano, Omotayo Abayomi Fakinlede

Abstract:

A proper understanding of the behavior of compliant mechanisms used by athletes is important in order to avoid catastrophic failure. Compliant mechanisms such as the flex-run require knowledge of their dynamic response and deformation behavior under quickly varying loads. The finite deformations of the compliant athletic system are modeled with a Neo-Hookean material under contact-impact conditions. The dynamic impact-contact governing equations for both the target and the impactor are derived based on the updated Lagrangian approach. A method in which the contactor and target are considered as a united body is applied in formulating the principle of virtual work for the bodies. In this paper, methods of continuum mechanics and the nonlinear finite element method were deployed to develop a model that can capture the behavior of the compliant athletic system under quickly varying loads. A hybrid system of symbolic algebra (AceGen) and a compiled back end (AceFEM) was employed, leveraging both ease of use and computational efficiency. The simulated results reveal the effect of the various contact-impact conditions on the deformation behavior of the impacting compliant mechanism.

Keywords: eigenvalue problems, finite element method, Robin boundary condition, Sturm-Liouville problem

Procedia PDF Downloads 473
29081 Dependence of Densification, Hardness and Wear Behaviors of Ti6Al4V Powders on Sintering Temperature

Authors: Adewale O. Adegbenjo, Elsie Nsiah-Baafi, Mxolisi B. Shongwe, Mercy Ramakokovhu, Peter A. Olubambi

Abstract:

The sintering step in powder metallurgy (P/M) processes is very sensitive, as it determines to a large extent the properties of the final component produced. Over the past decade, spark plasma sintering has been extensively used to consolidate a wide range of materials, including metallic alloy powders. This novel, non-conventional sintering method has proven advantageous, offering full densification of materials, high heating rates, low sintering temperatures and short sintering cycles compared with conventional sintering methods. Ti6Al4V is regarded as the most widely used α+β alloy due to its impressive mechanical performance in service environments, especially in the aerospace and automobile industries, being a light metal alloy with the capacity for the fuel efficiency needed in these industries. The P/M route has been a promising method for fabricating parts from Ti6Al4V alloy due to its reduction of cost and material loss and its ability to produce near-net and intricate shapes. However, the use of this alloy has been largely limited by its relatively poor hardness and wear properties. The effect of sintering temperature on the densification, hardness and wear behavior of spark plasma sintered Ti6Al4V powders was investigated in this study. Sintering of the alloy powders was performed in the 650-850 °C temperature range at a constant heating rate, applied pressure and holding time of 100 °C/min, 50 MPa and 5 min, respectively. Density measurements were carried out according to Archimedes' principle, and microhardness tests were performed on sectioned, as-polished surfaces at a load of 100 gf and a dwell time of 15 s. Dry sliding wear tests were performed at sliding loads of 5, 15, 25 and 35 N using a ball-on-disc tribometer configuration with WC as the counterface material. Microstructural characterization of the sintered samples and wear tracks was carried out using SEM and EDX techniques. The density and hardness of the sintered samples increased with increasing sintering temperature. Near-full densification (99.6% of the theoretical density) and a Vickers micro-indentation hardness of 360 HV were attained at 850 °C. The coefficient of friction (COF) and wear depth improved significantly with increased sintering temperature under all the loading conditions examined, except at 25 N, indicating better mechanical properties at higher sintering temperatures. Worn surface analyses showed that the wear mechanism was a synergy of adhesive and abrasive wear, with the former prevalent.
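A minimal sketch of the Archimedes density measurement and the relative-density figure quoted above; the sample masses are hypothetical, and the water density and theoretical Ti6Al4V density are nominal textbook values rather than the study's measurements.

```python
# Archimedes' principle: bulk density from dry mass and apparent mass in water
RHO_WATER = 0.9978          # g/cm^3 near room temperature (nominal)
RHO_THEORETICAL = 4.43      # g/cm^3, nominal theoretical density of Ti6Al4V

def archimedes_density(mass_in_air_g, mass_in_water_g, rho_fluid=RHO_WATER):
    """Bulk density from dry mass and apparent mass suspended in the fluid."""
    return mass_in_air_g * rho_fluid / (mass_in_air_g - mass_in_water_g)

rho = archimedes_density(mass_in_air_g=5.012, mass_in_water_g=3.879)
relative_density = 100 * rho / RHO_THEORETICAL
print(f"Bulk density {rho:.3f} g/cm^3, {relative_density:.1f}% of theoretical")
```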

Keywords: hardness, powder metallurgy, spark plasma sintering, wear

Procedia PDF Downloads 273
29080 A Review of Spatial Analysis as a Geographic Information Management Tool

Authors: Chidiebere C. Agoha, Armstong C. Awuzie, Chukwuebuka N. Onwubuariri, Joy O. Njoku

Abstract:

Spatial analysis is a field of study that utilizes geographic or spatial information to understand and analyze patterns, relationships, and trends in data. Because it analyzes data in the context of location and surroundings, it differs from non-spatial or aspatial techniques, which do not consider the geographic context and may not provide as complete an understanding of the data. Spatial analysis is applied in a variety of fields, including urban planning, environmental science, geosciences, epidemiology, and marketing, to gain insights and make decisions about complex spatial problems. This review paper explores definitions of spatial analysis from various sources, including examples of its application and different analysis techniques such as buffer analysis, interpolation, and kernel density analysis (multi-distance spatial cluster analysis). It also contrasts spatial analysis with non-spatial analysis.
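As a small, concrete example of one technique named here, buffer analysis, the sketch below builds a 500 m buffer around a point of interest and checks which other points fall inside it; the coordinates are hypothetical projected values in metres, and the shapely library is assumed to be available.

```python
from shapely.geometry import Point

# Hypothetical projected coordinates (metres): a clinic and two case locations
clinic = Point(304500.0, 5430250.0)
cases = [Point(304620.0, 5430300.0), Point(305900.0, 5431100.0)]

# Buffer analysis: polygon of all points within 500 m of the clinic
buffer_500m = clinic.buffer(500.0)
inside = [case for case in cases if buffer_500m.contains(case)]

print(len(inside), "of", len(cases), "cases fall within 500 m of the clinic")
```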

Keywords: aspatial technique, buffer analysis, epidemiology, interpolation

Procedia PDF Downloads 319
29079 Entrepreneurship under the Effect of Information Technology

Authors: Mohammad Hadi Khorashadi Zadeh

Abstract:

An entrepreneur is a manager or owner of a commercial company who creates resources and wealth through risk-taking and initiative. A netpreneur has the capability to run an online business; all it needs is connectivity. As long as an entrepreneur has a service that the market demands, he or she can set up a feasible and viable business with intellectual capital as the principal input and the connectivity infrastructure as the only physical input. The internet is possibly the most significant revolution in science and technology that our generation could imagine. It has brought various benefits to society, culture, the economy and politics. The entrepreneur is a valued member of the community, providing services to society, including employment.

Keywords: entrepreneur, Netpreneur, intellectual capital, infrastructure

Procedia PDF Downloads 327
29078 Sustainable Design for Building Envelope in Hot Climates: A Case Study for the Role of the Dome as a Component of an Envelope in Heat Exchange

Authors: Akeel Noori Almulla Hwaish

Abstract:

Architectural design is influenced by the actual thermal behaviour of building components, and this in turn depends not only on their steady and periodic thermal characteristics but also on exposure effects, orientation, surface colour, and climatic fluctuations at the given location. Design data and environmental parameters should be produced accurately for specified locations so that architects and engineers can confidently apply them in design calculations that enable precise evaluation of the influence of the various parameters relating to each component of the envelope, and hence of the overall thermal performance of the building. The present paper assesses the thermal behaviour and characteristics of the opaque and transparent parts of the dome, one of the most distinctive and symbolic elements of the building envelope, its behaviour under the impact of solar temperatures, and its role in heat exchange for specific U-values of alternative construction materials. The research method considers the specified hot-dry climate, with a new mosque in Baghdad, Iraq, as a case study. Data are presented in light of the criteria for indoor thermal comfort in terms of design parameters and a thermal assessment of a 'model dome'. Design alternatives and energy conservation considerations are also discussed using comparative computer simulations. The findings are drawn together into conclusions clarifying the important role of the dome in the heat exchange of the whole building envelope for approaching indoor thermal comfort, and into directions for further research.
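To make the role of the U-value concrete, here is a minimal sketch of the steady-state heat gain through a dome element, Q = U x A x ΔT, compared for two construction alternatives; the U-values, area and temperature difference are assumed illustration figures, not the case-study data.

```python
# Steady-state conductive heat flow through an envelope element: Q = U * A * dT
def heat_gain_w(u_value_w_m2k, area_m2, delta_t_k):
    """Heat flow (W) through an element of area A with overall U-value U."""
    return u_value_w_m2k * area_m2 * delta_t_k

dome_alternatives = {
    "uninsulated concrete shell": 3.0,   # assumed U-values, W/(m^2.K)
    "insulated composite shell":  0.6,
}
area = 120.0        # m^2 of dome surface (assumed)
delta_t = 18.0      # K, outdoor sol-air minus indoor setpoint (assumed)

for name, u in dome_alternatives.items():
    print(f"{name}: {heat_gain_w(u, area, delta_t) / 1000:.1f} kW")
```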

Keywords: building envelope, sustainable design, dome impact, hot-climates, heat exchange

Procedia PDF Downloads 475
29077 Research of Data Cleaning Methods Based on Dependency Rules

Authors: Yang Bao, Shi Wei Deng, WangQun Lin

Abstract:

This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, and proposes several key steps of a typical cleaning process. It puts forward a data cleaning framework with good scalability and versatility and, for data with attribute dependency relations, designs several violation-data discovery algorithms expressed as formal formulas, which can find data inconsistent with all target columns under conditional attribute dependencies, whether the data are structured (SQL) or unstructured (NoSQL). Six data cleaning methods based on these algorithms are then given.
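A minimal sketch of what violation discovery for a simple attribute dependency looks like in practice, here a functional dependency zip_code → city over a handful of invented records; the paper's framework and formal algorithms are more general than this illustration.

```python
from collections import defaultdict

# Hypothetical records containing one dirty value
records = [
    {"zip_code": "10001", "city": "New York"},
    {"zip_code": "10001", "city": "New York"},
    {"zip_code": "60601", "city": "Chicago"},
    {"zip_code": "60601", "city": "Chicgo"},     # dirty value
]

def find_violations(rows, lhs, rhs):
    """Group rows by the left-hand-side attribute and flag groups whose
    right-hand-side values disagree (violations of lhs -> rhs)."""
    groups = defaultdict(set)
    for row in rows:
        groups[row[lhs]].add(row[rhs])
    return {key: values for key, values in groups.items() if len(values) > 1}

print(find_violations(records, "zip_code", "city"))
# {'60601': {'Chicago', 'Chicgo'}} -> candidate rows for repair
```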

Keywords: data cleaning, dependency rules, violation data discovery, data repair

Procedia PDF Downloads 564
29076 Durability Analysis of a Knuckle Arm Using VPG System

Authors: Geun-Yeon Kim, S. P. Praveen Kumar, Kwon-Hee Lee

Abstract:

A steering knuckle arm is the component that connects the steering system and the suspension system. Structural performance measures such as stiffness, strength, and durability are considered in its design process. A former study suggested a lightweight design of a knuckle arm that considers these structural performance measures using metamodel-based optimization. Six shape design variables were defined, and the optimum design was calculated by applying the kriging interpolation method. The finite element method was utilized to predict the structural responses. The suggested knuckle was made of the aluminum alloy Al6082, and its weight was reduced by about 60% compared with the base steel knuckle, satisfying the design requirements. We then investigated its manufacturability by performing forging analysis. The forging was carried out as a hot process, and the product was made through two-step forging. As the final step of the development process, durability is investigated using the flexible dynamic analysis software LS-DYNA and the pre- and post-processor eta/VPG. Generally, a carmaker does not share all relevant information with the part manufacturer. Thus, the part manufacturer is limited in predicting durability performance at the full-vehicle level. However, eta/VPG provides libraries of commonly used suspensions, tires, and roads, which makes full-vehicle modeling possible. First, the full vehicle is modeled by referencing the following information: overall length 3,595 mm, overall width 1,595 mm, curb vehicle weight (CVW) 910 kg, front suspension MacPherson strut, rear suspension torsion beam axle, tires 235/65R17. Second, a cobblestone road is selected; its condition is almost 10 times more severe than that of a usual paved road. Third, dynamic finite element analysis using LS-DYNA is performed to predict the durability performance of the suggested knuckle arm. The life of the suggested knuckle arm is calculated as 350,000 km, which satisfies the design requirement set by the part manufacturer. In this study, the overall design process of a knuckle arm is suggested, and the developed knuckle arm is shown to satisfy the durability requirement at the full-vehicle level. The VPG analysis is performed successfully even though it does not give an exact prediction, since the full-vehicle model is a rough one. Thus, this approach can be used effectively when detailed full-vehicle information is not available.

Keywords: knuckle arm, structural optimization, Metamodel, forging, durability, VPG (Virtual Proving Ground)

Procedia PDF Downloads 419
29075 Rotational Energy Recovery System

Authors: Vijayendra Anil Menon, Ashwath Narayan Murali

Abstract:

Present-day vehicles do not reuse the energy expended in running the vehicle; it is dissipated immediately, and this has remained the case for many decades. With vehicles running on non-renewable resources such as fossil fuels, there is an urgent need to improve vehicle efficiency until a reliable replacement for fossil fuels is found. Our design is based on the concept of kinetic energy recovery systems (KERS). Though it shares its principle with KERS, our design can be used in day-to-day driving. With our design, vehicle efficiency increases and fuel is conserved, thereby reducing the carbon footprint.

Keywords: KERS, battery, wheels, efficiency

Procedia PDF Downloads 393
29074 Numerical Analysis of Solar Cooling System

Authors: Nadia Allouache, Mohamed Belmedani

Abstract:

Solar energy is a sustainable, virtually inexhaustible and environmentally friendly alternative to the fossil fuels available. It is a renewable and economical energy source that can be harnessed sustainably over the long term and thus stabilizes energy costs. Solar cooling technologies have been developed to reduce the growing electricity consumption for air conditioning and to displace the peak load during hot summer days. A numerical analysis of the thermal and solar performance of an annular finned adsorber, the most important component of an adsorption solar refrigeration system, is carried out in this work. Different adsorbent/adsorbate pairs, namely activated carbon AC35/methanol, activated carbon AC35/ethanol, and activated carbon BPL/ammonia, are considered in this study. Modeling the adsorption cooling machine requires solving the equations describing energy and mass transfer in the tubular finned adsorber. The Wilson and Dubinin-Astakhov models of the solid-adsorbate equilibrium are used to calculate the adsorbed quantity. The porous medium and the fins are contained in the annular space, and the adsorber is heated by solar energy. The effects of key parameters on the adsorbed quantity and on the thermal and solar performance are analysed and discussed. The AC35/methanol pair is the best pair, compared with the BPL/ammonia and AC35/ethanol pairs, in terms of system performance. The system performance is sensitive to the fin geometry. For the data measured on clear days in July 2023 in Algeria and Morocco, the performance of the cooling system is most significant in Algeria.
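As a pointer to how the Dubinin-Astakhov equilibrium enters such a model, the sketch below evaluates w = w0·exp(-(A/E)^n) with A = R·T·ln(Psat/P); the w0, E and n values and the operating pressures are assumed illustrative numbers, not the paper's fitted AC35/methanol coefficients.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol.K)

def dubinin_astakhov(T_K, P_Pa, Psat_Pa, w0=0.30, E=6000.0, n=2.0):
    """Adsorbed quantity (kg adsorbate per kg adsorbent), Dubinin-Astakhov form.
    w0, E and n are illustrative pair-specific parameters (assumed here)."""
    A = R * T_K * np.log(Psat_Pa / P_Pa)     # adsorption potential, J/mol
    return w0 * np.exp(-((A / E) ** n))

# Example: adsorber at 90 °C with assumed evaporator/saturation pressures
print(dubinin_astakhov(T_K=363.15, P_Pa=5_000.0, Psat_Pa=150_000.0))
```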

Keywords: activated carbon AC35-methanol pair, activated carbon AC35-ethanol pair, activated carbon BPL-ammonia pair, annular finned adsorber, performance coefficients, numerical analysis, solar cooling system

Procedia PDF Downloads 55
29073 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems

Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos

Abstract:

Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is a mandatory requirement, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and the number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8 and Windows 10), Apple's Mac and Linux, are assessed for their discovered vulnerabilities and the risk associated with each. Each discovered and reported vulnerability has an exploitability score assigned in the CVSS record of the National Vulnerability Database. In this study, the risk from vulnerabilities in each of the five operating systems is compared. The risk indexes used are developed based on a Markov model to evaluate the risk of each vulnerability. The statistical methodology and the underlying mathematical approach are described. Initially, parametric procedures were conducted and measured; however, violations of some statistical assumptions were observed, and therefore the need for non-parametric approaches was recognized. A total of 6,838 recorded vulnerabilities were considered in the analysis. According to the risk associated with all the vulnerabilities considered, it was found that there is a statistically significant difference among average risk levels for some operating systems, indicating that, according to our method and given its assumptions and limitations, some operating systems have been more vulnerable than others. Relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
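The sketch below shows one way a Markov-style risk index could be built from CVSS exploitability scores, using a two-state (not-exploited/exploited) chain; the transition structure, the scaling of scores to probabilities, and the patch rate are assumptions for illustration and are not the authors' exact model.

```python
import numpy as np

def risk_index(exploitability_score, steps=12, patch_rate=0.3):
    """Probability of being in the 'exploited' state after a number of steps,
    with the exploitation probability scaled from a CVSS score (0-10)."""
    p_exploit = exploitability_score / 10.0
    # States: 0 = not exploited, 1 = exploited; rows sum to 1
    P = np.array([[1 - p_exploit, p_exploit],
                  [patch_rate,    1 - patch_rate]])
    state = np.array([1.0, 0.0])          # start in the not-exploited state
    for _ in range(steps):
        state = state @ P
    return state[1]

# Hypothetical exploitability scores for two operating systems
for os_name, scores in {"OS-A": [3.9, 8.6, 10.0], "OS-B": [2.2, 3.9]}.items():
    print(os_name, np.mean([risk_index(s) for s in scores]))
```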

Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system

Procedia PDF Downloads 183
29072 Study on the DC Linear Stepper Motor to Industrial Applications

Authors: Nolvi Francisco Baggio Filho, Roniele Belusso

Abstract:

Many industrial processes require a precise linear motion. Usually, this movement is achieved with rotary motors combined with electrical control systems and mechanical systems such as gears, pulleys and bearings. Other types of devices are based on linear motors, where the linear motion is obtained directly. The linear stepper motor (MLP) is an excellent solution for industrial applications that require precise positioning and high speed. This study presents an MLP formed by a static linear structure of ferromagnetic material and a mover structure on which three coils are mounted. Mechanical suspension systems allow linear movement between the static and mover parts while maintaining a constant air gap. The operating principle is based on the tendency of the magnetic flux to align along the path of least reluctance. The force is proportional to the intensity of the electric current, and the speed is proportional to the excitation frequency of the coils. The study of this device is based on numerical and experimental analyses to verify the relationship between the applied electric current and the developed planar force. In addition, the magnetic field in the air-gap region is also monitored.
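A minimal sketch of the two proportionalities stated above, planar force versus coil current and linear speed versus excitation frequency; the force constant and step length are assumed illustration values, not measured motor parameters.

```python
# Assumed illustration constants, not measured data
K_FORCE = 12.0        # N per ampere (assumed force constant)
STEP_LENGTH = 0.002   # m travelled per excitation step (assumed)

def planar_force(current_a):
    """Planar traction force, proportional to coil current (F ∝ i)."""
    return K_FORCE * current_a

def linear_speed(frequency_hz):
    """Linear speed, proportional to excitation frequency (v ∝ f)."""
    return STEP_LENGTH * frequency_hz

print(planar_force(2.5), "N at 2.5 A;", linear_speed(400), "m/s at 400 Hz")
```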

Keywords: linear stepper motor, planar traction force, magnetic reluctance, industrial applications

Procedia PDF Downloads 500
29071 Structural Inequality and Precarious Workforce: The Role of Labor Laws in Destabilizing the Labor Force in Iran

Authors: Iman Shabanzadeh

Abstract:

Over the last three decades, the main demands of the Iranian workforce have focused on three areas: the right to a decent wage, the right to organize, and the right to job security. To investigate and analyze this situation, the present study focuses on the component of job security. The purpose of the study is to identify which mechanisms in Iran's labor law have led to the destabilization and undermining of workers' job security. The research method is descriptive-analytical. To collect information, library and documentary sources on laws related to labor rights in Iran, as well as semi-structured interviews with experts, were used. In the data analysis stage, the qualitative content analysis method was used. Trend analysis of statistics on the labor force situation in Iran over the last three decades shows that the employment structure has faced an increase in the active population; in the last decade, however, a large part of this population has been active mainly in the service sector and in enterprises without formal contracts, so a smaller share of this employment has insurance coverage and a larger share experiences underemployment. In this regard, the results of this study show that four contexts constitute the main legal and executive mechanisms of labor instability in Iran: 1) the temporaryization of the labor force through differing interpretations of the labor law, 2) labor adjustment in the public sector and the emergence of manpower contracting companies, 3) the cessation of labor law protection for workers in small workshops, and 4) numerous restrictions on the effective organization of workers. The theoretical conclusion of this article is that the main root of the challenges facing the labor community and the precarious workforce in Iran is the existence of structural inequalities in the field of labor security, traces of which can be seen in the legal provisions and executive regulations in this field.

Keywords: inequality, precariat, temporaryization, labor force, labor law

Procedia PDF Downloads 61
29070 Developing Oral Communication Competence in a Second Language: The Communicative Approach

Authors: Ikechi Gilbert

Abstract:

Oral communication is the transmission of ideas or messages through speech. Acquiring competence in this area, which by its volatile nature is prone to errors and inaccuracies, requires the adoption of a well-suited teaching methodology. Efficient oral communication facilitates the exchange of ideas and the easy accomplishment of day-to-day tasks through a demonstrated mastery of oral expression and the making of fine presentations to audiences or individuals, while recognizing the verbal signals and body language of others and interpreting them correctly. In Anglophone states such as Nigeria and Ghana, French, for instance, is studied as a foreign language, being taught mainly to learners whose mother tongue is different from French. The same applies to Francophone states, where English is studied as a foreign language by people whose official language or mother tongue is different from English. The ideal approach is to teach these languages in these environments through a pedagogy that properly takes care of the oral dimension, so that learners understand and apply the language effectively. In this article, we examine the communicative approach as a methodology for teaching oral communication in a foreign language. This method is a direct response to the communicative needs of the learner, involving the use of appropriate materials and teaching techniques that meet those needs. It is also a marked improvement over the traditional grammar-based and audio-visual approaches. Our contribution focuses on the pedagogical component of oral communication improvement, highlighting its merits and proposing diverse techniques, including aspects of information and communication technology, that would help the second-language learner communicate better orally.

Keywords: communication, competence, methodology, pedagogical component

Procedia PDF Downloads 266
29069 COVID-19 Genomic Analysis and Complete Evaluation

Authors: Narin Salehiyan, Ramin Ghasemi Shayan

Abstract:

This chapter addresses the manipulation of coronavirus clones and complementary DNAs (cDNAs) of defective-interfering (DI) RNAs in order to investigate coronavirus RNA replication, transcription, recombination, protein processing and transport, virion assembly, the identification of coronavirus-specific cell receptors, and polymerase processing. The coronavirus genome is a nonsegmented, single-stranded, positive-sense RNA. Compared with other RNA viruses, its size is significantly greater, ranging from 27 to 32 kb. The gene encoding the large surface glycoprotein spans about 4.4 kb and encodes an imposing trimeric, highly glycosylated protein. This protein projects about 20 nm above the virion envelope, giving the virus the appearance, with a little imagination, of a crown or coronet. Coronavirus research has contributed to the understanding of many aspects of molecular biology in general, such as the mechanism of RNA synthesis, translational control, and protein transport and processing. It remains a treasure trove capable of generating unexpected insights.

Keywords: COVID-19, coronavirus, genome, genetics

Procedia PDF Downloads 72