Search results for: predicting models
2605 An Engineer-Oriented Life Cycle Assessment Tool for Building Carbon Footprint: The Building Carbon Footprint Evaluation System in Taiwan
Authors: Hsien-Te Lin
Abstract:
The purpose of this paper is to introduce the BCFES (building carbon footprint evaluation system), which is an LCA (life cycle assessment) tool developed by the Low Carbon Building Alliance (LCBA) in Taiwan. A qualified BCFES for the building industry should fulfill the function of evaluating the carbon footprint throughout all stages in the life cycle of building projects, including the production, transportation and manufacturing of materials, construction, daily energy usage, renovation and demolition. However, many existing BCFESs are too complicated and not very designer-friendly, creating obstacles in the implementation of carbon reduction policies. One of the greatest obstacles is the misapplication of the carbon footprint inventory standards of PAS2050 or ISO14067, which are designed for mass-produced goods rather than building projects. When these product-oriented rules are applied to building projects, one must compute a tremendous amount of data for raw materials and the transportation of construction equipment throughout the construction period based on purchasing lists and construction logs. This verification method is very cumbersome by nature and unhelpful to the promotion of low carbon design. With a view to providing an engineer-oriented BCFES with pre-diagnosis functions, a component input/output (I/O) database system and a scenario simulation method for building energy are proposed herein. Most existing BCFESs base their calculations on a product-oriented carbon database for raw materials like cement, steel, glass, and wood. However, data on raw materials is meaningless for the purpose of encouraging carbon reduction design without a feedback mechanism, because an engineering project is not designed based on raw materials but rather on building components, such as flooring, walls, roofs, ceilings, roads or cabinets. The LCBA Database has been compiled from existing carbon footprint databases for raw materials and architectural graphic standards. Project designers can now use the LCBA Database to conduct low carbon design in a much simpler and more efficient way. Daily energy usage throughout a building's life cycle, including air conditioning, lighting, and electric equipment, is very difficult for the building designer to predict. A good BCFES should provide a simplified and designer-friendly method to overcome this obstacle in predicting energy consumption. In this paper, the author has developed a simplified tool, the dynamic Energy Use Intensity (EUI) method, to accurately predict energy usage with simple multiplications and additions using EUI data and the designed efficiency levels for the building envelope, AC, lighting and electrical equipment. Remarkably simple to use, it can help designers pre-diagnose hotspots in the building carbon footprint and further enhance low carbon designs. The BCFES-LCBA offers the advantages of an engineer-friendly component I/O database, simplified energy prediction methods, pre-diagnosis of carbon hotspots and sensitivity to good low carbon designs, making it an increasingly popular carbon management tool in Taiwan. To date, about thirty projects have been awarded BCFES-LCBA certification and the assessment has become mandatory in some cities.
Keywords: building carbon footprint, life cycle assessment, energy use intensity, building energy
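The dynamic EUI method described above reduces the energy prediction to multiplications and additions of end-use EUI values and design efficiency levels. The sketch below illustrates that arithmetic in Python; the baseline EUI values, efficiency factors, service life and grid emission factor are invented placeholders, not figures from the BCFES-LCBA database.

```python
# Hypothetical sketch of a dynamic-EUI style estimate: baseline end-use EUIs
# (kWh/m2/yr) are scaled by design efficiency levels, summed, and converted to
# an operational carbon footprint. All numbers are illustrative assumptions.

BASELINE_EUI = {"ac": 60.0, "lighting": 25.0, "equipment": 30.0}  # kWh/m2/yr

def annual_energy_use(floor_area_m2, efficiency_levels):
    """Scale each baseline end-use EUI by its design efficiency level (1.0 = baseline)."""
    eui = sum(BASELINE_EUI[k] * efficiency_levels.get(k, 1.0) for k in BASELINE_EUI)
    return eui * floor_area_m2  # kWh per year

def life_cycle_operational_carbon(floor_area_m2, efficiency_levels,
                                  years=60, grid_factor=0.5):
    """Operational carbon over the service life (kgCO2e); grid_factor in kgCO2e/kWh."""
    return annual_energy_use(floor_area_m2, efficiency_levels) * years * grid_factor

if __name__ == "__main__":
    design = {"ac": 0.8, "lighting": 0.7, "equipment": 1.0}  # assumed envelope/AC/lighting upgrades
    total = life_cycle_operational_carbon(10_000, design)
    print(f"{total / 1e3:,.0f} tCO2e over a 60-year service life")
```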
Procedia PDF Downloads 139
2604 A Parametric Study on Lateral Torsional Buckling of European IPN and IPE Cantilevers
Authors: H. Ozbasaran
Abstract:
IPN and IPE sections, which are commonly used European I shapes, are widely used in steel structures as cantilever beams to support overhangs. A considerable number of studies exist on calculating the lateral torsional buckling load of I sections. However, most of them provide series solutions or complex closed-form equations. In this paper, a simple equation is presented to calculate the lateral torsional buckling load of IPN and IPE section cantilever beams. First, the differential equation of lateral torsional buckling is solved numerically for various loading cases. Then a parametric study is conducted on the results to present an equation for the lateral torsional buckling load of European IPN and IPE beams. Finally, the results obtained by the presented equation are compared to the differential equation solutions and finite element model results. ABAQUS software is utilized to generate the finite element models of the beams. It is seen that the results obtained from the presented equation coincide with the differential equation solutions and the ABAQUS results. It can be suggested that the presented formula can be safely used to calculate the critical lateral torsional buckling load of European IPN and IPE section cantilevers.
Keywords: cantilever, IPN, IPE, lateral torsional buckling
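For readers unfamiliar with the quantities involved, the sketch below evaluates the classical elastic critical moment for lateral torsional buckling of a doubly symmetric I-section under uniform moment with fork supports. It only illustrates the section properties (E, G, Iz, It, Iw, L) that such closed-form expressions combine; it is not the cantilever equation proposed in the paper, and the IPE-like section properties are approximate.

```python
import math

# Classical elastic critical moment for LTB of a doubly symmetric I-section under
# uniform moment, simply supported (fork) ends. Shown for illustration only; it is
# NOT the cantilever equation of the paper. Units: N and mm.

def m_cr_uniform_moment(E, G, Iz, It, Iw, L):
    """M_cr = (pi^2*E*Iz/L^2) * sqrt(Iw/Iz + L^2*G*It/(pi^2*E*Iz))"""
    a = math.pi**2 * E * Iz / L**2
    return a * math.sqrt(Iw / Iz + L**2 * G * It / (math.pi**2 * E * Iz))

# Approximate IPE 200-like properties, for demonstration only
m_cr = m_cr_uniform_moment(E=210e3, G=81e3, Iz=1.42e6, It=6.98e4, Iw=1.3e10, L=3000)
print(f"M_cr ~ {m_cr / 1e6:.1f} kNm")
```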
Procedia PDF Downloads 540
2603 The Batch Method Approach for Adsorption Mechanism Processes of Some Selected Heavy Metal Ions and Methylene Blue by Using Chemically Modified Luffa Cylindrica
Authors: Akanimo Emene, Mark D. Ogden, Robert Edyvean
Abstract:
Adsorption is a low-cost, efficient and economically viable wastewater treatment process. However, this treatment process has not been fully applied due to the complex and not fully understood nature of the adsorption system. Optimizing the process requires choosing a suitable adsorbent and further studying the experimental parameters that influence the design of the adsorption system. A chemically modified adsorbent, Luffa cylindrica, was used to adsorb heavy metal ions and an organic pollutant, methylene blue, from aqueous environmental solution under varying experimental conditions. The experimental factors studied were adsorption time, initial metal ion or organic pollutant concentration, ionic strength, and pH of the solution. The experimental data were analyzed with kinetic and isotherm models. The antagonistic effect of methylene blue and some heavy metal ions was recorded. An understanding of the use of this treated Luffa cylindrica for the removal of these toxic substances will establish and improve the commercial application of the adsorption process in the treatment of contaminated waters.
Keywords: adsorption, heavy metal ions, Luffa cylindrica, wastewater treatment
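As an illustration of the isotherm analysis step, the sketch below fits batch equilibrium data to the Langmuir and Freundlich models with SciPy; the data points are synthetic placeholders, not measurements on the modified Luffa cylindrica.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch of an isotherm fit on batch equilibrium data (Ce, qe).
# The data below are synthetic, for demonstration of the fitting step only.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0])     # equilibrium conc. (mg/L)
qe = np.array([8.1, 13.5, 19.8, 25.9, 30.2, 31.5])      # uptake (mg/g), synthetic

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[30.0, 0.05])
(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[5.0, 2.0])
print(f"Langmuir: qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg")
print(f"Freundlich: KF = {KF:.2f}, n = {n:.2f}")
```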
Procedia PDF Downloads 197
2602 Determination of Weld Seam Thickness in Welded Connection Subjected to Local Buckling Effects
Authors: Tugrul Tulunay, Iyas Devran Celik
Abstract:
When the materials used in the structural steel industry are evaluated, box beam profiles are considerably preferred. As a result of the cross-sectional properties that these profiles possess, the connection of these profiles to each other, and to profiles having different types of cross sections, is made viable by means of additional measures. An important point to note in such combinations is the continuous transfer of internal forces from element to element. To ensure this continuity, a header plate is needed. The connection of the plates to the elements works mainly through welds. This study aims to determine the ideal weld thickness for box beams under bending and for joints exposed to the local buckling that forms in the column. The box column to box beam connection designed in this context was made by means of corner and circular fillet welds. Numerical models with corner welds of different thicknesses and lengths, depending on the plate dimensions, were created with the help of the ANSYS Workbench program, and their behaviours were examined.
Keywords: welding thickness, box beam-column joints, design of steel structures, calculation and construction principles 2016, welded joints under local buckling
Procedia PDF Downloads 167
2601 Evaluating the Prominence of Chemical Phenomena in Chemistry Courses
Authors: Vanessa R. Ralph, Leah J. Scharlott, Megan Y. Deshaye, Ryan L. Stowe
Abstract:
Given the traditions of chemistry teaching, one may not question whether chemical phenomena play a prominent role. Yet, the role of chemical phenomena in an introductory chemistry course may define the extent to which the course is introductory, chemistry, and equitable. Picture, for example, the classic Ideal Gas Law problem. If one envisions a prompt wherein students are tasked with calculating a missing variable, then one envisions a prompt that relies on chemical phenomena as a context rather than as a model to understand the natural world. Consider instead a prompt wherein students are tasked with applying molecular models of gases to explain why the vapor pressure of a gaseous solution of water differs from that of carbon dioxide. Here, the chemical phenomenon is not only the context but also the subject of the prompt. Deliveries of general and organic chemistry were identified as ranging widely in the integration of chemical phenomena. The more incorporated the phenomena, the more equitable the assessment task was for students with varying access to pre-college math and science preparation. How chemical phenomena are integrated may very well define whether courses are chemistry, are introductory, and are equitable. Educators of chemistry are invited, as colleagues, to discuss the role of chemical phenomena in their courses and to consider the long-lasting impacts of replicating tradition for tradition's sake.
Keywords: equitable educational practices, chemistry curriculum, content organization, assessment design
Procedia PDF Downloads 197
2600 Finite Element Simulation of an Offshore Monopile Subjected to Cyclic Loading Using Hypoplasticity with Intergranular Strain Anisotropy (ISA) for the Soil
Authors: William Fuentes, Melany Gil
Abstract:
Numerical simulations of offshore wind turbines (OWTs) in shallow waters demand sophisticated models considering the cyclic nature of the environmental loads. For the case of an OWT founded on sands, rapid loading may cause a reduction of the effective stress of the soil surrounding the structure. This eventually leads to its settlement, tilting, or other issues affecting its serviceability. In this work, a 3D FE model of an OWT founded on sand is constructed and analyzed. Cyclic loading with different histories is applied at certain points of the tower to simulate some environmental forces. The mechanical behavior of the soil is simulated through the recently proposed ISA-hypoplastic model for sands. The Intergranular Strain Anisotropy ISA can be interpreted as an enhancement of the intergranular strain theory, often used to extend hypoplastic formulations for the simulation of cyclic loading. In contrast to previous formulations, the proposed constitutive model introduces an elastic range for small strain amplitudes, includes the cyclic mobility effect and is able to capture the cyclic behavior of sands under a larger number of cycles. The model performance is carefully evaluated on the FE dynamic analysis of the OWT.Keywords: offshore wind turbine, monopile, ISA, hypoplasticity
Procedia PDF Downloads 246
2599 Shear Strengthening of RC T-Beams by Means of CFRP Sheets
Authors: Omar A. Farghal
Abstract:
This research aimed to experimentally and analytically investigate the contribution of bonded web carbon fiber reinforced polymer (CFRP) sheets to the shear strength of reinforced concrete (RC) T-beams. Two strengthening techniques using CFRP strips were applied along the shear-span zone: the first is a vertical U-jacket, and the latter consists of vertical strips bonded to the beam sides only. The fibers of both the U-jacket and the side sheets were vertically oriented (θ = 90°). Test results showed that the strengthening technique with U-jacket CFRP sheets particularly improved the shear strength. Three mechanisms of failure were recognized for the tested beams, depending upon the end condition of the bonded CFRP sheet. Although the failure mode of the different beams was a brittle one, the strengthened beams provided with U-jacket CFRP sheets showed a more or less ductile behavior at higher loading levels, up to a load level just before failure. As a consequence, these beams achieved an acceptable enhancement in structural ductility. Moreover, the obtained results concerning both the strains induced in the CFRP sheets and the maximum loads are used to study the applicability of the analytical models proposed in this study (ACI code) to predict the nominal shear strength of the strengthened beams.
Keywords: carbon fiber reinforced polymer, wrapping, ductility, shear strengthening
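A simplified sketch of the kind of ACI-type check referred to above is given below: the FRP shear contribution of vertically oriented U-jacket strips is added to the concrete and steel contributions. Effective-strain limits and bond-reduction coefficients are folded into a single assumed eps_fe, and all input values are illustrative, not those of the tested T-beams.

```python
import math

# Simplified ACI 440.2R-style sketch of the FRP shear contribution for U-jacket
# strips. Detailed strain limits and reduction factors are not reproduced; the
# effective strain eps_fe and all inputs below are assumed values.

def frp_shear_contribution(n_plies, t_f, w_f, s_f, d_fv, E_f, eps_fe, alpha_deg=90.0):
    A_fv = 2.0 * n_plies * t_f * w_f                 # FRP shear reinforcement area (mm2)
    f_fe = eps_fe * E_f                              # effective FRP stress (MPa)
    a = math.radians(alpha_deg)
    return A_fv * f_fe * (math.sin(a) + math.cos(a)) * d_fv / s_f   # N

def nominal_shear(Vc, Vs, Vf, psi_f=0.85):
    # psi_f = 0.85 is the typical reduction for U-wraps (1.0 for fully wrapped sections)
    return Vc + Vs + psi_f * Vf

Vf = frp_shear_contribution(n_plies=1, t_f=0.17, w_f=50, s_f=100, d_fv=350,
                            E_f=230e3, eps_fe=0.004)
print(f"Vf = {Vf / 1e3:.1f} kN, Vn = {nominal_shear(80e3, 60e3, Vf) / 1e3:.1f} kN")
```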
Procedia PDF Downloads 255
2598 Adsorption of Malachite Green Dye on Graphene Oxide Nanosheets from Aqueous Solution: Kinetics and Thermodynamics Studies
Authors: Abeer S. Elsherbiny, Ali H. Gemeay
Abstract:
In this study, graphene oxide (GO) nanosheets have been synthesized and characterized using different tools such as X-ray diffraction spectroscopy, Fourier transform infrared (FT-IR) spectroscopy, BET specific surface area analysis and transmission electron microscopy (TEM). The prepared GO was investigated for the removal of malachite green, a cationic dye, from aqueous solution. The removal of malachite green proceeded via an adsorption process. GO nanosheets can be expected to be a good adsorbent material for the adsorption of cationic species. The adsorption of malachite green onto the GO nanosheets has been carried out under different experimental conditions such as contact time, adsorbate concentration, pH, and temperature. The kinetics of the adsorption data were analyzed using four kinetic models: the pseudo-first-order model, the pseudo-second-order model, intraparticle diffusion, and the Boyd model, to understand the adsorption behavior of malachite green onto the GO nanosheets and the mechanism of adsorption. The adsorption isotherm of malachite green onto the GO nanosheets has been investigated at 25, 35 and 45 °C. The equilibrium data were fitted well by the Langmuir model. Various thermodynamic parameters such as the Gibbs free energy (ΔG°), enthalpy (ΔH°), and entropy (ΔS°) changes were also evaluated. The interaction of malachite green with the GO nanosheets has been investigated by FT-IR spectroscopy.
Keywords: adsorption, graphene oxide, kinetics, malachite green
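The sketch below reproduces two of the analysis steps named in the abstract, a linearised pseudo-second-order kinetic fit and van't Hoff thermodynamics, on synthetic numbers; none of the values are the reported GO / malachite green results.

```python
import numpy as np

# (i) Linearised pseudo-second-order fit: t/qt = 1/(k2*qe^2) + t/qe
# (ii) van't Hoff relation: ln K = -dH/(R*T) + dS/R, with dG = dH - T*dS
# All data are synthetic placeholders.

t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 90.0])      # contact time (min)
qt = np.array([35.0, 52.0, 68.0, 80.0, 85.0, 88.0])    # uptake (mg/g), synthetic

slope, intercept = np.polyfit(t, t / qt, 1)
qe = 1.0 / slope                                       # equilibrium capacity (mg/g)
k2 = slope**2 / intercept                              # rate constant, g/(mg*min)
print(f"pseudo-second-order: qe = {qe:.1f} mg/g, k2 = {k2:.4f} g/(mg.min)")

R = 8.314                                              # J/(mol*K)
T = np.array([298.15, 308.15, 318.15])                 # 25, 35, 45 degC
K_eq = np.array([2.1, 2.6, 3.2])                       # equilibrium constants (synthetic)
vh_slope, vh_intercept = np.polyfit(1.0 / T, np.log(K_eq), 1)
dH, dS = -vh_slope * R, vh_intercept * R
dG = dH - T * dS
print(f"dH = {dH/1e3:.1f} kJ/mol, dS = {dS:.1f} J/(mol.K), dG(25 C) = {dG[0]/1e3:.1f} kJ/mol")
```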
Procedia PDF Downloads 411
2597 Rheology Study of Polyurethane (COAPUR 6050) For Composite Materials Usage
Authors: Sabrina Boutaleb, Kouider Halim Benrahou, François Schosseler, Abdelouahed Tounsi, El Abbas Adda Bedia
Abstract:
The use of polyurethane in different areas is becoming more frequent. This is due to the significant advantages it offers, including its lightness and resistance. However, its use requires a mastery of its mechanical performance. In this work, we present COAPUR 6050, which can be used to develop composite materials. COAPUR 6050 is an associative polyurethane thickener allowing fine rheological adjustment of flat or semi-gloss paints. COAPUR 6050 is characterised by its thickening efficiency at low shear rates. It is a solvent-free liquid product. It promotes good paint pick-up, while maintaining a low yield point after shearing, and consequently good levelling. We then determine its rheological behaviour experimentally using different annular gaps. The rheological properties of COAPUR 6050 were investigated with a rotational rheometer (Rheometer Mars III) using different annular gaps. The size of the annular gap influences both the behaviour and the rheological parameters of COAPUR 6050. The rheological data of COAPUR 6050 were regressed by a nonlinear regression method and rheological models were established; the material is characterized by a yield-pseudoplastic model. In this case, it is essential to make a viscometric correction. The latter was developed and presented in the experimental results.
Keywords: COAPUR 6050, Couette flow, polyurethane, rheological behaviours
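The abstract describes regressing the flow curves to a yield-pseudoplastic model; the Herschel-Bulkley form tau = tau0 + K*gamma_dot^n is a common choice for such behaviour and is used in the hedged sketch below with synthetic data points, not the measured COAPUR 6050 flow curves.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed yield-pseudoplastic (Herschel-Bulkley) regression on a synthetic
# flow curve; the model choice and data are illustrative, not the paper's.
gamma_dot = np.array([0.1, 1.0, 10.0, 50.0, 100.0, 300.0])   # shear rate (1/s)
tau = np.array([12.5, 15.0, 24.0, 43.0, 58.0, 95.0])         # shear stress (Pa), synthetic

def herschel_bulkley(g, tau0, K, n):
    return tau0 + K * g**n

(tau0, K, n), _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[10.0, 1.0, 0.5])
apparent_viscosity = herschel_bulkley(gamma_dot, tau0, K, n) / gamma_dot  # Pa.s
print(f"tau0 = {tau0:.1f} Pa, K = {K:.2f} Pa.s^n, n = {n:.2f}")
```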
Procedia PDF Downloads 501
2596 Conceptual Model Providing More Information on the Contact Situation between Crime Victim and the Police
Authors: M. Inzunza
Abstract:
In contemporary society, victims of crime have been given more recognition, which has contributed to advancing knowledge on the effects of crime. There is complexity in who is granted the status of victim, and the typology of good versus bad victims can interfere with the victim's contact situation with the police. The aim of this study is to identify the most central areas affecting the contact situation between crime victims and the police and to develop a conceptual model that can be used empirically. By considering previously documented problem areas and different theoretical domains, a conceptual model has been developed. Preliminary findings suggest that an area that should be given attention is gaining a better understanding of the victim, not only in terms of demographics but also in terms of risk behavior and social network. This area has been considered to influence the status of the crime victim. Another domain of value is the type of crime and, in more detail, the context of the incident. The police officer's approach style in the contact situation is also a pertinent area, which is influenced by how police-based victim services are organized and how well individual police officers are suited for the mission. Suitability includes constructs from empathy models adapted to the police context, especially focusing on sub-constructs such as perspective taking. The discussion will focus on how these findings can be operationalized in practice and how they are used in ongoing empirical studies.
Keywords: empathy, perspective taking, police contact, victim of crime
Procedia PDF Downloads 137
2595 Multiscale Computational Approach to Enhance the Understanding, Design and Development of CO₂ Catalytic Conversion Technologies
Authors: Agnieszka S. Dzielendziak, Lindsay-Marie Armstrong, Matthew E. Potter, Robert Raja, Pier J. A. Sazio
Abstract:
Reducing carbon dioxide (CO₂) is one of the greatest global challenges. Conversion of CO₂ for utilisation across the synthetic fuel, pharmaceutical, and agrochemical industries offers a promising option, yet requires significant research to understand the complex multiscale processes involved. To experimentally understand and optimize such processes at the catalytic sites, and to explore the impact of the process at reactor scale, is too expensive. Computational methods offer significant insight and flexibility but require a more detailed multi-scale approach, which is a significant challenge in itself. This work introduces a computational approach which incorporates detailed catalytic models, taken from experimental investigations, into a larger-scale computational fluid dynamics framework. The reactor-scale species transport approach is modified near the catalytic walls to determine the influence of catalytic clustering regions. This coupling approach enables more accurate modelling of velocity, pressures, temperatures, species concentrations and near-wall surface characteristics, which will ultimately enable the impact of overall reactor design on chemical conversion performance to be assessed.
Keywords: catalysis, CCU, CO₂, multi-scale model
Procedia PDF Downloads 253
2594 Driver Behavior Analysis and Inter-Vehicular Collision Simulation Approach
Authors: Lu Zhao, Nadir Farhi, Zoi Christoforou, Nadia Haddadou
Abstract:
The safety testing required for deploying intelligent connected vehicles (ICVs) on the road network is a critical challenge. Road traffic network simulation can be used to test the functionality of ICVs, which is not only time-saving and less energy-consuming but can also create scenarios with car collisions. However, the relationship between different human driver behaviors and car-collision occurrences has not been clearly understood; meanwhile, the procedure for generating car collisions in numerical traffic simulators is not fully integrated. In this paper, we propose an approach to identify specific driver profiles from real driving data; then, we replicate them in numerical traffic simulations with the purpose of generating inter-vehicular collisions. We propose three profiles: (i) 'aggressive', with a short time headway; (ii) 'inattentive', with a long reaction time; and (iii) 'normal', with intermediate values of reaction time and time headway. These three driver profiles are extracted from the NGSIM dataset and simulated using the intelligent driver model (IDM) with a reaction-time extension. Finally, the generation of inter-vehicular collisions is performed by varying the percentages of the different profiles.
Keywords: vehicular collisions, human driving behavior, traffic modeling, car-following models, microscopic traffic simulation
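The intelligent driver model (IDM) acceleration law used for the simulations is standard; the sketch below shows it together with illustrative, assumed parameter values for the three profiles (the actual calibrated values come from the NGSIM data). The reaction-time extension is represented here only as a per-profile parameter tau_r.

```python
import math

# Standard IDM acceleration law; profile parameters below are illustrative
# assumptions, not the values calibrated on NGSIM in the paper.

def idm_acceleration(v, gap, dv, v0=33.0, T=1.5, a=1.0, b=2.0, s0=2.0, delta=4):
    """v: own speed (m/s), gap: bumper-to-bumper distance (m), dv: v - v_leader."""
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** delta - (s_star / gap) ** 2)

PROFILES = {                                  # time headway T (s), reaction time tau_r (s)
    "aggressive":  {"T": 0.8, "tau_r": 0.6},
    "normal":      {"T": 1.5, "tau_r": 1.0},
    "inattentive": {"T": 1.6, "tau_r": 2.0},  # tau_r would delay the perceived state
}

for name, p in PROFILES.items():
    acc = idm_acceleration(v=25.0, gap=20.0, dv=3.0, T=p["T"])
    print(f"{name:12s} T={p['T']} s, tau_r={p['tau_r']} s -> a = {acc:+.2f} m/s^2")
```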
Procedia PDF Downloads 171
2593 Integrating Assurance and Risk Management of Complex Systems
Authors: Odd Ivar Haugen
Abstract:
This paper explores the relationship between assurance, risk, and risk management in the context of complex safety-related systems. It introduces a nuanced understanding of assurance and argues that the grounds for justified confidence in claims made about a complex system are related to the system's behaviour. It emphasises the importance of knowledge as the cornerstone of assurance. The paper addresses the challenges of epistemic and aleatory uncertainties inherent in safety-critical systems. A systems approach is proposed to model emergent properties and complexity using the composition, environment, structure, mechanisms (CESM) metamodel, offering a structured framework for analysing system behaviour. The interplay between assurance and risk management is conceptualised through two models: the domain model and the control model. Assurance and risk management depend on each other to reduce uncertainty and control risk levels. This work highlights the dual roles of assurance in risk management, acting as an epistemic actuator on the one side and providing feedback about the strength of the justification on the other. Assurance and risk management have inseparable roles in ensuring safety in complex systems.
Keywords: assurance, CESM metamodel, confidence, emergent properties, knowledge, objectivity, risk, system behaviour, system safety
Procedia PDF Downloads 4
2592 Explainable Graph Attention Networks
Authors: David Pham, Yongfeng Zhang
Abstract:
Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs such as Graph Neural Networks (GNN) on various data mining and machine learning tasks. However, most of the deep learning models on graphs cannot easily explain their predictions and are thus often labelled as “black boxes.” For example, Graph Attention Network (GAT) is a frequently used GNN architecture, which adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both the accuracy and explainability of problem spaces and show that in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.Keywords: explainable AI, graph attention network, graph neural network, node classification
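For readers who want the attention step XGAT builds on, the minimal numpy sketch below computes single-head GAT attention coefficients and the attention-weighted aggregation; the weights, features and small graph are random placeholders, and the explainability machinery of XGAT is not reproduced.

```python
import numpy as np

# Single-head GAT attention: e_ij = LeakyReLU(a^T [W h_i || W h_j]),
# normalised by a softmax over each node's neighbourhood. Illustrative only.
rng = np.random.default_rng(0)
F_in, F_out, N = 4, 3, 5
H = rng.normal(size=(N, F_in))                 # node features
W = rng.normal(size=(F_in, F_out))             # shared linear transform
a = rng.normal(size=(2 * F_out,))              # attention vector
adj = np.array([[1, 1, 1, 0, 0], [1, 1, 0, 1, 0], [1, 0, 1, 0, 1],
                [0, 1, 0, 1, 1], [0, 0, 1, 1, 1]])   # includes self-loops

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

Wh = H @ W
e = leaky_relu(np.array([[a @ np.concatenate([Wh[i], Wh[j]]) for j in range(N)]
                         for i in range(N)]))
e = np.where(adj > 0, e, -1e9)                 # mask non-neighbours before softmax
alpha = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)
H_out = alpha @ Wh                             # attention-weighted aggregation
print(np.round(alpha, 2))                      # row i: how much node i attends to each node
```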
Procedia PDF Downloads 199
2591 Modelling of Structures by Advanced Finite Elements Based on the Strain Approach
Authors: Sifeddine Abderrahmani, Sonia Bouafia
Abstract:
The finite element method is the most practical tool for the analysis of structures, whatever the geometrical shape and behavior. It is extensively used in many high-tech industries, such as civil or military engineering, for the modeling of bridges, motor bodies, fuselages, and airplane wings. Additionally, experience demonstrates that engineers like modeling their structures using the most basic finite elements. Numerous models of finite elements may be utilized in the numerical analysis depending on the interpolation field that is selected, and it is generally known that convergence to the proper value will occur considerably more quickly with a good displacement pattern than with a poor pattern, saving computation time. The method for creating finite elements using the strain-based approach (S.B.A.) is presented in this paper. When the results are compared with those provided by equivalent displacement-based elements having the same total number of degrees of freedom, excellent convergence can be obtained in application and validation tests using recently developed membrane elements, plate bending elements, and flat shell elements. The effectiveness and performance of the strain-based finite elements in modeling structures are proven by the findings for deflections and stresses.
Keywords: finite elements, plate bending, strain approach, displacement formulation, shell element
Procedia PDF Downloads 99
2590 Digitalization of Functional Safety - Increasing Productivity while Reducing Risks
Authors: Michael Scott, Phil Jarrell
Abstract:
Digitalization seems to be everywhere these days. So if one were to digitalize Functional Safety, what would that require?
• Ability to directly use data from intelligent P&IDs / process design in a PHA / LOPA
• Ability to directly use data from intelligent P&IDs in the SIS Design to support SIL Verification Calculations, SRS, C&Es, Functional Test Plans
• Ability to create Unit Operation / SIF Libraries to radically reduce engineering manhours while ensuring consistency and improving quality of SIS designs
• Ability to link data directly from a PHA / LOPA to SIS Designs
• Ability to leverage reliability models and SRS details from SIS Designs to automatically program the Safety PLC
• Ability to leverage SIS Test Plans to automatically create Safety PLC application logic Test Plans for a virtual FAT
• Ability to tie real-time data from Process Historians / CMMS to assumptions in the PHA / LOPA and SIS Designs to generate leading indicators on protection layer health
• Ability to flag SIS bad actors for proactive corrective actions prior to a near miss or loss of containment event
What if I told you all of this was available today? This paper will highlight how the digital revolution has revolutionized the way Safety Instrumented Systems are designed, configured, operated and maintained.
Keywords: IEC 61511, safety instrumented systems, functional safety, digitalization, IIoT
Procedia PDF Downloads 181
2589 Studies on Dye Removal by Aspergillus niger Strain
Authors: M. S. Mahmoud, Samah A. Mohamed, Neama A. Sobhy
Abstract:
For color removal from wastewater containing organic contaminants, biological treatment systems have been widely used, alongside physical and chemical methods such as flocculation and coagulation. Fungal decolorization of dye-containing wastewater is one of the important goals in industrial wastewater treatment. This work aimed to characterize an Aspergillus niger strain for dye removal from aqueous solution and from raw textile wastewater. Batch experiments were conducted for the removal of color using the fungal isolate biomass under different conditions. Environmental conditions like pH, contact time, adsorbent dose and initial dye concentration were studied. The influence of pH on the removal of the azo dye by Aspergillus niger was examined between pH 1.0 and pH 11.0. The optimum pH for red dye decolorization was 9.0. Results showed that dye decolorization decreased with increasing initial dye concentration. The adsorption data were analyzed based on equilibrium isotherm models (the Freundlich and Langmuir models). In the adsorption isotherm studies, dye removal was better fitted by the Freundlich model. The isolated fungal biomass was characterized according to its surface area both before and after the decolorization process by scanning electron microscope (SEM) analysis. The results indicate that the isolated fungal biomass showed a high affinity for the dye in the decolorization process.
Keywords: biomass, biosorption, dye, isotherms
Procedia PDF Downloads 305
2588 Walmart Sales Forecasting using Machine Learning in Python
Authors: Niyati Sharma, Om Anand, Sanjeev Kumar Prasad
Abstract:
Estimating future sales values is one of the essential elements of tactical development for any organization. Walmart sales forecasting is a fine illustration to work with as a beginner, since it involves a major retail data set. Walmart also uses this sales estimation problem for hiring purposes. We would like to analyze how internal and external effects on one of the largest companies in the US play out in its future weekly sales. Demand forecasting is the estimation of the future requirement for products or services on the basis of present and previous data and the different stages of the market. All organizations face an unknown future, and future demand for goods cannot be known exactly. Hence, by exploring historical statistics and recent market statistics, we forecast the forthcoming demand for, and production of, individual goods, which is increasingly challenging in the near future. As a result, the required products can be produced in line with market demand in advance. We use several machine learning models to test forecasting accuracy and finally train on the whole data set: linear regression fitted to the training data gives an accuracy of 8.88%, while the extra trees regression model gives the best accuracy of 97.15%.
Keywords: random forest algorithm, linear regression algorithm, extra trees classifier, mean absolute error
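A hedged sketch of the model comparison is given below using scikit-learn; the feature matrix is randomly generated for illustration, whereas the study itself uses the Walmart store, department, holiday and markdown features.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

# Compare a linear model and an extra-trees ensemble on a synthetic,
# deliberately non-linear target; data and scores are placeholders.
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 5.0 * np.sin(3 * X[:, 2]) + rng.normal(0, 0.5, 2000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for model in (LinearRegression(), ExtraTreesRegressor(n_estimators=200, random_state=0)):
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(type(model).__name__,
          f"R2 = {r2_score(y_te, pred):.3f}, MAE = {mean_absolute_error(y_te, pred):.3f}")
```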
Procedia PDF Downloads 149
2587 Theology and Music in the XXI. Century: An Exploratory Study of Current Interrelation
Authors: Andrzej Kesiak
Abstract:
Contemporary theology is often accused of answering questions that nobody is asking, and of employing hermetic language that has lost its communication capacity. There is also a question that theology is asking itself: how theological discourse can still influence other disciplines, and how to overcome the separation of theology and belief. Undoubtedly, in the wider spectrum, theological discourse has been and will continue to be needed. The difficulty is finding the right model for it, a model that would help theology enter into dialogue with culture, art, science, and politics. Presumably, there is no single such model; theology constantly needs to seek such models, and this is probably a never-ending journey. In other words, theology should adopt the profile of ‘a restless being’ if it wants to remain influential. Music, on the other hand, has always been very close to theology; in fact, a huge part of classical music is either sacred or religious. Many composers sought inspiration in religion, liturgy, religious painting and sacred texts. This paper will argue that, despite all that, a proper and factual dialogue still seems to be in its starting phase. Such a thing as a reciprocal relationship between theology and music definitely exists, but it has not yet been theoretically developed enough. The correlation between musical and theological disciplines constitutes a very broad and complex discourse. Therefore, this study narrows the subject and puts it in a specific context: theology and music in the XXI century. This paper is a text-based study; it is therefore based on textual analysis with elements of textual hermeneutics.
Keywords: music, theology, reciprocal relationship between theology and music, XXI Century
Procedia PDF Downloads 158
2586 Spontaneous and Posed Smile Detection: Deep Learning, Traditional Machine Learning, and Human Performance
Authors: Liang Wang, Beste F. Yuksel, David Guy Brizan
Abstract:
A computational model of affect that can distinguish between spontaneous and posed smiles with no errors on a large, popular data set using deep learning techniques is presented in this paper. A Long Short-Term Memory (LSTM) classifier, a type of Recurrent Neural Network, is utilized and compared to human classification. Results showed that while human classification (mean of 0.7133) was above chance, the LSTM model was more accurate than human classification and other comparable state-of-the-art systems. Additionally, a high accuracy rate was maintained with small numbers of training videos (70 instances). The important features were derived and analyzed to further understand the success of our computational model, and it was inferred that thousands of pairs of points within the eyes and mouth are important throughout all time segments of a smile. This suggests that distinguishing between a posed and a spontaneous smile is a complex task, which may account for the difficulty and lower accuracy of human classification compared to machine learning models.
Keywords: affective computing, affect detection, computer vision, deep learning, human-computer interaction, machine learning, posed smile detection, spontaneous smile detection
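A minimal sketch of an LSTM sequence classifier of the kind described above is shown below in PyTorch; the per-frame feature dimension (e.g., facial landmark coordinates), sequence length and hyper-parameters are assumptions for illustration, not the paper's configuration.

```python
import torch
import torch.nn as nn

# Minimal LSTM binary classifier over per-frame feature sequences.
# Dimensions, labels and data below are dummy placeholders.
class SmileLSTM(nn.Module):
    def __init__(self, n_features=136, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)             # one logit: posed vs spontaneous

    def forward(self, x):                            # x: (batch, frames, features)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1]).squeeze(-1)

model = SmileLSTM()
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(8, 90, 136)                          # 8 clips, 90 frames, 136 features
y = torch.randint(0, 2, (8,)).float()                # 0 = posed, 1 = spontaneous (dummy)
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(float(loss))
```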
Procedia PDF Downloads 125
2585 An Artificial Intelligence Supported QUAL2K Model for the Simulation of Various Physiochemical Parameters of Water
Authors: Mehvish Bilal, Navneet Singh, Jasir Mushtaq
Abstract:
Water pollution puts people's health at risk, and it can also impact the ecology. For practitioners of integrated water resources management (IWRM), water quality modelling may be useful for informing decisions about pollution control (such as discharge permitting) or demand management (such as abstraction permitting). Mathematical simulation, which establishes an effective relation between pollution sources, pollutant transport, and water quality, is regarded as one of the best tools for estimating the current pollutant load and the movement of contaminants. The current study involves the QUAL2K model, which includes manual simulation of the various physiochemical characteristics of water. To this end, various sensors could be installed for the automatic simulation of various physiochemical characteristics of water. An artificial intelligence model has been proposed for the automatic simulation of water quality parameters. Models of water quality have become an effective tool for identifying worldwide water contamination, as well as the ultimate fate and behavior of contaminants in the water environment. Water quality model research is primarily conducted in Europe and other industrialized countries in the first world, where theoretical underpinnings and practical research are prioritized.
Keywords: artificial intelligence, QUAL2K, simulation, physiochemical parameters
Procedia PDF Downloads 105
2584 A Neural Network Model to Simulate Urban Air Temperatures in Toulouse, France
Authors: Hiba Hamdi, Thomas Corpetti, Laure Roupioz, Xavier Briottet
Abstract:
Air temperatures are generally higher in cities than in their rural surroundings. The overheating of cities is a direct consequence of increasing urbanization, characterized by the artificial filling of soils, the release of anthropogenic heat, and the complexity of urban geometry. This phenomenon, referred to as urban heat island (UHI), is more prevalent during heat waves, which have increased in frequency and intensity in recent years. In the context of global warming and urban population growth, helping urban planners implement UHI mitigation and adaptation strategies is critical. In practice, the study of UHI requires air temperature information at the street canyon level, which is difficult to obtain. Many urban air temperature simulation models have been proposed (mostly based on physics or statistics), all of which require a variety of input parameters related to urban morphology, land use, material properties, or meteorological conditions. In this paper, we build and evaluate a neural network model based on Urban Weather Generator (UWG) model simulations and data from meteorological stations that simulate air temperature over Toulouse, France, on days favourable to UHI.Keywords: air temperature, neural network model, urban heat island, urban weather generator
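The sketch below shows one plausible form of such a model: a small multilayer-perceptron regression of street-level air temperature on urban-morphology and weather predictors, trained here on synthetic data; the feature set and values are invented placeholders rather than the UWG simulations or station records used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Assumed predictors: building density, mean building height, rural reference
# temperature, wind speed. The synthetic target adds a density-driven UHI term.
rng = np.random.default_rng(1)
n = 1500
X = np.column_stack([
    rng.uniform(0, 1, n),        # building density (placeholder)
    rng.uniform(0, 30, n),       # mean building height (m)
    rng.uniform(10, 35, n),      # rural reference temperature (degC)
    rng.uniform(0, 10, n),       # wind speed (m/s)
])
y = X[:, 2] + 2.5 * X[:, 0] - 0.15 * X[:, 3] + rng.normal(0, 0.3, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0))
model.fit(X, y)
print("predicted urban air temperature (degC):", model.predict([[0.8, 20.0, 28.0, 2.0]]).round(2))
```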
Procedia PDF Downloads 91
2583 Urban Resilience: Relation between COVID-19 and Urban Environment in Amman City
Authors: Layla Mujahed
Abstract:
COVID-19 is a test of all of a city's systems. It has exposed gaps in many of them, such as healthcare, the economy, social services, and the environment. This pandemic is paving the way for a new era, an era of technology: it has changed people's lives physically and emotionally and has shifted communication to digital channels. The effects of COVID-19 have reached every part of the city. COVID-19 will not be the last pandemic our cities will face. For that reason, more research is focusing on enhancing the quality of the urban environment. This pandemic encourages a rethinking of the environment's role, especially in cities. Cities are trying to provide the most suitable strategies and regulations to prevent the spread of COVID-19, and an example of that is Amman city. Amman has seen a sharp increase in the number of people infected with COVID-19, although it had controlled the situation for months. For that reason, this paper studies the relationship between COVID-19 and the urban environment, examines cases of cities around the world, and learns from their models for facing COVID-19. In Amman, people's behavior has changed towards public transportation and public green spaces. New governmental regulations focus on increasing people's mental awareness, supporting local businesses, and enhancing neighborhood planning, which can help Amman face any future pandemics.
Keywords: COVID-19, urban environment, urban planning, urban resilience
Procedia PDF Downloads 123
2582 Understanding Children’s Visual Attention to Personal Protective Equipment Using Eye-Tracking
Authors: Vanessa Cho, Janet Hsiao, Nigel King, Robert Anthonappa
Abstract:
Background: The personal protective equipment (PPE) requirements for health care workers (HCWs) have changed significantly during the COVID-19 pandemic. Aim: To ascertain, using eye-tracking technology, what children notice the most when seeing HCWs in various PPE. Design: A Tobii Nano Pro eye-tracking camera tracked 156 children's visual attention while they viewed photographs of HCWs in various PPE. Eye Movement analysis with Hidden Markov Models (EMHMM) was employed to analyse 624 recordings using two approaches, namely (i) data-driven, where children's fixations determined the regions of interest (ROIs), and (ii) fixed ROIs, where the investigators predefined the ROIs. Results: Two significant eye movement patterns, namely distributed (85.2%) and selective (14.7%), were identified (P<0.05). Most children fixated primarily on the face regardless of the different PPE. Children fixated equally on all PPE images in the distributed pattern, while a strong preference for unmasked faces was evident in the selective pattern (P<0.01). Conclusion: Children as young as 2.5 years used a top-down visual search behaviour and demonstrated their face processing ability. Most children did not show a strong visual preference for a specific PPE, while a minority preferred PPE with distinct facial features, namely without masks and loupes.
Keywords: COVID-19, PPE, dentistry, pediatric
Procedia PDF Downloads 90
2581 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts
Authors: William Michael Short
Abstract:
Semantic Web’ technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous healthrelated resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues within the perspective of cognitive linguistics and cognitive anthropology that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable. But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge and so may not match users’ common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to help capture variations within technical vocabularies – rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis – they fail to capture this dimension of linguistic usage. 
The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.
Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics
Procedia PDF Downloads 132
2580 Wage Differentiation Patterns of Households Revisited for Turkey in Same Industry Employment: A Pseudo-Panel Approach
Authors: Yasin Kutuk, Bengi Yanik Ilhan
Abstract:
Previous studies investigate wage differentiation among regions in Turkey between couples who work in the same industry and those who work in different industries by using models that are appropriate for cross-sectional data. However, since no panel data are available for this investigation in Turkey, pseudo-panels using the repeated cross-section data sets of the Household Labor Force Surveys 2004-2014 are employed in order to open a new way to examine wage differentiation patterns. For this purpose, household heads are separated into groups with respect to their household composition. Group membership, defined by characteristics such as age group, education, gender, and NUTS1 level (12 regions), is assumed to be fixed over time. The average behavior of these groups can then be tracked over time, just as in panel data. Estimates using the pseudo-panel data would be consistent with estimates using genuine panel data on individuals if the samples are representative of a population with fixed composition and characteristics. Controlling for socioeconomic factors, the wage differentiation of household income is affected by the social, cultural, and economic changes that followed the global economic crisis that emerged in the US. It is also revealed whether wage differentiation is changing across birth cohorts.
Keywords: wage income, same industry, pseudo panel, panel data econometrics
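The pseudo-panel construction can be summarised as collapsing repeated cross-sections into cohort-cell means that are then analysed like panel observations; the sketch below illustrates this with simulated micro data (the grouping variables and the simple regression are assumptions for illustration, not the study's specification).

```python
import numpy as np
import pandas as pd

# Simulated repeated cross-sections collapsed into cohort cells
# (birth cohort x region x survey year); cell means act as panel observations.
rng = np.random.default_rng(7)
n = 20000
micro = pd.DataFrame({
    "year":          rng.choice(np.arange(2004, 2015), n),
    "birth_coh":     rng.choice(["1950s", "1960s", "1970s", "1980s"], n),
    "region":        rng.choice([f"TR{i}" for i in range(1, 13)], n),   # NUTS1
    "same_industry": rng.integers(0, 2, n),
    "log_wage":      rng.normal(7.5, 0.6, n),
})

cells = (micro.groupby(["birth_coh", "region", "year"])
              .agg(log_wage=("log_wage", "mean"),
                   share_same_ind=("same_industry", "mean"),
                   cell_size=("log_wage", "size"))
              .reset_index())

# Cell means can now be regressed like panel data, e.g. mean log wage on the
# within-cell share of same-industry couples (illustrative specification only).
X = np.column_stack([np.ones(len(cells)), cells["share_same_ind"]])
beta, *_ = np.linalg.lstsq(X, cells["log_wage"], rcond=None)
print("number of cells:", len(cells), "| same-industry coefficient:", round(beta[1], 3))
```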
Procedia PDF Downloads 397
2579 Orthodontic Treatment Using CAD/CAM System
Authors: Cristiane C. B. Alves, Livia Eisler, Gustavo Mota, Kurt Faltin Jr., Cristina L. F. Ortolani
Abstract:
The correct positioning of the brackets is essential for the success of orthodontic treatment. The indirect bracket placement technique has the main objective of eliminating the positioning errors that commonly occur in the direct bracket placement technique. The objective of this study is to demonstrate that the exact positioning of the brackets is of extreme relevance for the success of the treatment. The present work shows a case report of an adult female patient who attended the clinic complaining of having been in orthodontic treatment for more than 5 years without noticing any progress. As a result of the intra-oral clinical examination and documentation analysis, a class III malocclusion, an anterior open bite, and the absence of all third molars and of the first upper and lower premolars bilaterally were observed. For the treatment, the indirect bonding technique with self-ligating ceramic braces was applied. The preparation of the trays was done after intraoral digital scanning and the printing of models with a 3D printer. Brackets were positioned virtually, using specialized software. After twelve months of treatment, correction of the malocclusion was observed, as well as the closing of the anterior open bite. It is concluded that the adequate and precise positioning of brackets is necessary for a successful treatment.
Keywords: anterior open-bite, CAD/CAM, orthodontics, malocclusion, angle class III
Procedia PDF Downloads 194
2578 A Model for Solid Transportation Problem with Three Hierarchical Objectives under Uncertain Environment
Authors: Wajahat Ali, Shakeel Javaid
Abstract:
In this study, we have developed a mathematical programming model for a solid transportation problem with three objective functions arranged in hierarchical order. The mathematical programming models with more than one objective function to be solved in hierarchical order is termed as a multi-level programming model. Our study explores a Multi-Level Solid Transportation Problem with Uncertain Parameters (MLSTPWU). The proposed MLSTPWU model consists of three objective functions, viz. minimization of transportation cost, minimization of total transportation time, and minimization of deterioration during transportation. These three objective functions are supposed to be solved by decision-makers at three consecutive levels. Three constraint functions are added to the model, restricting the total availability, total demand, and capacity of modes of transportation. All the parameters involved in the model are assumed to be uncertain in nature. A solution method based on fuzzy logic is also discussed to obtain the compromise solution for the proposed model. Further, a simulated numerical example is discussed to establish the efficiency and applicability of the proposed model.Keywords: solid transportation problem, multi-level programming, uncertain variable, uncertain environment
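The hierarchical (level-by-level) idea can be illustrated with a small deterministic solid transportation problem: minimise cost first, then minimise time subject to keeping the cost at its optimum. The sketch below does this with SciPy's linear programming routine; the deterioration objective and the uncertain/fuzzy treatment of parameters are omitted, and all data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Tiny solid transportation problem: 2 sources x 2 destinations x 2 conveyances,
# variables x[i,j,k] flattened as i*4 + j*2 + k. Lexicographic treatment of the
# first two objectives only; all coefficients are illustrative.
cost = np.array([4, 6, 5, 7, 3, 8, 6, 5], dtype=float)
time = np.array([2, 1, 3, 2, 4, 1, 2, 3], dtype=float)
supply, demand, capacity = [60, 50], [70, 40], [80, 60]

def aggregate(axis, value):
    """Row summing all x[i,j,k] whose index on `axis` (0=i,1=j,2=k) equals `value`."""
    row = np.zeros(8)
    for i in range(2):
        for j in range(2):
            for k in range(2):
                if (i, j, k)[axis] == value:
                    row[i * 4 + j * 2 + k] = 1.0
    return row

A_eq = np.array([aggregate(0, 0), aggregate(0, 1),          # supply rows
                 aggregate(1, 0), aggregate(1, 1)])          # demand rows
b_eq = supply + demand
A_ub = np.array([aggregate(2, 0), aggregate(2, 1)])          # conveyance capacities
b_ub = capacity

level1 = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
# Level 2: minimise time without worsening the level-1 cost (small tolerance added)
level2 = linprog(time, A_ub=np.vstack([A_ub, cost]), b_ub=b_ub + [level1.fun + 1e-6],
                 A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("min cost:", round(level1.fun, 2), "| time at that cost:", round(time @ level2.x, 2))
```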
Procedia PDF Downloads 83
2577 Wear Measuring and Wear Modelling Based On Archard, ASTM, and Neural Network Models
Authors: A. Shebani, C. Pislaru
Abstract:
Wear of materials is an everyday experience and has been observed and studied for a long time. The prediction of wear is a fundamental problem in the industrial field, mainly correlated with the planning of maintenance interventions and with economics. The pin-on-disc test is the most common test used to study wear behaviour. In this paper, a pin-on-disc rig (AEROTECH UNIDEX 11) is used to investigate the effects of normal load and material hardness on wear under dry sliding conditions. In the pin-on-disc rig, two specimens were used: a steel pin with a tip, positioned perpendicular to the disc, and a disc made of aluminium. The pin wear and disc wear were measured using the following instruments: the Talysurf profilometer, a digital microscope, and the Alicona instrument; the Talysurf profilometer was used to measure the pin/disc wear scar depth, and the Alicona was used to measure the volume loss of the pin and disc. After that, the Archard model, the American Society for Testing and Materials (ASTM) model, and a neural network model were used for pin/disc wear modelling, and the simulation results were implemented using the MATLAB program. This paper focuses on how the Alicona can be considered a powerful tool for wear measurements and how the neural network is an effective algorithm for wear estimation.
Keywords: wear modelling, Archard model, ASTM model, neural network model, pin-on-disc test, Talysurf, digital microscope, Alicona
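Of the three wear models compared, the Archard law is the simplest, V = K*W*s/H; the sketch below evaluates it with illustrative parameter values (the wear coefficient, load, sliding distance and hardness are not those of the steel pin / aluminium disc tests).

```python
# Archard wear law: V = K * W * s / H, where V is the wear volume, K a
# dimensionless wear coefficient, W the normal load, s the sliding distance
# and H the hardness of the softer surface. All parameter values are illustrative.

def archard_wear_volume(K, load_N, sliding_distance_m, hardness_Pa):
    """Return the predicted wear volume in m^3."""
    return K * load_N * sliding_distance_m / hardness_Pa

V = archard_wear_volume(K=1e-4, load_N=20.0, sliding_distance_m=500.0,
                        hardness_Pa=0.3e9)   # ~0.3 GPa, roughly an aluminium alloy
print(f"predicted wear volume: {V * 1e9:.2f} mm^3")
```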
Procedia PDF Downloads 456
2576 Financial Information and Collective Bargaining: Conflicting or Complementing
Authors: Humayun Murshed, Shibly Abdullah
Abstract:
The research conducted in the early seventies apparently assumed the existence of a universal decision model for union negotiators and, furthermore, tended to regard financial information as a ‘neutral’ input into a rational decision-making process. However, research in the eighties began to question the neutrality of financial information as an input in collective bargaining, viewing it instead as a potentially effective means of controlling the labour force. Furthermore, this later research also started challenging the simplistic assumptions, relating particularly to union objectives, which had underpinned the earlier search for universal union decision models. Despite the above developments, there seems to be a dearth of studies in developing countries concerning the use of financial information in collective bargaining. This paper seeks to begin to remedy this deficiency. Utilising a case study approach based on two enterprises, one in the public sector and the other a multinational, the universal decision model is rejected, and it is argued that the decision whether or not to use financial information is a contingent one, such a contingency being largely defined by the context and environment in which both union and management negotiators work. An attempt is also made to identify the factors constraining as well as promoting the use of financial information in collective bargaining, these being regarded as unique to the organizations within which the case studies are conducted.
Keywords: collective bargaining, developing countries, disclosures, financial information
Procedia PDF Downloads 471