Search results for: code mixing
576 A Simple Olfactometer for Odour and Lateralization Thresholds of Chemical Vapours
Authors: Lena Ernstgård, Aishwarya M. Dwivedi, Johan Lundström, Gunnar Johanson
Abstract:
A simple, inexpensive olfactometer was constructed to enable valid measurement of detection thresholds for low concentrations of chemical vapours. The delivery system consists of seven syringe pumps, each connected to a Tedlar bag containing a predefined concentration of the test chemical in air. The seven pumps are connected to an 8-way mixing valve, which in turn connects to a birhinal nose piece. Chemical vapour of known concentration is generated by injecting an appropriate amount of the test chemical into a Tedlar bag with a known volume of clean air. Complete vaporization is ensured by gently heating the bag from the outside with a heat flow. The six test concentrations are obtained by adding different volumes from the starting bag to six new Tedlar bags with known volumes of clean air. One bag contains clean air only. Thus, six different test concentrations and clean air can easily be tested in series by shifting the valve to new positions. Initial in-line measurement with a photoionization detector showed that the delivery system responded quickly to a shift in valve position: 90% of the desired concentration was reached within 15 seconds. The concentrations in the bags are verified daily by gas chromatography, and the stability of the system in terms of chemical concentration is monitored in real time by means of a photoionization detector. To determine lateralization thresholds, an additional pump supplying clean air is added to the delivery system so that the nostrils can be separately and interchangeably exposed to clean air and the test chemical. Odour and lateralization thresholds were determined for three aldehydes (acrolein, crotonaldehyde, and hexanal) in 20 healthy naïve individuals. Aldehydes generally have a strong odour, and the selected aldehydes are also considered irritating to mucous membranes. The median odour thresholds of the three aldehydes were 0.017, 0.0008, and 0.097 ppm, respectively. No lateralization threshold could be identified for acrolein, whereas the medians for crotonaldehyde and hexanal were 0.003 and 0.39 ppm, respectively. In conclusion, we constructed a simple, inexpensive olfactometer that allows for stable and easily measurable concentrations of the test chemical vapour. Our test with aldehydes demonstrates that the system yields valid odour and lateralization thresholds among volunteers.
Keywords: irritation, odour delivery, olfactometer, smell
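The serial-dilution scheme described above lends itself to a quick check of the target bag concentrations. The snippet below is only an illustrative Python sketch, not code or data from the study; the stock concentration and transfer volumes are assumed values chosen to show the arithmetic (each test bag's concentration equals the stock concentration scaled by the transferred volume over the total bag volume).

```python
# Illustrative dilution arithmetic for a Tedlar-bag series (assumed numbers, not study data).
def bag_concentration(c_stock_ppm, v_stock_l, v_clean_air_l):
    """Concentration after mixing v_stock_l of stock vapour into v_clean_air_l of clean air."""
    return c_stock_ppm * v_stock_l / (v_stock_l + v_clean_air_l)

c_stock = 10.0  # ppm, hypothetical starting-bag concentration
transfers = [(0.05, 9.95), (0.1, 9.9), (0.5, 9.5), (1.0, 9.0), (2.0, 8.0), (5.0, 5.0)]  # litres

for v_stock, v_air in transfers:
    c = bag_concentration(c_stock, v_stock, v_air)
    print(f"{v_stock:>5} L stock + {v_air:>5} L clean air -> {c:.3f} ppm")
```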
575 Criminal Responsibility of Minors in Russia: The Age of Liability and Penalties
Authors: Natalia Selezneva
Abstract:
The level of crime depends on a number of factors, such as political and economic instability, social inequality and ineffective legislation. Juvenile delinquency occupies a special place in the overall level of crime. The United Nations developed the Standard Minimum Rules for the Administration of Juvenile Justice (the Beijing Rules) in order to ensure the rights of juvenile offenders under the various legal systems. Most countries support these recommendations, and Russia is no exception. Russia's criminal code establishes the minimum age of criminal liability, the types of crimes for which minors can be brought to justice, the punishments, and the sentencing and execution of punishment for minors. However, these provisions cause heated debates in the scientific literature. The high level of juvenile crime indicates the ineffectiveness of the legal regulation of the criminal liability of minors. Ensuring compliance with international standards requires new and modern approaches to improving national legislation and the practice of its application. This goal will be achieved through the following tasks: 1. Creating sub-branches of law regulating the legal status of minors; 2. Improving the types of penalties; 3. Making alternative measures available; 4. Introducing a procedure for the extrajudicial settlement of conflicts. The criminal law of each country depends on its historical, national and cultural characteristics. The development of Russian legislation taking international experience into account is essential and will be a new stage in the formation of a legal state, especially in the sphere of protection of the rights of juvenile offenders.
Keywords: criminal law, juvenile offender, punishment, the age of criminal responsibility
574 Using ICESat-2 Dynamic Ocean Topography to Estimate Western Arctic Freshwater Content
Authors: Joshua Adan Valdez, Shawn Gallaher
Abstract:
Global climate change has impacted atmospheric temperatures, contributing to rising sea levels, decreasing sea ice, and increased freshening of high-latitude oceans. This freshening has contributed to increased stratification, inhibiting local mixing and nutrient transport and modifying regional circulations in polar oceans. In recent years, the Western Arctic has seen an increase in freshwater volume at an average rate of 397 ± 116 km³/year across the Beaufort Gyre. The majority of the freshwater volume resides in the Beaufort Gyre surface lens, driven by anticyclonic wind forcing, sea ice melt, and Arctic river runoff, and is typically defined as water fresher than a salinity of 34.8. The near-isothermal nature of Arctic seawater and non-linearities in the equation of state for near-freezing waters result in a salinity-driven pycnocline, as opposed to the temperature-driven density structure seen at lower latitudes. In this study, we investigate the relationship between freshwater content and dynamic ocean topography (DOT). In situ measurements of freshwater content are useful in providing information on the freshening rate of the Beaufort Gyre; however, their collection is costly and time-consuming. Utilizing NASA’s ICESat-2 DOT remote sensing capabilities and Air Expendable CTD (AXCTD) data from the Seasonal Ice Zone Reconnaissance Surveys (SIZRS), a linear regression model between DOT and freshwater content is determined along the 150° west meridian. Freshwater content is calculated by integrating the volume of water between the surface and the depth of the 34.8 reference salinity. Using this model, we compare interannual variability in freshwater content within the gyre, which could provide a future predictive capability for freshwater volume changes in the Beaufort-Chukchi Sea using non-in situ methods. Successful employment of the ICESat-2 DOT approximation of freshwater content could demonstrate the value of remote sensing tools in reducing reliance on field deployment platforms to characterize physical ocean properties.
Keywords: cryosphere, remote sensing, Arctic oceanography, climate modeling, Ekman transport
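For reference, the freshwater-content integral described above is commonly written as follows; this is the standard oceanographic definition rather than an equation quoted from the paper, with S_ref = 34.8 and z(S_ref) the depth at which that reference salinity is reached:

```latex
\mathrm{FWC} \;=\; \int_{z(S_{\mathrm{ref}})}^{0} \frac{S_{\mathrm{ref}} - S(z)}{S_{\mathrm{ref}}}\, dz ,
\qquad S_{\mathrm{ref}} = 34.8
```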
573 Security Analysis of Mod. S Transponder Technology and Attack Examples
Authors: M. Rutkowski, J. Cwiklak, M. Grzegorzewski, M. Adamski
Abstract:
All Class A airplanes have to be equipped with a Mod. S transponder for ATC surveillance purposes. This technology was designed to provide a robust and dependable solution to localize, identify and exchange data with the airplane. The purpose of this paper is to analyze potential hazards that result from the lack of any security or encryption at the design level. Secondary Surveillance Radars (SSR) rely on an active response from the airplane. An SSR installation broadcasts a directional interrogation signal to the planes in range on the 1030 MHz frequency with DPSK modulation. If the interrogation is correctly received by the transponder located on the plane, an answer is sent on 1090 MHz with PPM modulation containing the plane’s SQUAWK, barometric altitude, GPS coordinates and 24-bit unique address code. This technology does not use any kind of encryption, and these specifications can easily be found on the internet. Since there is no encryption or security measure to ensure the credibility of the sender and the message, it is highly hazardous to rely on such technology to ensure the safety of air traffic. The only thing that identifies the airplane is the 24-bit unique address, and most planes have already been sniffed by aviation enthusiasts and cataloged in web databases. At the moment of writing this article, PoFung Technologies has announced plans to release an all-band SDR transceiver; such a device would be more than enough to build one's own Mod. S transponder. With a fake transponder, a potential terrorist can identify as a different airplane. By replacing the transponder in a poorly controlled airspace, hijackers can enter another airspace identifying themselves as another plane and land in the desired area.
Keywords: flight safety, hijack, mod S transponder, security analysis
572 Evaluation of Immune Checkpoint Inhibitors in Cancer Therapy
Authors: Mir Mohammad Reza Hosseini
Abstract:
In recent years, immune checkpoint inhibitors have gained attention as one of the most promising forms of immunotherapy on the horizon. There has been a particular emphasis on the immune checkpoint molecules cytotoxic T-lymphocyte antigen-4 (CTLA-4) and programmed cell death protein 1 (PD-1). In 2011, ipilimumab, the first antibody blocking an immune checkpoint (CTLA-4), was approved. It is now recognized that established tumors have many mechanisms of suppressing the antitumor immune response, including the production of suppressive cytokines, the recruitment of immunosuppressive immune cells, and the upregulation of coinhibitory receptors known as immune checkpoints. This was quickly followed by the development of monoclonal antibodies targeting PD-1 (pembrolizumab and nivolumab) and PD-L1 (atezolizumab and durvalumab). Anti-PD-1/PD-L1 antibodies have become some of the most widely used anticancer therapies. We also compare and contrast their current place in cancer therapy and the patterns of immune-related toxicities, and discuss the role of dual immune checkpoint inhibition and strategies for the management of immune-related adverse events. In this review, the working principles and current development of numerous immune checkpoint inhibitors are summarized, while the underlying mechanisms and recent progress of immune checkpoint inhibitor-based synergistic therapies with other immunotherapies, chemotherapy, phototherapy, and radiotherapy in preclinical and clinical studies over the past 5 years are described and highlighted. Lastly, we critically evaluate these approaches and attempt to identify their strengths and weaknesses based on preclinical and clinical data.
Keywords: checkpoint, cancer therapy, PD-1, PDL-1, CTLA4, immunosuppressive
571 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse-Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method. The simulations confirmed that the proposed method provides precise estimates of the regression parameters. It is important to note that this approach can be applied to datasets if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses Fisher scoring, based on the Fisher information, as an iterative method to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
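To make the Fisher-scoring step concrete, here is a minimal Python sketch of the iteration for a simple log-link regression with exponentially distributed responses. It only illustrates the scoring idea; it is not the authors' Mathematica code or their composite-distribution likelihood, and the simulated data and model are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: E[y | x] = exp(b0 + b1*x), y ~ Exponential (illustration only).
n = 500
X = np.column_stack([np.ones(n), rng.uniform(0.0, 2.0, n)])
beta_true = np.array([0.5, 1.2])
y = rng.exponential(scale=np.exp(X @ beta_true))

# Start from an ordinary least-squares fit of log(y) on X, then apply Fisher scoring:
# score = X'(y * exp(-X b) - 1), expected information = X'X for this model.
beta = np.linalg.lstsq(X, np.log(y), rcond=None)[0]
for _ in range(25):
    score = X.T @ (y * np.exp(-(X @ beta)) - 1.0)
    step = np.linalg.solve(X.T @ X, score)
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

print("Fisher-scoring MLE:", beta, "(true values:", beta_true, ")")
```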
570 Thermo-Mechanical Behavior of Steel-Wood Connections of Wooden Structures Under the Effect of a Fire
Authors: Ahmed Alagha, Belkacem Lamri, Abdelhak Kada.
Abstract:
Steel-wood assemblies often have complex geometric configurations whose overall behavior under fire is conditioned by the thermal response of the combination of the two materials, steel and wood, whose thermal characteristics are greatly influenced by high temperatures. The objective of this work is to study the thermal behavior of a steel-wood connection, with or without insulating material, subjected to the ISO 834 standard fire model. The analysis is developed analytically using the Eurocode and numerically, by the finite element method, through the ANSYS calculation code. The design of the connections is evaluated at room temperature for the cases of single shear and double shear. The thermal behavior of the connections is simulated in the transient state, taking into account heat transfer by convection and by radiation. The variation of temperature as a function of time is evaluated at different positions in the connections, accounting for the heat produced and the formation of the char layer. The results concern the temperature distributions in the connection elements as a function of the duration of the fire. The thermal analysis shows that the temperature increases rapidly and exceeds 260 °C in the steel after one hour of exposure to fire. The temperature development in the wood differs from that in steel because of its thermal properties: wood heats up on the outside and burns, and its surface can reach very high temperatures at certain points.
Keywords: Eurocode 5, finite elements, ISO834, simple shear, thermal behaviour, wood-steel connection
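The ISO 834 standard fire referenced above prescribes a gas-temperature curve that is straightforward to evaluate. The sketch below implements that standard curve in Python, assuming a 20 °C ambient starting temperature; it is generic reference code, not output from the ANSYS model described in the abstract.

```python
import math

def iso834_gas_temperature(t_minutes, t_ambient=20.0):
    """Standard ISO 834 fire curve: T = T0 + 345*log10(8t + 1), t in minutes, T in deg C."""
    return t_ambient + 345.0 * math.log10(8.0 * t_minutes + 1.0)

for t in (5, 15, 30, 60, 90, 120):
    print(f"t = {t:3d} min -> gas temperature ~ {iso834_gas_temperature(t):6.0f} deg C")
```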
569 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts
Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti
Abstract:
Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-Code file. An Adaptive Mesh Refinement (AMR) strategy based on the octree is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization
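As a point of reference for the failure criterion mentioned above, the sketch below evaluates the plane-stress Tsai-Wu failure index in its textbook form. It is a generic illustration, not the authors' implementation: the strength values are placeholders, and the usual approximation F12 ≈ -0.5*sqrt(F11*F22) is assumed.

```python
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu index; failure is predicted when the index reaches 1."""
    F1, F2 = 1.0/Xt - 1.0/Xc, 1.0/Yt - 1.0/Yc
    F11, F22, F66 = 1.0/(Xt*Xc), 1.0/(Yt*Yc), 1.0/S**2
    F12 = -0.5 * math.sqrt(F11 * F22)   # common interaction-term approximation
    return F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2 + F66*t12**2 + 2.0*F12*s1*s2

# Placeholder strengths (MPa) for a printed orthotropic material and a trial stress state.
print(tsai_wu_index(s1=30.0, s2=10.0, t12=5.0, Xt=45.0, Xc=40.0, Yt=30.0, Yc=35.0, S=25.0))
```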
568 Optimization of Fused Deposition Modeling 3D Printing Process via Preprocess Calibration Routine Using Low-Cost Thermal Sensing
Authors: Raz Flieshman, Adam Michael Altenbuchner, Jörg Krüger
Abstract:
This paper presents an approach to optimizing the Fused Deposition Modeling (FDM) 3D printing process through a preprocess calibration routine of printing parameters. The core of this method involves the use of a low-cost thermal sensor capable of measuring temperatures within the range of -20 to 500 degrees Celsius for detailed process observation. The calibration process is conducted by printing a predetermined path while varying the process parameters through machine instructions (g-code). This enables the extraction of critical thermal, dimensional, and surface properties along the printed path. The calibration routine utilizes computer vision models to extract features and metrics from the thermal images, including temperature distribution, layer adhesion quality, surface roughness, and dimensional accuracy and consistency. These extracted properties are then analyzed to optimize the process parameters to achieve the desired qualities of the printed material. A significant benefit of this calibration method is its potential to create printing parameter profiles for new polymer and composite materials, thereby enhancing the versatility and application range of FDM 3D printing. The proposed method demonstrates significant potential in enhancing the precision and reliability of FDM 3D printing, making it a valuable contribution to the field of additive manufacturing.
Keywords: FDM 3D printing, preprocess calibration, thermal sensor, process optimization, additive manufacturing, computer vision, material profiles
567 Impact of Audit Committee on Real Earnings Management: Cases of Netherlands
Authors: Sana Masmoudi Mardassi, Yosra Makni Fourati
Abstract:
Regulators highlight the importance of the Audit Committee (AC) as a key internal corporate governance mechanism. One of the most important roles of this committee is to oversee the financial reporting process. The purpose of this paper is to examine the link between the characteristics of an audit committee and financial reporting quality by investigating whether audit committee characteristics are associated with improved financial reporting quality, especially Real Earnings Management. In the current study, panel data from 80 nonfinancial companies listed on the Amsterdam Stock Exchange during the period between 2010 and 2017 were used. To measure audit committee characteristics, four proxies have been used: audit committee independence, financial expertise, gender diversity and AC meetings. For this research, a linear regression model was used to identify the influence of this set of audit committee characteristics on real earnings management after controlling for audit committee size, leverage, firm size, loss, growth and board size. This research provides empirical evidence of the association between audit committee independence, financial expertise, gender diversity and meetings and Real Earnings Management (REM) as a proxy of financial reporting quality. The study finds that independence and AC gender diversity are strongly related to financial reporting quality; in fact, these two characteristics constrain REM. The results also suggest that AC financial expertise reduces, to some extent, the likelihood of engaging in REM. These conclusions provide support for the audit committee requirements under the Dutch Corporate Governance Code rules regarding gender diversity and AC meetings.
Keywords: audit committee, financial expertise, independence, real earnings management
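A hedged sketch of the kind of specification the abstract describes is given below; the variable names and functional form are assumptions for illustration (REM regressed on the four audit-committee proxies plus the stated controls), not the authors' exact model.

```latex
\mathrm{REM}_{it} = \beta_0 + \beta_1\,\mathrm{ACIND}_{it} + \beta_2\,\mathrm{ACEXP}_{it}
 + \beta_3\,\mathrm{ACGEN}_{it} + \beta_4\,\mathrm{ACMEET}_{it}
 + \gamma^{\top}\mathrm{Controls}_{it} + \varepsilon_{it}
```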
566 Earthquake Hazards in Manipur: Causal Factors and Remedial Measures
Authors: Kangujam Monika, Kiranbala Devi Thokchom, Soibam Sandhyarani Devi
Abstract:
Earthquakes are a major natural hazard in India. Manipur, located in the North-Eastern Region of India, is one of the most affected locations in this earthquake-prone region, since it lies in an area where the Indian and Eurasian tectonic plates meet and falls in seismic Zone V, the most severe intensity zone according to the IS Code. Some recent earthquakes recorded in Manipur are an M 6.7 event with epicenter at Tamenglong (January 4, 2016), an M 5.2 event at Churachandpur (February 24, 2017) and, most recently, an M 4.4 event at Thoubal (June 19, 2017). In these recent earthquakes, some houses and buildings were damaged and landslides also occurred. A field study was carried out, and an overview of the various causal factors involved in the triggering of earthquake damage in Manipur is discussed. It is found that improper planning, poor design, negligence, structural irregularities, poor quality materials, construction of foundations without proper site soil investigation and non-implementation of remedial measures, etc., are possibly the main causal factors for damage in Manipur during earthquakes. The study also suggests that proper design of the structure and foundation along with soil investigation, ground improvement methods, the use of modern construction techniques, consultation with engineers, mass awareness, etc., might be effective solutions to control the hazard in many locations. An overview of the analysis pertaining to earthquakes in Manipur, together with the ongoing detailed site-specific geotechnical investigation, is presented.
Keywords: Manipur, earthquake, hazard, structure, soil
565 Numerical Investigation of Indoor Environmental Quality in a Room Heated with Impinging Jet Ventilation
Authors: Mathias Cehlin, Arman Ameen, Ulf Larsson, Taghi Karimipanah
Abstract:
The indoor environmental quality (IEQ) is increasingly recognized as a significant factor influencing the overall level of building occupants’ health, comfort and productivity. An air-conditioning and ventilation system is normally used to create and maintain good thermal comfort and indoor air quality. Providing occupant thermal comfort and well-being with minimized use of energy is the main purpose of a heating, ventilating and air conditioning system. Among the different types of ventilation systems, the most widely known and used are mixing ventilation (MV) and displacement ventilation (DV). Impinging jet ventilation (IJV) is a promising ventilation strategy developed in the beginning of the 2000s. IJV has the advantage of supplying air downwards close to the floor with high momentum, thereby delivering fresh air further out into the room compared to DV. Operating in cooling mode, IJV systems can have higher ventilation effectiveness and heat removal effectiveness compared to MV, and therefore a higher energy efficiency. However, how does IJV perform when operating in heating mode? This paper presents the function of IJV in a typical office room under winter conditions (heating mode). A validated CFD model, which uses the v2-f turbulence model, is used for the prediction of the air flow pattern, thermal comfort and air change effectiveness. The office room under consideration has the dimensions 4.2 × 3.6 × 2.5 m and can be designed as a single-person or two-person office. A number of important factors influencing the indoor environment in the room with IJV are studied. The considered parameters are: heating demand, number of occupants and supplied air conditions. A total of 6 simulation cases are carried out to investigate the effects of the considered parameters. The heat load in the room is contributed by occupants, a computer and lighting. The model consists of one external wall including a window. The interaction effects of the heat sources, the supply air flow and the down draught from the window result in a complex flow phenomenon. Preliminary results indicate that IJV can be used for heating a typical office room. The IEQ appears to be suitable in the occupied region for the studied cases.
Keywords: computational fluid dynamics, impinging jet ventilation, indoor environmental quality, ventilation strategy
564 Biological Regulation of Endogenous Enzymatic Activity of Rainbow Trout (Oncorhynchus Mykiss) with Protease Inhibitors Chickpea in Model Systems
Authors: Delgado-Meza M., Minor-Pérez H.
Abstract:
Protease is the generic name for enzymes that hydrolyze proteins. These are classified in the subgroup EC 3.4.11-99X of the enzyme classification. In food technology, proteolysis is used to modify the functional and nutritional properties of food, and in some cases this proteolysis may cause food spoilage. In general, seafood such as rainbow trout undergoes an accelerated decomposition process once captured, due to various factors such as endogenous enzymatic activity, which can result in loss of structure, shape and firmness, as well as the release of amino acids that are precursors of biogenic amines. Some studies suggest the use of protease inhibitors from legumes as biological regulators of proteolytic activity. An enzyme inhibitor is any substance that reduces the rate of a reaction catalyzed by an enzyme. The objective of this study was to evaluate the reduction of the proteolytic activity of enzymes in rainbow trout extracts with protease inhibitors obtained from chickpea flour. Different proportions of rainbow trout enzyme extract (75%, 50% and 25%) and chickpea enzyme inhibitor extract were evaluated. Chickpea inhibitors were obtained by mixing 5 g of flour in 30 mL of pH 7.0 phosphate buffer. The sample was centrifuged at 8000 rpm for 10 min and the supernatant was stored at -15 °C. Likewise, 20 g of rainbow trout were ground in 20 mL of phosphate buffer solution at pH 7.0 and the mixture was centrifuged at 5000 rpm for 20 min; the supernatant was used for the study. In each treatment, the specific enzymatic activity was determined with the Kunitz technique, using hemoglobin as the substrate for the acid enzyme fraction and casein for the basic enzymes. Biuret protein was also quantified for each treatment. The results for the basic enzyme fraction showed inhibition of endogenous enzymatic activity in the treatments evaluated. Inhibition values compared to the control were 51.05%, 56.59% and 59.29% when the proportions of endogenous enzyme extract from rainbow trout were 75%, 50% and 25%, respectively, and the remaining volume was the extract with inhibitors. Treatments with acid enzymes showed no reduction in enzyme activity. In conclusion, chickpea flour reduced the endogenous enzymatic activity of rainbow trout, which may favor its application to increase the shelf life of this food. The authors acknowledge the funding provided by CONACYT for project 131998.
Keywords: rainbow trout, enzyme inhibitors, proteolysis, enzyme activity
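The inhibition percentages reported above follow the usual activity-relative-to-control calculation. The snippet below is a hedged Python illustration of that arithmetic with made-up activity readings, not the study's data.

```python
def percent_inhibition(activity_control, activity_with_inhibitor):
    """Inhibition (%) relative to the uninhibited control."""
    return 100.0 * (activity_control - activity_with_inhibitor) / activity_control

# Hypothetical specific-activity readings (arbitrary units) for three extract ratios.
control = 0.80
treatments = {"75% trout extract": 0.39, "50% trout extract": 0.35, "25% trout extract": 0.33}
for label, activity in treatments.items():
    print(f"{label}: {percent_inhibition(control, activity):.1f}% inhibition")
```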
563 Design, Construction and Evaluation of a Mechanical Vapor Compression Distillation System for Wastewater Treatment in a Poultry Company
Authors: Juan S. Vera, Miguel A. Gomez, Omar Gelvez
Abstract:
Water is Earth's most valuable resource, and the lack of it is currently a critical problem in today’s society. Non-treated wastewaters contribute to this situation, especially those coming from industrial activities, as they reduce the quality of the water bodies, annihilating all kinds of life and bringing disease to people in contact with them. An effective solution for this problem is distillation, which removes most contaminants. However, this approach must also be energetically efficient in order to appeal to industry. In this endeavour, most water distillation treatments fail, with the exception of the Mechanical Vapor Compression (MVC) distillation system, which has a high efficiency due to the energy input by a compressor and the latent heat exchange. This paper presents the process of design, construction, and evaluation of a Mechanical Vapor Compression (MVC) distillation system for the main Colombian poultry company, Avidesa Macpollo SA. The system will be located in the principal slaughterhouse in the state of Santander, and it will work along with the Gas Energy Mixing system (GEM) to treat the wastewaters from the plant. The main goal of the MVC distiller, rarely used in this type of application, is to reduce the chloride, Chemical Oxygen Demand (COD) and Biological Oxygen Demand (BOD) levels according to the state regulations, since the GEM cannot decrease them enough. The MVC distillation system consists of three components: the evaporator/condenser heat exchanger, where the distillation takes place; a low-pressure compressor, which provides the energy to create the temperature differential between the evaporator and condenser cavities; and a preheater to recover the remaining energy in the distillate. The model equations used to describe how the compressor power consumption, heat exchange area and distilled water production are related are based on a thermodynamic balance and heat transfer analysis, with correlations taken from the literature. Finally, the design calculations and the measurements of the installation are compared, showing agreement with the predictions in distillate production and power consumption as the temperature difference of the evaporator/condenser changes.
Keywords: mechanical vapor compression, distillation, wastewater, design, construction, evaluation
562 The Culture of Journal Writing among Manobo Senior High School Students
Authors: Jessevel Montes
Abstract:
This study explored the culture of journal writing among Senior High School Manobo students. The purpose of this qualitative morpho-semantic and syntactic study was to discover the morphological, semantic, and syntactic features of the students' written output through the morphological, semantic, and syntactic categories present in their journal writings. Beliefs and practices embedded in their norms, values, and ideologies were also identified. The study was conducted among Manobo students in the Senior High Schools of Central Mindanao, particularly in the Division of North Cotabato. Findings revealed that, morphologically, the features that flourished are the following: subject-verb concordance, tenses, pronouns, prepositions, articles, and the use of adjectives. Semantically, the features are the following: word choice, idiomatic expression, borrowing, and vernacular. Syntactically, the features are the types of sentences according to structure and function, and the dominance of code switching and run-on sentences. Lastly, as to the beliefs and practices embedded in the norms, values, and ideologies of their journal writing, the major themes are: valuing education, family and friends as treasure, preservation of culture, and emancipation from the bondage of poverty. This study has shed light on the writing capabilities and weaknesses of the Manobo students when it comes to the English language. Further, such an insight into language learning problems is useful to teachers because it provides information on common trouble spots in language learning, which can be used in the preparation of effective teaching materials.
Keywords: applied linguistics, culture, morpho-semantic and syntactic analysis, Manobo Senior High School, Philippines
561 Investigation of Turbulent Flow in a Bubble Column Photobioreactor and Consequent Effects on Microalgae Cultivation Using Computational Fluid Dynamic Simulation
Authors: Geetanjali Yadav, Arpit Mishra, Parthsarathi Ghosh, Ramkrishna Sen
Abstract:
The world is facing the problems of increasing global CO2 emissions, climate change and fuel crisis. Therefore, several renewable and sustainable energy alternatives should be investigated to replace non-renewable fuels in the future. Algae present a versatile feedstock for the production of a variety of fuels (biodiesel, bioethanol, bio-hydrogen, etc.) and high-value compounds for food, fodder, cosmetics and pharmaceuticals. Microalgae are simple microorganisms that require water, light, CO2 and nutrients for growth by the process of photosynthesis and can grow in extreme environments, utilizing waste gas (flue gas) and waste waters. Mixing, however, is a crucial parameter within the culture system for the uniform distribution of light, nutrients and gaseous exchange, in addition to preventing settling/sedimentation, the creation of dark zones, etc. The overarching goal of the present study is to improve photobioreactor (PBR) design for enhancing the dissolution of CO2 from ambient air (0.039%, v/v), pure CO2 and coal-fired flue gas (10 ± 2%) into microalgal PBRs. Computational fluid dynamics (CFD), a state-of-the-art technique, has been used to solve the partial differential equations with turbulence closure that represent the dynamics of the fluid in a photobioreactor. In this paper, the hydrodynamic performance of the PBR has been characterized and compared with that of a conventional bubble column PBR using CFD. Parameters such as flow rate (Q), mean velocity (u) and mean turbulent kinetic energy (TKE) were characterized for each experiment tested across different aeration schemes. The results showed that the modified PBR design had superior liquid circulation properties and gas-liquid transfer, which resulted in the creation of a uniform environment inside the PBR compared to the conventional bubble column PBR. The CFD technique has shown promise for successful design and paves the way for future research to develop PBRs that can be made commercially available for scaled-up microalgal production.
Keywords: computational fluid dynamics, microalgae, bubble column photobioreactor, flue gas, simulation
560 Modelling Interactions between Saturated and Unsaturated Zones by Hydrus 1D, Plain of Kairouan, Central Tunisia
Authors: Mariem Saadi, Sabri Kanzari, Adel Zghibi
Abstract:
In semi-arid areas like the Kairouan region, with constant irrigation with saline water and the overuse of groundwater resources, soil and aquifer salinization has become an increasing concern. In this study, a methodology was developed to evaluate the groundwater contamination risk based on the hydraulic properties of the unsaturated zone. Two soil profiles with different ranges of salinity, one located in the north of the plain and another in the south (each 30 m deep), both characterized by direct recharge of the aquifer, were chosen. Simulations were conducted with the Hydrus-1D code using measured precipitation data for the period 1998-2003 and calculated evapotranspiration for both chosen profiles. Four combinations of initial conditions of water content and salt concentration were used in the simulation process in order to find the best match between simulated and measured values. The successful calibration of Hydrus-1D allowed the investigation of several scenarios in order to assess the contamination risk under different natural conditions. The aquifer contamination risk is related to the natural conditions: it increased under climate change and temperature increase and decreased in the presence of a clay layer in the unsaturated zone. Hydrus-1D was a useful tool to predict the groundwater level and quality in the case of direct recharge and in the absence of any information related to the soil layers except for the texture.
Keywords: Hydrus-1D, Kairouan, salinization, semi-arid region, solute transport, unsaturated zone
559 The Formulation of the Mecelle and Other Codified Laws in the Ottoman Empire: Transformation Overturning the Sharia Principles
Authors: Tianqi Yin
Abstract:
The sharia had been the legislative basis in the Ottoman Empire since its emergence. The authority of sharia was superlative in Islamic society compared to the power of the sulta, the nominal ruler of the nation, regulating essentially every aspect of people's lives according to an ethical code. In modernity, however, as European sovereignty employed force to re-engineer the Islamic world to make it more like their own, a society ruled by a state, the Ottoman legislative system encountered the great challenge of adopting codified laws to replace sharia, with the formulation of the Mecelle being a prominent case. Interpretations of this transformation have been contentious, with the key debate revolving around whether these codified laws are authentic representations of sharia or alien legal formulations authorized by the modern nation-state under heavy European colonial influence. Because of the differences in methodology among the diverse theories, challenges to reaching a universal conclusion on this issue remain. This paper argues that the formulation of the Mecelle and other codified laws is a discontinuity of sharia due to European modernity's influence and that the emphasis on elements of Islamic law is a tactic employed to promote this process. These codified laws signal a complete social transformation from an Islamic society ruled by the sharia to a replication of European society ruled by the comprehensive ruling system of the modern state. In addition to advancing the discussion on the characterization of the codification movement in the Ottoman Empire in modernity, the research also contributes to determining the nature of the modern codification movement globally.
Keywords: codification, mecelle, modernity, sharia, ottoman empire
558 Timber Urbanism: Assessing the Carbon Footprint of Mass-Timber, Steel, and Concrete Structural Prototypes for Peri-Urban Densification in the Hudson Valley’s Urban Fringe
Authors: Eleni Stefania Kalapoda
Abstract:
The current fossil-fuel based urbanization pattern and the estimated human population growth are increasing the environmental footprint on our planet's precious resources. To mitigate the estimated skyrocketing in greenhouse gas emissions associated with the construction of new cities and infrastructure over the next 50 years, we need a radical rethink of our approach to construction to deliver a net-zero built environment. This paper assesses the carbon footprint of a mass-timber, a steel, and a concrete structural alternative for peri-urban densification in the Hudson Valley's urban fringe, along with examining the updated policy and building code adjustments that support synergies between timber construction in city making and the sustainable management of timber forests. By quantifying the carbon footprint of a structural prototype for four different material assemblies (a post-tensioned concrete, a mass timber, a composite steel, and a hybrid timber/steel/concrete assembly) applicable to the three updated building typologies of the IBC 2021 (Type IV-A, Type IV-B, Type IV-C), which range from nine- to eighteen-story alternatives, and by scaling up that structural prototype to the size of a neighborhood district, the paper presents a quantitative and a qualitative approach for a forest-based construction economy as well as a resilient and more just supply chain framework that ensures the well-being of both the forest and its inhabitants.
Keywords: mass-timber innovation, concrete structure, carbon footprint, densification
557 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy where price reduction is required. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffers store transmitted/received data. The designed SENT interface can send or receive digital data at 25~65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it occupies about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, verilog HDL
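To give a feel for how SENT carries data, the sketch below converts a frame's nibbles into per-pulse tick counts using the commonly cited encoding in which each nibble pulse lasts 12 + value ticks and the calibration/sync pulse lasts 56 ticks. This is a simplified, assumed illustration written in Python (the paper's design is in Verilog), and the CRC nibble here is only a placeholder rather than the SAE J2716 CRC-4.

```python
# Simplified, assumed illustration of SENT nibble-to-tick encoding (not the paper's Verilog design).
TICK_US = 3.0           # tick period used in the abstract
SYNC_TICKS = 56         # calibration/sync pulse length in ticks (commonly cited value)
NIBBLE_BASE_TICKS = 12  # a nibble value v is sent as a pulse of (12 + v) ticks

def frame_to_ticks(status, data_nibbles, crc):
    """Return the list of pulse lengths (in ticks) for one SENT frame."""
    nibbles = [status] + list(data_nibbles) + [crc]
    assert all(0 <= n <= 0xF for n in nibbles), "nibbles are 4-bit values"
    return [SYNC_TICKS] + [NIBBLE_BASE_TICKS + n for n in nibbles]

ticks = frame_to_ticks(status=0x0, data_nibbles=[0x3, 0xA, 0x7, 0x1, 0xF, 0x2], crc=0x5)  # CRC is a placeholder
print(ticks, "-> frame duration ~ %.1f us" % (sum(ticks) * TICK_US))
```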
556 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor
Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira
Abstract:
Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, the installation of a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000 including manpower. Nevertheless, a cost-effective approach involves leveraging an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device. The NT facility offers a slightly colder thermal flux and provides shielding for user protection. The key additional requirement involves designing detector shielding to mitigate the high gamma-ray background and safeguard the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation from nuclear fission. The aim is to achieve a focused prompt-gamma signal while shielding ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis
555 Performance and Specific Emissions of an SI Engine Using Anhydrous Ethanol–Gasoline Blends in the City of Bogota
Authors: Alexander García Mariaca, Rodrigo Morillo Castaño, Juan Rolón Ríos
Abstract:
The government of Colombia has promoted the use of biofuels over the last 20 years through laws and resolutions that regulate their use, with the objective of improving atmospheric air quality and promoting the Colombian agricultural industry. However, despite the use of blends of biofuels with fossil fuels, the air quality in large cities has not improved. This deterioration of the air is mainly caused by mobile sources working with spark ignition internal combustion engines (SI-ICE) operating with a blend, in volume, of 90% gasoline and 10% ethanol called E10, which in the case of Bogota represents 84% of the fleet. Another problem is that Colombia has big cities located above 2200 masl, and there are no accurate studies on the impact that the E10 mixture could have on the emissions and performance of SI-ICE. This study aims to establish the optimal blend of gasoline and ethanol at which an SI engine operates most efficiently in urban centres located at 2600 masl. The tests were carried out on a four-stroke, single-cylinder, naturally aspirated SI engine with a carburettor for the fuel supply, using blends of gasoline and anhydrous ethanol in different ratios: E10, E15, E20, E40, E60, E85 and E100. These tests were conducted in the city of Bogota, which is located at 2600 masl, with the engine operating at 3600 rpm and at 25, 50, 75 and 100% of load. The results show that performance variables such as engine brake torque, brake power and brake thermal efficiency decrease, while brake specific fuel consumption increases, with the rise in the percentage of ethanol in the mixture. On the other hand, the specific emissions of CO2 and NOx increase, while the specific emissions of CO and HC decrease, compared to those produced by gasoline. From the tests, it is concluded that the SI-ICE worked most efficiently with the E40 mixture, for which an increase in brake power of 8.81% and a reduction in brake specific fuel consumption of 2.5% were obtained, coupled with reductions in the specific emissions of CO2, HC and CO of 9.72, 52.88 and 76.66%, respectively, compared to the results obtained with the E10 blend. This behaviour occurs because the E40 mixture provides the appropriate amount of oxygen for the combustion process, which leads to better utilization of the available energy in that process, thus generating a power output comparable to the E10 mixture and producing lower CO and HC emissions than the other test blends. Nevertheless, the emission of NOx increased by 106.25%.
Keywords: emissions, ethanol, gasoline, engine, performance
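For readers unfamiliar with the performance metrics compared above, the standard definitions of brake specific fuel consumption and brake thermal efficiency are recalled below (textbook relations, not equations taken from the paper), where ṁ_f is the fuel mass flow rate, P_b the brake power, and LHV the lower heating value of the blend:

```latex
\mathrm{BSFC} = \frac{\dot{m}_f}{P_b},
\qquad
\eta_{\mathrm{th,b}} = \frac{P_b}{\dot{m}_f\,\mathrm{LHV}} = \frac{1}{\mathrm{BSFC}\cdot\mathrm{LHV}}
```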
554 International Classification of Primary Care as a Reference for Coding the Demand for Care in Primary Health Care
Authors: Souhir Chelly, Chahida Harizi, Aicha Hechaichi, Sihem Aissaoui, Leila Ben Ayed, Maha Bergaoui, Mohamed Kouni Chahed
Abstract:
Introduction: The International Classification of Primary Care (ICPC) is part of the morbidity classification system. It has 17 chapters, and each entry is coded by an alphanumeric code: the letter corresponds to the chapter, the number to a paragraph in the chapter. The objective of this study is to show the utility of this classification in the coding of the reasons for demand for care in primary health care (PHC), as well as its advantages and limits. Methods: This is a cross-sectional descriptive study conducted in 4 PHC centres in the Ariana district. Data on the demand for care during 2 days in the same week were collected. The coding of the information was done according to the ICPC. The data were entered and analyzed with the EPI Info 7 software. Results: A total of 523 demands for care were investigated. The patients who came for consultation were predominantly female (62.72%). Most of the consultants were young, with an average age of 35 ± 26 years. Among the ICPC rubrics, 'infections' was the most common reason, with 49.9%, followed by 'other diagnoses' with 40.2%, 'symptoms and complaints' with 5.5%, 'trauma' with 2.1%, 'procedures' with 2.1% and 'neoplasm' with 0.3%. The main advantage of the ICPC is that it is a standardized tool, very suitable for the classification of the reasons for demand for care in PHC according to their specificity and capable of being used in a computerized PHC medical file. Its current limitations are related to the difficulty of classifying some reasons for demand for care. Conclusion: The ICPC has been developed to provide healthcare with a coding reference that takes its specificity into account. The CIM is in its 10th revision; the classification would gain, from revision to revision, in efficiency so that it can be generalized and used by PHC teams.
Keywords: international classification of primary care, medical file, primary health care, Tunisia
553 Numerical Investigation of Pressure Drop and Erosion Wear by Computational Fluid Dynamics Simulation
Authors: Praveen Kumar, Nitin Kumar, Hemant Kumar
Abstract:
The modernization of computer technology and commercial computational fluid dynamics (CFD) simulation has given more detailed results compared to experimental investigation techniques. CFD techniques are widely used in different fields due to their flexibility and performance. The evaluation of pipeline erosion is a complex phenomenon to solve by numerical arithmetic techniques, whereas CFD simulation is an easy tool for resolving that type of problem. The erosion wear behaviour due to a solid–liquid mixture in a slurry pipeline has been investigated using the commercial CFD code FLUENT. A multi-phase Euler-Lagrange model was adopted to predict solid-particle erosion wear in a 22.5° pipe bend for the flow of a bottom ash-water suspension. The present study addresses erosion prediction in a three-dimensional 22.5° pipe bend for two-phase (solid and liquid) flow using the finite volume method with the standard k-ε turbulence model and the discrete phase model, and evaluates the erosion wear rate with the velocity varying from 2 to 4 m/s. The results show that the velocity of the solid-liquid mixture was found to be the dominant parameter compared to solid concentration, density, and particle size. At low velocity, settling takes place in the pipe bend due to the low inertia and the gravitational effect on the solid particulates, which leads to high erosion at the bottom side of the pipeline.
Keywords: computational fluid dynamics (CFD), erosion, slurry transportation, k-ε model
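As background to the erosion-rate evaluation described above, discrete-phase CFD erosion predictions are often written in a generic form of the following type, where the contribution of each tracked particle stream p striking a wall face of area A_face depends on its mass flow rate ṁ_p, a particle-diameter function C(d_p), an impact-angle function f(α) and an impact-velocity exponent b(v). This is a commonly used generic formulation, stated here as an assumption rather than the exact model settings of the study:

```latex
R_{\mathrm{erosion}} \;=\; \sum_{p=1}^{N_{\mathrm{particles}}}
\frac{\dot{m}_p \, C(d_p)\, f(\alpha)\, v^{\,b(v)}}{A_{\mathrm{face}}}
```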
552 Perforation Analysis of the Aluminum Alloy Sheets Subjected to High Rate of Loading and Heated Using Thermal Chamber: Experimental and Numerical Approach
Authors: A. Bendarma, T. Jankowiak, A. Rusinek, T. Lodygowski, M. Klósak, S. Bouslikhane
Abstract:
An analysis of the mechanical characteristics and dynamic behavior of aluminum alloy sheets in perforation tests, based on experimental tests coupled with numerical simulation, is presented. The impact problems (penetration and perforation) of metallic plates have been of interest for a long time. Experimental, analytical as well as numerical studies have been carried out to analyze the perforation process in detail. Based on these approaches, the ballistic properties of the material have been studied. A laser sensor for initial and residual velocities is used during the experiments to obtain the ballistic curve and the ballistic limit. The energy balance is also reported, together with the energy absorbed by the aluminum, including the ballistic curve and ballistic limit. A high-speed camera helps to estimate the failure time and to calculate the impact force. A wide range of initial impact velocities, from 40 up to 180 m/s, has been covered during the tests. The mass of the conical-nose-shaped projectile is 28 g, its diameter is 12 mm, and the thickness of the aluminum sheet is 1.0 mm. The ABAQUS/Explicit finite element code has been used to simulate the perforation process. The ballistic curve obtained numerically was compared with and verified against the experimental one, and the failure patterns are presented using optimal mesh densities, which provide stability of the results. A good agreement between the numerical and experimental results is observed.
Keywords: aluminum alloy, ballistic behavior, failure criterion, numerical simulation
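Ballistic curves of the kind discussed above (residual velocity versus initial impact velocity) are frequently fitted with the Recht-Ipson / Lambert-Jonas relation shown below, where V_bl is the ballistic limit and a, p are fitting parameters. This is offered as a standard reference form, an assumption on our part rather than the specific fit used by the authors:

```latex
V_r =
\begin{cases}
0, & V_i \le V_{bl} \\[4pt]
a\left(V_i^{\,p} - V_{bl}^{\,p}\right)^{1/p}, & V_i > V_{bl}
\end{cases}
```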
551 Utilizing Minecraft Java Edition for the Application of Fire Disaster Procedures to Establish Fire Disaster Readiness for Grade 12 STEM students of DLSU-IS
Authors: Aravella Flores, Jose Rafael E. Sotelo, Luis Romulus Phillippe R. Javier, Josh Christian V. Nunez
Abstract:
This study analyzes the performance of Grade 12 STEM students of De La Salle University - Integrated School who have completed the Disaster Readiness and Risk Reduction (DRRR) course in handling fire hazards through Minecraft Java Edition. This platform is suitable because fire DRRR is challenging to learn in a practical setting, and it is questionable whether textbook knowledge alone translates successfully into actual practice. The purpose of this study is to determine whether Minecraft can be a suitable environment in which to familiarize oneself with fire DRRR. The objectives are achieved by utilizing Minecraft to simulate fire scenarios in which the participants can freely act upon and practice fire DRRR. The experiment was divided into a grounding phase and a validation phase, during which the researchers observed the performance of the participants in the simulation. A pre-simulation and a post-simulation survey were given to assess the change in the participants' perception of their ability to apply fire DRRR procedures and of their vulnerabilities. A paired t-test was used, showing significant differences between the pre-simulation and post-simulation survey scores, thus indicating improved judgment of DRRR and lessened vulnerability in the event of encountering a fire hazard. This research offers a model for future studies that can gather more participants and dwell on more complex code, beyond command blocks and into the code of Minecraft itself.
Keywords: minecraft, DRRR, fire, disaster, simulation
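The paired t-test used above compares each participant's pre- and post-simulation scores. The snippet below is a minimal SciPy illustration with made-up scores (not the study's data), assuming one score per participant in each survey.

```python
from scipy import stats

# Hypothetical pre/post survey scores for ten participants (illustrative only).
pre  = [12, 14, 11, 15, 13, 10, 16, 12, 14, 13]
post = [16, 17, 15, 18, 16, 14, 19, 15, 17, 16]

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests a significant pre/post difference
```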
550 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may be irregular or have digit or character shapes. The identification of objects and internal objects is quite difficult when the structure of the image contains a large number of clusters. With the SASK algorithm, estimation results are easily obtained while identifying the sub-regional objects. The main focus is on recognizing the number of internal objects that exist in a given image, in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction and, finally, hull detection. Detecting the sub-regional hulls can increase machine learning capability in the detection of characters, and the approach can also be extended to obtain hull recognition even for irregularly shaped objects, such as black holes in space exploration together with their intensities. Layered hulls are those having structured layers inside, which is useful in military services and traffic applications to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the decision process (to clear the traffic, or to identify the number of persons on the opponent's side in a war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
549 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations
Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar
Abstract:
Fast simulations are critical in reducing the time to market of CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip. Researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce the cache misses, we use software optimization techniques such as the removal of unused functions, loop interchange and replacing the post-increment operator with the pre-increment operator for non-primitive data types. The cache misses were reduced by 18.52%, 5.34% and 3.91% by employing these techniques, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0. The OpenMP programming model and SIMD are used for parallelizing and vectorizing the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
Keywords: cache behaviour, network-on-chip, performance profiling, vectorization
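Loop interchange pays off because it changes the memory-access order. The short timing sketch below illustrates the same locality effect in Python/NumPy (traversing a row-major array row-by-row versus column-by-column); it is a generic illustration of the idea, not code from Booksim2.0, which is written in C++.

```python
import time
import numpy as np

a = np.random.rand(4000, 4000)   # row-major (C-order) array

def traverse_rows(m):
    s = 0.0
    for i in range(m.shape[0]):
        s += m[i, :].sum()       # contiguous, cache-friendly access
    return s

def traverse_cols(m):
    s = 0.0
    for j in range(m.shape[1]):
        s += m[:, j].sum()       # strided, cache-unfriendly access
    return s

for fn in (traverse_rows, traverse_cols):
    t0 = time.perf_counter()
    fn(a)
    print(f"{fn.__name__}: {time.perf_counter() - t0:.3f} s")
```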
548 Using MALDI-TOF MS to Detect Environmental Microplastics (Polyethylene, Polyethylene Terephthalate, and Polystyrene) within a Simulated Tissue Sample
Authors: Kara J. Coffman-Rea, Karen E. Samonds
Abstract:
Microplastic pollution is an urgent global threat to our planet and to human health. Microplastic particles have been detected within our food, water, and atmosphere, and found within human stool, placenta, and lung tissue. However, most spectrometric microplastic detection methods require chemical digestion, which can alter or destroy microplastic particles and makes it impossible to acquire information about their in-situ distribution. MALDI-TOF MS (matrix-assisted laser desorption/ionization time-of-flight mass spectrometry) is an analytical method using a soft ionization technique that can be used for polymer analysis. This method provides a valuable opportunity to acquire information regarding the in-situ distribution of microplastics while also minimizing the destructive element of chemical digestion. In addition, MALDI-TOF MS allows for expanded analysis of the microplastics, including the detection of specific additives that may be present within them. MALDI-TOF MS is particularly sensitive to sample preparation and has not yet been used to analyze environmental microplastics within their specific location (e.g., biological tissues, sediment, water). In this study, microplastics were created using polyethylene gloves, polystyrene micro-foam, and polyethylene terephthalate cable sleeving. The plastics were frozen using liquid nitrogen and ground to obtain small fragments. An artificial tissue was created using a cellulose sponge as scaffolding coated with a MaxGel Extracellular Matrix to simulate human lung tissue. Optimal preparation techniques (e.g., matrix, cationization reagent, solvent, mixing ratio, laser intensity) were first established for each specific polymer type. The artificial tissue sample was subsequently spiked with microplastics, and the specific polymers were detected using MALDI-TOF MS. This study presents a novel method for the detection of environmental polyethylene, polyethylene terephthalate, and polystyrene microplastics within a complex sample. The results of this study provide an effective method that can be used in future microplastics research and can aid in determining the potential threats they pose to environmental and human health.
Keywords: environmental plastic pollution, MALDI-TOF MS, microplastics, polymer identification
547 The Influence of Immunity on the Behavior and Dignity of Judges
Authors: D. Avnieli
Abstract:
Immunity of judges from liability represents a departure from the principle that all are equal under the law and that victims may be granted compensation from their offenders. The purpose of the study is to determine whether judicial immunity coincides with the need to ensure the existence of a highly independent and incorruptible judiciary. Judges are immune from civil and criminal liability for their judicial acts. Judicial immunity is justified by the need to maintain the complete independence and discretion of the judiciary. Scholars and judges believe that absolute immunity is needed to shield judges from pressures, threats, or outside interference. It is commonly accepted that judges should be free to perform their judicial role in accordance with their assessment of the facts and their understanding of the law, without any restrictions, influences, inducements or interferences. In most countries, immunity applies when judges act in excess of jurisdiction. In some countries, it applies even when they act maliciously or corruptly. The only exception to absolute immunity applicable in all judicial systems is when judges act without jurisdiction over the subject matter. The Israeli Supreme Court recently decided to embrace absolute immunity and strike out a lawsuit by a refugee who was unlawfully incarcerated. The Court ruled that the plaintiff cannot sue the State or the judge for damages. The questions of malice, dignity, and public scrutiny were not discussed. This paper, based on a comparative analysis of many cases, aims to determine whether immunity affects the dignity and behavior of judges. It demonstrates that most judges maintain their dignity and ethical code of behavior, but some do not hesitate to act consciously in excess of jurisdiction, and in rare cases even corruptly. Therefore, in order to maintain an independent and incorruptible judiciary, immunity should not apply where judges act consciously in excess of jurisdiction or with malicious incentives.
Keywords: incorruptible judiciary, immunity, independent, judicial, judges, jurisdiction