Search results for: technological evolution in defect analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30688

25168 Capability Prediction of Machining Processes Based on Uncertainty Analysis

Authors: Hamed Afrasiab, Saeed Khodaygan

Abstract:

Prediction of machining process capability at the design stage plays a key role in achieving precision design and manufacturing of mechanical products. Inaccuracies in the machining process lead to errors in the position and orientation of machined features on the part and strongly affect the final quality of the product. In this paper, an efficient systematic approach is presented to investigate machining errors, predict the resulting manufacturing errors of parts, and predict the capability of the corresponding machining processes. A mathematical formulation of fixture locator modeling is presented to establish the relationship between part errors and their sources. Based on this method, the final machining errors of the part can be accurately estimated by relating them to the combined dimensional and geometric tolerances of the workpiece-fixture system. The method supports uncertainty analysis based on both the worst-case and statistical approaches. Its application is illustrated through an example, and the computational results are compared with Monte Carlo simulation results.
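The contrast between the worst-case and statistical (Monte Carlo) approaches to a tolerance stack-up can be sketched with a toy one-dimensional example; the tolerance values and the assumption of uniformly distributed, independent errors below are illustrative, not taken from the paper:

```python
import random

# Hypothetical 1-D tolerance stack: three locator errors combine linearly
# into a feature-position error. Tolerances are symmetric (+/- t_i), in mm.
tolerances = [0.05, 0.03, 0.02]

# Worst case: all contributors at their extremes simultaneously.
worst_case = sum(tolerances)

# Statistical (Monte Carlo): each contributor uniform in [-t, t].
random.seed(0)
N = 100_000
samples = [sum(random.uniform(-t, t) for t in tolerances) for _ in range(N)]
mc_extreme = max(abs(s) for s in samples)

print(f"worst case : {worst_case:.3f} mm")
print(f"MC extreme : {mc_extreme:.3f} mm")  # always <= worst case
```

The Monte Carlo extreme is always bounded by the worst-case value, which is why the statistical approach yields less conservative capability estimates.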

Keywords: process capability, machining error, dimensional and geometrical tolerances, uncertainty analysis

Procedia PDF Downloads 311
25167 Experimental Analysis of Advanced Multi-Axial Preforms Conformability to Complex Contours

Authors: Andrew Hardman, Alistair T. McIlhagger, Edward Archer

Abstract:

Some research has been undertaken to determine the behaviour of 3D textile preforms under compression, with direct comparison to 2D counterparts. Multiscale simulations have been developed to try to accurately analyse the post-consolidation behaviour of varying architectures. However, further understanding is required to experimentally identify the mechanisms and deformations that arise upon conforming to a complex contour. Due to the complexity of 3D textile preforms, yarn behaviour over a complex contour is assessed after consolidation by means of vacuum assisted resin transfer moulding (VARTM), and the resulting mechanisms are investigated by micrograph analysis. Varying architectures with known areal densities, pick densities and thicknesses are assessed for a cohesive study. The resulting performance of each is assessed qualitatively as well as quantitatively in terms of the change in the representative volume element (RVE) across the curved beam contour: in crimp percentage, tow angle, resin-rich areas and binder distortion. A novel textile is developed from the resulting analysis to overcome the observed deformations.

Keywords: conformability, compression, binder architecture, 3D weaving, textile preform

Procedia PDF Downloads 170
25166 NMR-Based Metabolomics Reveals Dietary Effects in Liver Extracts of Arctic Charr (Salvelinus alpinus) and Tilapia (Oreochromis mossambicus) Fed Different Levels of Starch

Authors: Rani Abro, Ali Ata Moazzami, Jan Erik Lindberg, Torbjörn Lundh

Abstract:

The effect of dietary starch level on liver metabolism in Arctic charr (Salvelinus alpinus) and tilapia (Oreochromis mossambicus) was studied using ¹H-NMR-based metabolomics. Fingerlings were fed iso-nitrogenous diets containing 0, 10 and 20% starch for two months before liver samples were collected for metabolite analysis. Metabolite profiling was performed on a 600 MHz NMR spectrometer using Chenomx software. In total, 48 metabolites were profiled in liver extracts from both fish species. Following the profiling, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were performed. These revealed that differences in the concentrations of significant metabolites were correlated with the dietary starch level in both species. The most prominent difference in metabolic response to starch feeding between the omnivorous tilapia and the carnivorous Arctic charr was an indication of higher anaerobic metabolism in Arctic charr. The data also indicated that amino acid and pyrimidine metabolism was higher in Arctic charr than in tilapia.
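As a minimal sketch of the PCA step, mean-centring a samples-by-metabolites matrix and taking its singular value decomposition yields the principal-component scores and explained variance; the matrix below is invented for illustration, not the study's data:

```python
import numpy as np

# Toy metabolite-concentration matrix: rows = liver samples, columns =
# metabolites (illustrative numbers only; two "diet groups" of two samples).
X = np.array([
    [1.0, 2.1, 0.5],
    [1.2, 2.0, 0.6],
    [3.1, 0.9, 1.5],
    [2.9, 1.1, 1.4],
])

# PCA by SVD of the mean-centred matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                      # sample coordinates on the PCs
explained = S**2 / np.sum(S**2)     # fraction of variance per PC

print(scores[:, 0])   # PC1 separates the two groups in this toy data
print(explained)
```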

Keywords: arctic charr, metabolomics, starch, tilapia

Procedia PDF Downloads 460
25165 The Effect of Relocating a Red Deer Stag on the Size of Its Home Range and Activity

Authors: Erika Csanyi, Gyula Sandor

Abstract:

In this study, we sought to answer the question of how, and to what extent, the home range and daily activity of a red deer stag change when it is relocated from its habitual surroundings. The examination was conducted in two hunting areas in Hungary, about 50 km apart. The control area was in the north of Somogy County, while the sample area, similar in terms of forest cover, tree stock, agricultural structure, altitude above sea level, climate, etc., was in the south of Somogy County. Three middle-aged red deer stags were captured with rocket nets, immobilized, and marked with GPS-Plus collars manufactured by Vectronic Aerospace GmbH. One of the captured stags was relocated. Deer movements were monitored over 24-hour periods for three months. We analysed the behaviour of the relocated stag and of those that remained in their original habitat, as well as the temporal evolution of their behaviour, examining the characteristics of the marked stags' daily activity and the hourly distances they covered. The aim was to determine the difference in behaviour between the stags remaining in their original habitat and one relocated to a distant but similar habitat. In summary, the findings establish that enforced relocation to a different habitat (e.g., game relocation) significantly increases the animal's home range in the months following relocation. Home ranges were calculated using the full data set and the minimum convex polygon (MCP) method. Relocation did not increase the nocturnal or diurnal movement activity of the animal in question. The home range of the relocated stag proved to be significantly larger than that of the stags that were not relocated. The results are presented in tabular form and displayed on a map.
Based on the results, relocation inherently carries the risk of the animal falling victim to poaching or vehicle collision. Only in the third month following relocation did the home range of the relocated stag subside to the level of the stags that were not relocated. These observations should be taken into consideration when relocating red deer for nature conservation or game management purposes.
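The minimum convex polygon (MCP) home-range estimate mentioned above reduces to taking the convex hull of the GPS fixes and measuring its area. A minimal sketch with hypothetical coordinates (km on a projected grid):

```python
# MCP home-range area from GPS fixes: build the convex hull of the
# relocation points and take its area via the shoelace formula.

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(hull):
    """Shoelace formula for a simple polygon."""
    n = len(hull)
    s = sum(hull[i][0]*hull[(i+1) % n][1] - hull[(i+1) % n][0]*hull[i][1]
            for i in range(n))
    return abs(s) / 2.0

fixes = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]  # hypothetical fixes
hull = convex_hull(fixes)
print(polygon_area(hull))  # 12.0 km^2; interior fixes do not affect the hull
```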

Keywords: Cervus elaphus, home range, relocation, red deer stag

Procedia PDF Downloads 141
25164 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

Human-Computer Interaction (HCI) is a research line of computer science that seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them to control electronic devices. Affective computing research, in turn, applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain-Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware comprises the sensing stage and analogue-to-digital conversion. The interface software implements algorithms for signal pre-processing, time- and frequency-domain analysis, and the classification of patterns associated with electrical brain activity. The methods used for the analysis and classification of the signal were tested separately using a publicly accessible database, and classifiers were compared in order to determine the best-performing one.
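A typical pre-processing step in such a pipeline is extracting band power from the frequency analysis; the sketch below uses a synthetic 10 Hz signal, and the sampling rate and band limits are common EEG conventions, not values from the paper:

```python
import numpy as np

# Band-power feature extraction for one EEG channel via the FFT.
fs = 256                      # sampling rate, Hz (assumed)
t = np.arange(0, 2, 1 / fs)   # 2 s analysis window
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
psd = np.abs(np.fft.rfft(eeg)) ** 2 / eeg.size

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
powers = {name: band_power(lo, hi) for name, (lo, hi) in bands.items()}
print(max(powers, key=powers.get))  # alpha dominates for this 10 Hz signal
```

Feature vectors built from such band powers are what the compared classifiers would consume.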

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 394
25163 Meta-analysis of Technology Acceptance for Mobile and Digital Libraries in Academic Settings

Authors: Nosheen Fatima Warraich

Abstract:

One of the most often used models in information systems (IS) research is the technology acceptance model (TAM). This meta-analysis aims to measure the relationships of the TAM variables Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) with users' attitudes and behavioral intention (BI) in the context of mobile and digital libraries. It also examines the relationships of external variables, information quality (InfoQ) and system quality (SysQ), with PEOU and PU in digital library settings. The meta-analysis was performed following the PRISMA-P guidelines. Four databases (Google Scholar, Web of Science, Scopus, and LISTA) were searched according to defined criteria. The findings revealed a large effect size for PU and PEOU with BI, and likewise a large effect size for PU and PEOU with attitude. A medium effect size was found for SysQ -> PU, InfoQ -> PU, and SysQ -> PEOU, and a small effect size for InfoQ -> PEOU. The study fills a literature gap and confirms that TAM is a valid model for the acceptance and use of technology in the mobile and digital library context. Its findings should help developers and designers in designing and developing mobile library apps, and benefit library authorities and systems librarians in designing and developing digital libraries in academic settings.
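The pooling behind such effect sizes is commonly done with Fisher's z transform under a fixed-effect model; the correlations and sample sizes below are invented for illustration, not taken from the reviewed studies:

```python
import math

# Fixed-effect pooling of correlations via Fisher's z transform -- the usual
# first step when meta-analysing a TAM relationship such as PEOU -> BI.
studies = [(0.55, 120), (0.61, 200), (0.48, 80)]  # (r, n) pairs, hypothetical

def fisher_z(r):
    return 0.5 * math.log((1 + r) / (1 - r))

def inv_fisher_z(z):
    return math.tanh(z)

# Weight each study by n - 3, the inverse variance of z.
num = sum((n - 3) * fisher_z(r) for r, n in studies)
den = sum(n - 3 for _, n in studies)
pooled_r = inv_fisher_z(num / den)
print(round(pooled_r, 3))  # a "large" effect by Cohen's benchmarks (r >= 0.5)
```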

Keywords: technology acceptance model (TAM), perceived ease of use, perceived usefulness, information quality, system quality, meta-analysis, systematic review, digital libraries, mobile library apps

Procedia PDF Downloads 80
25162 Emerging Trends of Geographic Information Systems in Built Environment Education: A Bibliometric Review Analysis

Authors: Kiara Lawrence, Robynne Hansmann, Clive Greentsone

Abstract:

Geographic Information Systems (GIS) are used to store, analyze, visualize, capture and monitor geographic data. Built environment professionals, and urban planners in particular, need GIS skills to plan spaces effectively and efficiently. GIS application extends beyond the production of map artifacts: it can relate spatially referenced, real-time data to support spatial visualization, analysis, community engagement, scenario building, and so forth. Though GIS has been used in the built environment for a few decades, its use in education has not been researched enough to draw conclusions about the trends of the last 20 years. This study looks to discover current and emerging trends of GIS in built environment education. A bibliometric review analysis was carried out by exporting documents from Scopus and Web of Science using keywords around "Geographic information systems" OR "GIS" AND "built environment" OR "geography" OR "architecture" OR "quantity surveying" OR "construction" OR "urban planning" OR "town planning" AND "education" for the years 1994 to 2024. A total of 564 documents were identified and exported. The data were then analyzed using the VOSviewer software to generate network analyses and visualization maps of keyword co-occurrence, document and country co-citation, and co-author networks. By analyzing each aspect of the data, deeper insight into GIS within education can be gained. Preliminary results from Scopus indicate that GIS research focusing on built environment education peaked prior to 2014, with much focus on remote sensing, demography, land use, engineering education and so forth. These data can help in understanding and implementing GIS in built environment education in ways that are both foundational and innovative, ensuring that students are equipped with sufficient knowledge and skills to carry out tasks in their respective fields.
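The keyword co-occurrence networks that VOSviewer visualises start from a simple pairwise count over the exported records; a minimal sketch with invented records:

```python
from collections import Counter
from itertools import combinations

# Each record contributes one co-occurrence per unordered keyword pair.
records = [  # hypothetical keyword sets from three exported documents
    {"GIS", "education", "urban planning"},
    {"GIS", "remote sensing", "education"},
    {"GIS", "urban planning"},
]

cooc = Counter()
for kws in records:
    for a, b in combinations(sorted(kws), 2):
        cooc[(a, b)] += 1

for pair, count in cooc.most_common(3):
    print(pair, count)
```

The resulting counts become edge weights in the co-occurrence network.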

Keywords: architecture, built environment, construction, education, geography, geographic information systems, quantity surveying, town planning, urban planning

Procedia PDF Downloads 22
25161 Transmission Performance Analysis for Live Broadcasting over IPTV Service in Telemedicine Applications

Authors: Jenny K. Ubaque, Edward P. Guillen, Juan S. Solórzano, Leonardo J. Ramírez

Abstract:

Health care must be a right for people around the world, but in order to guarantee access for all, it is necessary to overcome geographical barriers. Telemedicine takes advantage of information and communication technologies to deploy health care services around the world. To achieve those goals, it is necessary to use existing last-mile solutions to create access for home users, which is why the channel characteristics for those kinds of services must be established. This paper presents an analysis of the network performance of last-mile solutions for IPTV broadcasting with streaming applications in telemedicine.
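One of the channel characteristics typically measured in such an analysis is interarrival jitter; the running estimator from RFC 3550 (used for RTP media streams) can be sketched as follows, with hypothetical timestamps:

```python
# RFC 3550 interarrival jitter: D is the difference in transit time between
# consecutive packets; the estimate is smoothed with gain 1/16.
send = [0.00, 0.02, 0.04, 0.06, 0.08]    # send times, s (hypothetical)
recv = [0.10, 0.125, 0.141, 0.165, 0.18]  # receive times, s (hypothetical)

jitter = 0.0
for i in range(1, len(send)):
    d = (recv[i] - recv[i - 1]) - (send[i] - send[i - 1])
    jitter += (abs(d) - jitter) / 16.0

print(f"{jitter * 1000:.3f} ms")
```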

Keywords: telemedicine, IPTV, GPON, ADSL2+, coaxial, jumbogram

Procedia PDF Downloads 374
25160 Analysis of a Generalized Sharma-Tasso-Olver Equation with Variable Coefficients

Authors: Fadi Awawdeh, O. Alsayyed, S. Al-Shará

Abstract:

Considering the inhomogeneities of media, the variable-coefficient Sharma-Tasso-Olver (STO) equation is investigated with the aid of symbolic computation. A newly developed simplified bilinear method is described for the solution of the considered equation. Without any constraints on the coefficient functions, multiple kink solutions are obtained. Parametric analysis is carried out to analyze the effects of the coefficient functions on the stability and propagation characteristics of the solitonic waves.
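For reference, the constant-coefficient STO equation is commonly written in the literature as below; the variable-coefficient version studied in the paper replaces the constant α with coefficient functions (this form is quoted from the general literature, not reproduced from the paper):

```latex
u_t + 3\alpha u_x^2 + 3\alpha u^2 u_x + 3\alpha u\,u_{xx} + \alpha u_{xxx} = 0
```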

Keywords: Hirota bilinear method, multiple kink solution, Sharma-Tasso-Olver equation, inhomogeneity of media

Procedia PDF Downloads 521
25159 Error Analysis of Wavelet-Based Image Steganography Scheme

Authors: Geeta Kasana, Kulbir Singh, Satvinder Singh

Abstract:

In this paper, a steganographic scheme for digital images using the Integer Wavelet Transform (IWT) is proposed. The cover image is decomposed into wavelet subbands using IWT. Each subband is divided into blocks of equal size, and secret data is embedded into the largest and smallest pixel values of each block. The visual quality of the stego images is acceptable: the PSNR between the cover and stego images is above 40 dB, so imperceptibility is maintained. Experimental results show a better trade-off between capacity and visual perceptibility compared to existing algorithms. The maximum possible error is evaluated for each of the wavelet subbands of an image.
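The 40 dB imperceptibility threshold quoted above refers to PSNR computed from the mean squared error between the cover and stego images; a minimal sketch on a synthetic 8-bit image:

```python
import numpy as np

# PSNR between a cover image and its stego version. Values above ~40 dB are
# usually considered imperceptible. The "images" are tiny synthetic arrays.
def psnr(cover, stego, peak=255.0):
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
stego = cover.copy()
stego ^= rng.integers(0, 2, size=cover.shape, dtype=np.uint8)  # flip some LSBs

print(round(psnr(cover, stego), 1))  # well above 40 dB for LSB-only changes
```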

Keywords: DWT, IWT, MSE, PSNR

Procedia PDF Downloads 510
25158 Impact of Civil Engineering and Economic Growth in the Sustainability of the Environment: Case of Albania

Authors: Rigers Dodaj

Abstract:

Nowadays, the environment is a critical concern for civil engineers, human activity, construction projects, economic growth, and national development as a whole. As Albania's economy develops, people's living standards are rising, and the requirements for the living environment are rising with them. Under these circumstances, environmental protection and sustainability are the critical issues. Rising industrialization, urbanization, and energy demand affect the environment through emissions of carbon dioxide (CO2), a significant parameter known to impact air pollution directly. Consequently, many governments and international organizations have adopted policies and regulations to address environmental degradation in the pursuit of economic development; in Albania, for instance, CO2 emissions measured in metric tons per capita have increased by 23% in the last 20 years. This paper analyzes the importance of civil engineering and economic growth for the sustainability of the environment, focusing on CO2 emissions. The analyzed data are annual time series for 2001-2020, based on official publications of the World Bank. A statistical approach with a vector error correction model and a time-series forecasting model is used to estimate the parameters and the long-run equilibrium. The research adds a new perspective to the evaluation of a sustainable environment in the context of carbon emission reduction, and provides reference and technical support for the government's move toward green and sustainable environmental policies. In the context of low-carbon development, effectively improving carbon emission efficiency is an inevitable requirement for achieving sustainable economic growth and environmental protection.
The study also reveals that civil engineering development projects greatly impact the environment in the long run, especially in the areas of flooding, noise pollution, water pollution, erosion, ecological disorder, and natural hazards. The potential for reducing industrial carbon emissions in recent years indicates that further reduction is becoming more difficult; it requires a different economic growth policy and more civil engineering development, improving the level of industrialization and promoting technological innovation in industrial low-carbonization.
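The 23% rise over 20 years quoted above corresponds to a modest compound annual growth rate:

```python
# Implied compound annual growth rate of per-capita CO2 emissions.
total_growth = 1.23   # 20-year multiplier from the abstract
years = 20
cagr = total_growth ** (1 / years) - 1
print(f"{cagr:.2%} per year")  # roughly 1% per year
```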

Keywords: CO₂ emission, civil engineering, economic growth, environmental sustainability

Procedia PDF Downloads 91
25157 Virtual Approach to Simulating Geotechnical Problems under Both Static and Dynamic Conditions

Authors: Varvara Roubtsova, Mohamed Chekired

Abstract:

Recent studies on the numerical simulation of geotechnical problems show the importance of considering the soil microstructure. At this scale, soil is a discrete particle medium in which the particles can interact with each other and with water flow under external forces, structural loads or natural events. This paper presents research conducted in a virtual laboratory named SiGran, developed at IREQ (Institut de recherche d'Hydro-Québec) for the purpose of investigating a broad range of problems encountered in geotechnics. Using the Discrete Element Method (DEM), SiGran simulates granular materials directly by applying Newton's laws to each particle. Water flow is simulated with the Marker and Cell (MAC) method, solving the full Navier-Stokes equations for an incompressible viscous liquid. Examples of numerical simulations and their comparisons with real experiments have been selected to show the complexity of geotechnical research at the micro level. These examples describe transient flow into a porous medium, the interaction of particles in a viscous flow, the compaction of saturated and unsaturated soils, and the phenomenon of liquefaction under seismic load. They also provide an opportunity to present SiGran's capacity to compute the distribution and evolution of energy by type (particle kinetic energy, particle internal elastic energy, energy dissipated by friction or through viscous interaction with the flow, and so on). This work also includes first attempts to apply micro-scale discrete results at a macro continuum level, where the Smoothed Particle Hydrodynamics (SPH) method is used to solve the system of governing equations and the material behaviour equation is based on the results of simulations carried out at the micro level. The possibility of combining the three methods (DEM, MAC and SPH) is discussed.
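The DEM core that such a laboratory builds on can be sketched in one dimension: integrate Newton's second law per particle, with a linear spring contact force when two particles overlap. All parameter values below are invented for illustration, not SiGran's:

```python
# Minimal 1-D DEM step: two overlapping particles pushed apart by a linear
# contact spring, integrated with semi-implicit Euler.
k = 1000.0   # contact stiffness, N/m (assumed)
m = 0.01     # particle mass, kg (assumed)
r = 0.005    # particle radius, m (assumed)
dt = 1e-5    # time step, s

x = [0.0, 0.009]      # centre positions: overlapping by 1 mm
v = [0.0, 0.0]

for _ in range(1000):
    gap = (x[1] - x[0]) - 2 * r
    f = -k * gap if gap < 0 else 0.0   # repulsive contact force magnitude
    a = [-f / m, f / m]                # equal and opposite (Newton's 3rd law)
    for i in range(2):
        v[i] += a[i] * dt              # update velocity, then position
        x[i] += v[i] * dt

print(x[1] - x[0] >= 2 * r)  # particles have pushed apart
```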

Keywords: discrete element method, marker and cell method, numerical simulation, multi-scale simulations, smoothed particle hydrodynamics

Procedia PDF Downloads 306
25156 True and False Cognates of Japanese, Chinese and Philippine Languages: A Contrastive Analysis

Authors: Jose Marie E. Ocdenaria, Riceli C. Mendoza

Abstract:

Culturally, languages meet, merge, share, exchange, appropriate, donate, and divide in, to, and from each other. This type of recurrence manifests in East Asian cultures, where language influence diffuses across geographical proximities. Historically, China has had a notable impact on Japan's culture; for instance, Japanese borrowed words from Chinese, along with its way of reading and writing. This qualitative, descriptive study employing contrastive analysis addressed the true and false cognates of the Japanese-Philippine and Chinese-Philippine language pairs. It involved a rich collection of data from various sources, such as textual evidence and corpora, to gain a deeper understanding of true and false cognates between L1 and L2. Cognates of the Japanese-Philippine and Chinese-Philippine language pairs were analyzed contrastively according to orthography, phonology, and semantics. The words presented were roots; however, derivatives, reduplications, and stress variants were included when they shed light on the comparison. The basis for grouping the cognates was their phonetic-semantic resemblance. The analysis revealed that some words may exhibit several types of lexical relationship. Further, the study revealed that the Japanese language has more false cognates in the Philippine languages, particularly in Tagalog and Cebuano, while Chinese has more true cognates in Tagalog. It is the hope of this study to contribute to a diverse audience, including teachers and learners of foreign languages such as Japanese and Chinese, future researchers and investigators, applied linguists, curriculum theorists, the community, and publishers.
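Orthographic resemblance of the kind used to screen cognate candidates is often quantified with a normalised edit (Levenshtein) distance; a minimal sketch, with word forms that are illustrative rather than taken from the study's corpus:

```python
# Normalised edit-distance screen for cognate candidates between
# romanised word forms.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def similarity(a, b):
    return 1 - levenshtein(a, b) / max(len(a), len(b))

print(similarity("kaban", "kaban"))               # identical forms -> 1.0
print(round(similarity("kaban", "kabayan"), 2))   # near but not identical
```

A high orthographic similarity score flags a pair for semantic comparison, which then decides whether the cognate is true or false.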

Keywords: contrastive analysis, Japanese, Chinese and Philippine languages, qualitative and descriptive study, true and false cognates

Procedia PDF Downloads 141
25155 Failure Analysis of Recoiler Mandrel Shaft Used for Coiling of Rolled Steel Sheet

Authors: Sachin Pawar, Suman Patra, Goutam Mukhopadhyay

Abstract:

The primary function of a shaft is to transfer power. A shaft can be cast or forged and then machined to its final shape. Manufacturing a shaft ~5 m in length and 0.6 m in diameter is very critical, and it is more difficult still to maintain its straightness during heat treatment and machining operations, which involve thermal and mechanical loads, respectively. During the machining of such a forged mandrel shaft, a deflection of 3-4 mm was observed. To remove this deflection, the shaft was pressed at both ends, which led to the development of cracks in it. To investigate the root cause of the deflection and cracking, a sample was cut from the failed shaft. Possible causes were identified with the help of a cause-and-effect diagram. Chemical composition analysis, microstructural analysis, and hardness measurements were performed to confirm whether the shaft met the required specifications. Chemical composition analysis confirmed that the material grade was 42CrMo4. Microstructural analysis revealed the presence of untempered martensite, indicating improper heat treatment. Because of this, the ductility and impact toughness values were considerably lower than the specification for the grade. Residual stress measurement of another bent shaft, manufactured by a similar route, was done by the portable X-ray diffraction (XRD) technique. For better understanding, measurements were taken at twelve locations along the length of the shaft. A high level of undesirable tensile residual stress, close to the ultimate tensile strength (UTS) of the material, was observed. The untempered martensitic structure, low ductility, low impact strength, and high residual stresses all confirmed improper tempering of the shaft; tempering relieves residual stresses. Based on the findings of this study, a stress-relieving heat treatment was performed, successfully removing the residual stresses and the deflection in the shaft.

Keywords: residual stress, mandrel shaft, untempered martensite, portable XRD

Procedia PDF Downloads 117
25154 A Method To Assess Collaboration Using Perception of Risk from the Architectural Engineering Construction Industry

Authors: Sujesh F. Sujan, Steve W. Jones, Arto Kiviniemi

Abstract:

The use of Building Information Modelling (BIM) in the Architectural-Engineering-Construction (AEC) industry is a form of systemic innovation. Unlike incremental innovation (such as the technological development of CAD from hand-drawn to 2D electronically printed drawings), any form of systemic innovation in project-based inter-organisational networks requires complete collaboration, and it yields numerous benefits if adopted and utilised properly. Proper use of BIM involves people collaborating with interoperable, BIM-compliant tools. The AEC industry globally has been known for its adversarial and fragmented nature, where firms take advantage of one another to increase their own profitability. Given the industry's nature, getting people to collaborate by unifying their goals is critical to successful BIM adoption. However, this form of innovation is often forced artificially onto old ways of working that do not suit collaboration, which may be one of the reasons for its low global uptake even though the technology was developed more than 20 years ago. There is therefore a need to develop a metric or method that supports industry players and gives them confidence in their investment in BIM software and workflows. This paper starts by defining systemic risk as a risk that affects all project participants at a given stage of a project, and defines categories of systemic risk. Generalising in this way allows the method to be applied in any industry: the categories remain the same, but the example of the risk depends on the industry under study. The proposed method uses individual perception of an example of systemic risk as its key parameter. The significance of this study lies in relating the variance of individual perceptions of systemic risk to how well the team is collaborating.
The method rests on the claim that a more unified range of individual perceptions implies a higher probability that the team is collaborating well. Since contracts and procurement determine how a project team operates, the method could also break the methodological barrier of the highly subjective findings that case studies produce, which has limited the possibility of generalising between global industries. Since human nature applies in all industries, the authors' intuition is that perception can be a valuable parameter for studying collaboration, which is essential especially in projects that utilise systemic innovation such as BIM.
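The core of the proposed metric, reading a narrower spread of risk-perception scores as a sign of better collaboration, can be sketched with hypothetical ratings:

```python
import statistics

# Hypothetical ratings (1-10) of the same systemic risk by team members.
team_a = [7, 7, 8, 7, 6]   # tightly clustered perceptions
team_b = [2, 9, 5, 10, 3]  # widely scattered perceptions

spread_a = statistics.stdev(team_a)
spread_b = statistics.stdev(team_b)

print(spread_a < spread_b)  # True: team A's perceptions are more unified
```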

Keywords: building information modelling, perception of risk, systemic innovation, team collaboration

Procedia PDF Downloads 189
25153 Responsibility of States in Air Traffic Management: Need for International Unification

Authors: Nandini Paliwal

Abstract:

Since the aviation industry is one of the fastest growing sectors of the world economy, states depend on air transport to maintain or stimulate economic growth; it significantly promotes and contributes to the economic well-being of every nation and of the world in general. The continuous, rapid growth in civil aviation is inevitably leading to congested skies, flight delays and, most alarmingly, a decrease in the safety of air navigation facilities. Safety is one of the most important concerns of the aviation industry and has been unanimously recognised as such across the world. The available capacity of the air navigation system is not sufficient for the demand being generated: forecasts indicate that current growth in air traffic has the potential to delay 20% of flights by 2020 unless the current system is changed. Therefore, a safe, orderly and expeditious air navigation system is needed at the national and global levels, which requires the implementation of an air traffic management (hereinafter referred to as 'ATM') system to ensure an optimum flow of air traffic by utilising and enhancing the capabilities provided by technical advances. The objective of this paper is to analyse the applicability of national regulations to liability arising out of air traffic management services, and whether the current legal regime is sufficient to cover multilateral agreements, including the Single European Sky regulations. In doing so, the paper examines the international framework, mainly Article 28 of the Chicago Convention and its relevant annexes, to determine the responsibility of states for providing air navigation services. The paper then discusses the difference between the concepts of responsibility and liability under the air law regime, and how states might claim sovereign immunity for air traffic management functions.
Thereafter, the paper focuses on cross-border agreements, both bilateral and multilateral. Finally, it addresses the Single European Sky scheme and the need for an international convention dealing with the liability of air navigation service providers. The paper concludes with suggestions for unifying, at the international level, the laws dealing with the liability of air navigation service providers, and for enhanced co-operation among states in order to keep pace with technological advances.

Keywords: air traffic management, safety, single European sky, co-operation

Procedia PDF Downloads 173
25152 Overview About Sludge Produced From Treatment Plant of Bahr El-Baqar Drain and Reusing It With Cement in Outdoor Paving

Authors: Khaled M. Naguib, Ahmed M. Noureldin

Abstract:

This paper aims to achieve several goals. The first is to determine the quantities produced, main properties, and characteristics of the sludge produced by the Bahr El-Baqar drain treatment plant; this projection was made through laboratory analysis and modelling of sludge samples, drawing on many previous studies. The second is to check feasibility and carry out a risk analysis to identify the best alternatives for reuse in secondary products that add value to the sludge, as well as the alternatives that add no value. All recovery methods are relatively expensive and challenging to implement at this mega-plant, so the recommendation of this study is to use the sludge as a coagulant to reduce some compounds, or in secondary products. The study utilized sludge-cement replacement percentages of 10%, 20%, 30%, 40% and 50%. The produced tiles were tested for water absorption and breaking (bending) strength. All produced tiles exhibited a water absorption ratio of around 10%. The study concluded that the produced tiles, except for the 50% sludge-cement replacement, comply with the breaking strength requirement of 2.8 MPa for tiles for external use.
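The compliance logic reported above can be sketched directly; the strength values below are invented placeholders chosen only to match the reported pass/fail pattern, with the 2.8 MPa requirement being the one figure taken from the abstract:

```python
# Breaking-strength compliance check for sludge-cement tiles.
REQUIRED_MPA = 2.8  # requirement for tiles for external use (from abstract)

# replacement % -> breaking strength in MPa (placeholder values)
breaking_strength = {10: 4.1, 20: 3.8, 30: 3.3, 40: 2.9, 50: 2.4}

passing = [pct for pct, mpa in breaking_strength.items() if mpa >= REQUIRED_MPA]
print(passing)  # [10, 20, 30, 40]: only the 50% replacement fails
```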

Keywords: cement, tiles, water treatment sludge, breaking strength, absorption, heavy metals, risk analysis

Procedia PDF Downloads 116
25151 Study of Water Cluster-Amorphous Silica Collisions in the Extreme Space Environment Using the ReaxFF Reactive Force Field Molecular Dynamics Simulation Method

Authors: Ali Rahnamoun, Adri van Duin

Abstract:

High-velocity particle impact on spacecraft surface materials has been one of the important issues in the design of such materials. Among these particles, water clusters may be the most abundant and the most important to study. Their importance lies in the fact that, upon impact, they can damage the material, and if they are supercooled they can attach to the surface and cause ice accumulation, which is very problematic in spacecraft and also in aircraft operations. The dynamics of collisions between amorphous silica structures and water clusters at impact velocities of 1 km/s to 10 km/s are studied using the ReaxFF reactive molecular dynamics simulation method. The initial water clusters contain 150 water molecules, and they are collided with the surfaces of amorphous fully oxidized and suboxide silica structures. These simulations show that the most abundant molecules observed on the silica surfaces, other than reflected water molecules, are H3O+ on the suboxide structures and OH- on the fully oxidized structures. The effect of impact velocity on the change in silica mass is studied. At high impact velocities, water molecules attach to the silica surface through chemisorption, meaning that the water molecule dissociates on interacting with the silica surface. At low impact velocities, however, physisorbed water molecules are also observed, meaning that water molecules attach to and accumulate on the silica surface. The number of physisorbed water molecules at low velocities is higher on the suboxide silica surfaces.
The evolution of the temperature of the water clusters during the collisions indicates that the possibility of electron excitation at impact velocities below 10 km/s is minimal, so ReaxFF reactive molecular dynamics simulation can predict the chemistry of these hypervelocity impacts. However, at impact velocities close to 10 km/s, the average temperature of the impacting water clusters rises to about 2000 K, with individual molecules occasionally reaching temperatures of over 8000 K; it would thus be prudent to consider electron excitation at these higher impact velocities, which goes beyond the current capability of ReaxFF.

Keywords: spacecraft materials, hypervelocity impact, reactive molecular dynamics simulation, amorphous silica

Procedia PDF Downloads 421
25150 Multiscale Entropy Analysis of Electroencephalogram (EEG) of Alcoholic and Control Subjects

Authors: Lal Hussain, Wajid Aziz, Imtiaz Ahmed Awan, Sharjeel Saeed

Abstract:

Multiscale entropy analysis (MSE) is a useful technique recently developed to quantify the dynamics of physiological signals at different time scales. This study is aimed at investigating electroencephalogram (EEG) signals to analyze the background activity of alcoholic and control subjects by inspecting various coarse-grained sequences formed at different time scales. EEG recordings of alcoholic and control subjects, acquired using 64 electrodes, were taken from the publicly available machine learning repository of the University of California, Irvine (UCI). The MSE analysis was performed on the EEG data acquired from all the electrodes of alcoholic and control subjects. The Mann-Whitney rank test was used to find significant differences between the groups, and results were considered statistically significant for p-values < 0.05. The area under the receiver operating characteristic (ROC) curve was computed to find the degree of separation between the groups. The mean ranks of MSE values at all the time scales for all electrodes were higher in control subjects than in alcoholic subjects. Higher mean ranks represent higher complexity and vice versa. The findings indicated that EEG signals acquired through electrodes C3, C4, F3, F7, F8, O1, O2, P3, and T7 showed significant differences between alcoholic and control subjects at time scales 1 to 5. Moreover, all electrodes exhibited significant differences at various time scales. Likewise, the highest accuracy and separation were obtained at the central region (C3 and C4) and at electrodes P3, O1, F3, F7, F8, and T8, while other electrodes such as Fp1, Fp2, P4, and F4 showed no significant results.
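The coarse-graining step at the heart of MSE, and the sample entropy computed on each coarse-grained sequence, can be sketched in a few lines. The sketch below is an illustration only, not the authors' code; the template length m = 2 and tolerance r are typical defaults, not values reported in the study, and the match counting is a simplified form of SampEn.

```python
import math

def coarse_grain(signal, scale):
    """MSE coarse-graining: average non-overlapping windows of length `scale`."""
    n = len(signal) // scale
    return [sum(signal[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """Simplified SampEn: -ln(A/B), where B counts pairs of m-length templates
    within Chebyshev tolerance r, and A counts (m+1)-length matches."""
    def count_matches(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')
```

A strictly periodic sequence yields a low sample entropy (high regularity), which is the sense in which lower MSE values indicate lower complexity.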

Keywords: electroencephalogram (EEG), multiscale sample entropy (MSE), Mann-Whitney test (MMT), Receiver Operator Curve (ROC), complexity analysis

Procedia PDF Downloads 378
25149 Intertextuality as a Dialogue Between Postmodern Writer J. Fowles and Mid-English Writer J. Donne

Authors: Isahakyan Heghine

Abstract:

Intertextuality, being at the centre of attention of both linguists and literary critics, is vividly expressed in the works of the outstanding British novelist and philosopher J. Fowles. 'The Magus' is a deep psychological and philosophical novel with vivid intertextual links to Greek mythology and to authors from different epochs. The aim of the paper is to show how intertextuality might serve as a dialogue between two authors (J. Fowles and J. Donne) disguised in the dialogue of two protagonists of the novel: Conchis and Nicholas. Contrastive viewpoints concerning man's isolation and loneliness are stated in the dialogue. Through conceptual analysis of the text, it becomes possible both to decode the conceptual information of the text and to uncover its intertextual links.

Keywords: dialogue, conceptual analysis, isolation, intertextuality

Procedia PDF Downloads 332
25148 Network Analysis of Genes Involved in the Biosynthesis of Medicinally Important Naphthodianthrone Derivatives of Hypericum perforatum

Authors: Nafiseh Noormohammadi, Ahmad Sobhani Najafabadi

Abstract:

Hypericins (hypericin and pseudohypericin) are natural naphthodianthrone derivatives produced by Hypericum perforatum (St. John’s Wort), which have many medicinal properties such as antitumor, antineoplastic, antiviral, and antidepressant activities. Production and accumulation of hypericin in the plant are influenced by both genetic and environmental conditions. Despite the existence of different high-throughput data on the plant, the genetic dimensions of hypericin biosynthesis have not yet been completely understood. In this research, 21 high-quality RNA-seq datasets on different parts of the plant were integrated with metabolic data to reconstruct a coexpression network. Results showed that a cluster of 30 transcripts was correlated with total hypericin. The identified transcripts were divided into three main groups based on their functions: hypericin biosynthesis genes, transporters and detoxification genes, and transcription factors (TFs). In the biosynthetic group, different isoforms of polyketide synthases (PKSs) and phenolic oxidative coupling proteins (POCPs) were identified. Phylogenetic analysis of protein sequences integrated with gene expression analysis showed that some of the POCPs seem to be very important in the biosynthetic pathway of hypericin. In the TFs group, six TFs were correlated with total hypericin. qPCR analysis of these six TFs confirmed that three of them were highly correlated. The identified genes in this research are a rich resource for further studies on the molecular breeding of H. perforatum in order to obtain varieties with high hypericin production.
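Coexpression clustering of this kind rests on a correlation measure between each transcript's expression profile and the metabolite level across samples. A minimal sketch follows; it is illustrative only, uses Pearson correlation as a stand-in for whichever measure the authors actually applied, and the expression vectors are hypothetical.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: transcript expression vs. total hypericin across samples
hypericin = [0.2, 0.5, 1.1, 1.4]
transcript = [10.0, 21.0, 44.0, 58.0]
r = pearson(transcript, hypericin)   # close to +1 for a coexpressed transcript
```

In practice, a cutoff on |r| across all transcripts would define the correlated cluster.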

Keywords: hypericin, St. John’s Wort, data mining, transcription factors, secondary metabolites

Procedia PDF Downloads 97
25147 Value Chain with the Participation of Urban Agriculture Development by Social Enterprises

Authors: Kuo-Wei Hsu, Wei-Chin Lo

Abstract:

In recent years, urban agriculture has been spreading widely all over the world. The development of urban agriculture is an evolutionary process of intense urbanization, as well as an agricultural phenomenon closely related to the development of the economy, society, and culture in urban areas. It provides densely populated areas with multi-functional uses of land, impacting the strategic development of both large and small towns in the area. In addition, the participation of social enterprises maintains industrial competitiveness and generates gains in the face of rapid transformation of industrial structures and new patterns of lifestyle in urban areas. They create better living conditions and protect the environment with innovative business beliefs, which open new ways for the development of urban agriculture. Also, by building up the value chain, these social enterprises are capable of creating value for urban agriculture. Most current research on social enterprises explores the relationship between corporate responsibility and role play, operational modes and performance, and organizational patterns. Only some of it discusses the function of social entrepreneurship in the development of urban agriculture. Moreover, none of it has explored the value creation for the development of urban agriculture achieved by social enterprises, or how social enterprises operate to increase competitive advantages, making it possible to achieve industrial innovation, increase corporate value, and even provide services with value creation. Therefore, this research reviews the current business patterns and operational conditions of social enterprises endowed with social responsibilities, and discusses the current development process of urban agriculture. This research adopts a value chain perspective to discuss the key factors for value creation in the development of urban agriculture by social enterprises.
After organizing and integrating these findings, this research develops the prospect of value creation for urban agriculture by social enterprises and builds the value chain for urban agriculture. In conclusion, this research explored the relationship between the value chain and value creation, which relates to the value of customers, enterprises, society, and the economy in the development of urban agriculture, in consideration of the participation of social enterprises, and hence built the connection between the value chain and value creation in the development of urban agriculture by social enterprises. The research found that social enterprises help to enhance the connection between enterprise value and society value, mold the corporate image with social responsibility and create brand value, and thereby drive the increase of economic value.

Keywords: urban agriculture development, value chain, social enterprise, urban systems

Procedia PDF Downloads 484
25146 Discrimination in Insurance Pricing: A Textual-Analysis Perspective

Authors: Ruijuan Bi

Abstract:

Discrimination in insurance pricing is a topic of increasing concern, particularly in the context of the rapid development of big data and artificial intelligence. There is a need to explore the various forms of discrimination, such as direct and indirect discrimination, proxy discrimination, algorithmic discrimination, and unfair discrimination, and to understand their implications in insurance pricing models. This paper aims to analyze and interpret the definitions of discrimination in insurance pricing and to explore measures to reduce discrimination. It utilizes a textual analysis methodology, which involves gathering qualitative data from relevant literature on definitions of discrimination. The research methodology focuses on exploring the various forms of discrimination and their implications in insurance pricing models. Through textual analysis, this paper identifies the specific characteristics and implications of each form of discrimination in the general insurance industry. This research contributes to the theoretical understanding of discrimination in insurance pricing. By analyzing and interpreting relevant literature, this paper provides insights into the definitions of discrimination and the laws and regulations surrounding it. This theoretical foundation can inform future empirical research on discrimination in insurance pricing drawing on relevant theories from probability theory.

Keywords: algorithmic discrimination, direct and indirect discrimination, proxy discrimination, unfair discrimination, insurance pricing

Procedia PDF Downloads 77
25145 Getting to Know the Enemy: Utilization of Phone Record Analysis Simulations to Uncover a Target’s Personal Life Attributes

Authors: David S. Byrne

Abstract:

The purpose of this paper is to understand how phone record analysis can enable the identification of subjects in communication with the target of a terrorist plot. This study also sought to understand the advantages of implementing simulations to develop the skills of future intelligence analysts and enhance national security. Through the examination of phone reports, which in essence consist of the call traffic of incoming and outgoing numbers (and not listening to calls or reading the content of text messages), patterns can be uncovered that point toward members of a criminal group and the activities planned. Through temporal and frequency analysis, conclusions were drawn to offer insights into the identity of participants and the potential scheme being undertaken. The challenge lies in the accurate identification of the users of the phones in contact with the target. Often investigators rely on proprietary databases and open sources to accomplish this task; however, it is difficult to ascertain the accuracy of the information found. Thus, this paper poses two research questions: how effective are freely available web sources of information at determining the actual identification of callers? Secondly, does the identity of the callers enable an understanding of the lifestyle and habits of the target? The methodology for this research consisted of the analysis of the call detail records of the author’s personal phone activity spanning the period of a year, combined with the hypothetical premise that the owner of said phone was the leader of a terrorist cell. The goal was to reveal the identity of his accomplices and understand how his personal attributes can further paint a picture of the target’s intentions. The results of the study were interesting: nearly 80% of the calls were identified with over a 75% accuracy rating via data mining of open sources.
The suspected terrorist’s inner circle was recognized, including relatives and potential collaborators as well as financial institutions [money laundering], restaurants [meetings], a sporting goods store [purchase of supplies], and airlines and hotels [travel itinerary]. The outcome of this research showed the benefits of cellphone analysis without more intrusive and time-consuming methodologies, though it may be instrumental for potential surveillance, interviews, and developing probable cause for wiretaps. Furthermore, this research highlights the importance of building up the skills of future intelligence analysts through phone record analysis via simulations; hands-on learning in this case study emphasizes the development of the competencies necessary to improve investigations overall.
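The frequency and temporal analyses described here amount to counting contacts per number and bucketing calls by time of day. A minimal sketch follows; the record layout, phone numbers, and timestamps are hypothetical illustrations, not data from the study.

```python
from collections import Counter
from datetime import datetime

# Hypothetical call-detail records: (other_number, timestamp, direction)
cdr = [
    ("555-0101", "2023-03-01 08:15", "out"),
    ("555-0101", "2023-03-01 21:40", "in"),
    ("555-0199", "2023-03-02 02:05", "out"),
    ("555-0101", "2023-03-03 08:20", "out"),
]

# Frequency analysis: which numbers does the target contact most often?
freq = Counter(number for number, _, _ in cdr)

# Temporal analysis: bucket contacts by hour of day to surface routines
by_hour = Counter(
    datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for _, ts, _ in cdr
)
```

Frequent contacts become candidates for the inner circle, while unusual hours (e.g. 2 a.m. calls) flag contacts worth closer scrutiny.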

Keywords: hands-on learning, intelligence analysis, intelligence education, phone record analysis, simulations

Procedia PDF Downloads 20
25144 Legal Analysis of the Meaning of the Rule In dubio pro libertate for the Interpretation of Criminal Law Norms

Authors: Pavel Kotlán

Abstract:

The paper defines the role of the rule in dubio pro libertate in the interpretation of criminal law norms, which is one of the controversial and debated problems of the application of law. On the basis of an analysis of the law, including comparison with the legal systems of various European countries, and of the accepted principles of the interpretation of law, it can be concluded that the rule in dubio pro libertate can be used in cases where the linguistic, teleological, and systematic methods fail, and, at the same time, that interpretation based on this rule should be preferred to subjective historical interpretation. It can be considered that correctly including the in dubio pro libertate rule in the choice of interpretative variant can serve the judiciary in the application of criminal law.

Keywords: application of law, criminal law norms, in dubio pro libertate, interpretation

Procedia PDF Downloads 17
25143 Fluctuations in Motivational Strategies EFL Teachers Use in Virtual and In-Person Classes across Context

Authors: Sima Modirkhamene, Arezoo Khezri

Abstract:

The purpose of the present investigation was to probe the main motivational strategies Iranian school vs. institute teachers use in virtual and in-person classes to motivate students in learning the English language. A further purpose was to understand teachers’ perceptions about any modifications in their use of motivational strategies before and during/after the pandemic. For the purpose of this investigation, a total of 63 EFL teachers (35 female, 28 male) were conveniently sampled from schools and institutes in the cities of Mahabad and Sardasht. Moreover, for the interview phase of the study, 20 percent (n=16) of the sample was selected conveniently. The required data was gathered through a modified questionnaire (Cheng & Dornyei, 2007) consisting of 42 items and a set of semi-structured interviews. The outcomes of a set of non-parametric Mann-Whitney U tests demonstrated that presenting tasks properly in online classes and familiarizing learners with L2-related values in in-person classes came out as the most influential motivational strategies practiced by EFL school teachers. Additionally, it was found that proper teacher behavior (showing enthusiasm) in both in-person and virtual classes and presenting tasks properly in in-person classes were overwhelmingly endorsed by EFL institute teachers. The study also portrayed no statistically significant mean difference between school and institute EFL teachers’ overall use of motivational strategies in virtual and in-person classes. The interview results indicated that the strategies of designing tasks through technological aids, provision of videos, gamification techniques, assigning projects, and delivering formative online feedback were held in high regard during/after the pandemic due to the high reliance of teaching on the Internet connection.
Meanwhile, the research indicated that the spread of COVID-19 was the main reason for teachers’ modifications of motivational strategies: in response to the crisis of the pandemic, all educational contexts at all levels resorted to online education, and as a result their strategies were adapted to the new situation. The findings brought to light through this investigation provide initial evidence of the unintended consequences of the pandemic on teachers’ strategic choices. Therefore, to deliver a better education in the future, the study suggests more concentration on the quality of teaching as well as reframing the status quo of teaching.
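The Mann-Whitney U statistic used in this comparison has a direct pairwise definition that is easy to sketch. The sketch below is illustrative and stdlib-only; a real analysis would use a statistics package that also supplies the p-value.

```python
def mann_whitney_u(x, y):
    """U1 = #{(xi, yj) : xi > yj} + 0.5 * #{ties}; note U1 + U2 = len(x)*len(y).
    Larger U1 means group x tends to rank above group y."""
    u1 = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u1 += 1.0
            elif xi == yj:
                u1 += 0.5
    return u1
```

Applied to two teachers' strategy scores, a U near 0 or near len(x)*len(y) signals a strong rank difference between the groups, while a U near the middle signals overlap.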

Keywords: virtual teaching, motivational teaching strategies, teaching context, online education

Procedia PDF Downloads 62
25142 An Analysis of Non-Elliptic Curve Based Primality Tests

Authors: William Wong, Zakaria Alomari, Hon Ching Lai, Zhida Li

Abstract:

Modern-day information security depends on implementing Diffie-Hellman, which requires the generation of prime numbers. Because the number of primes is infinite, it is impractical to store prime numbers for use, and therefore primality tests are indispensable in modern-day information security. A primality test is a test to determine whether a number is prime or composite. There are two types of primality tests: deterministic tests and probabilistic tests. Deterministic tests adopt algorithms that provide a definite answer as to whether a given number is prime or composite. Probabilistic tests, in contrast, provide a probabilistic result, with a degree of uncertainty. In this paper, we review three probabilistic tests, the Fermat Primality Test, the Miller-Rabin Test, and the Baillie-PSW Test, as well as one deterministic test, the Agrawal-Kayal-Saxena (AKS) Test, and we analyze these tests. None of the tests reviewed is based on elliptic curves. The analysis demonstrates that, in the majority of real-world scenarios, the Baillie-PSW test’s favorability stems from its typical operational complexity of O(log³ n) and its capacity to deliver accurate results for numbers below 2^64.
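As an illustration of the probabilistic category, here is a compact Miller-Rabin sketch. This is a standard textbook formulation, not code from the paper; `rounds` controls the error bound (a composite is reported prime with probability at most 4^-rounds).

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin test: False is certain (composite); True means probably prime."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # write n - 1 = d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)               # modular exponentiation a**d mod n
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False               # a witnesses that n is composite
    return True
```

Unlike the Fermat test, this squaring chain catches Carmichael numbers such as 561, which is why Miller-Rabin (and the Baillie-PSW combination built on it) is preferred in practice.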

Keywords: primality tests, Fermat’s primality test, Miller-Rabin primality test, Baillie-PSW primality test, AKS primality test

Procedia PDF Downloads 95
25141 Optimized Parameters for Simultaneous Detection of Cd²⁺, Pb²⁺ and Co²⁺ Ions in Water Using Square Wave Voltammetry on the Unmodified Glassy Carbon Electrode

Authors: K. Sruthi, Sai Snehitha Yadavalli, Swathi Gosh Acharyya

Abstract:

Water is the most crucial element for sustaining life on earth. Increasing water pollution directly or indirectly leads to harmful effects on human life. Most heavy metal ions are harmful in their cationic form. These heavy metal ions are released by various activities such as the disposal of batteries, industrial wastes, automobile emissions, and soil contamination. Ions like Pb²⁺, Co²⁺, and Cd²⁺ are carcinogenic and show many harmful effects when consumed beyond the limits proposed by the WHO. The simultaneous detection of these highly toxic heavy metal ions is reported in this study. There are many analytical methods for their quantification, but electrochemical techniques are given high priority because of their sensitivity and their ability to detect lower concentrations. Among electrochemical methods, square wave voltammetry was preferred due to its suppression of background currents, which act as interference. Square wave voltammetry was performed on a glassy carbon electrode (GCE) for the quantitative detection of the ions. A three-electrode system was chosen for experimentation, consisting of a glassy carbon electrode as the working electrode (3 mm diameter), an Ag/AgCl electrode as the reference electrode, and a platinum wire as the counter electrode. Detection was carried out by optimizing the experimental parameters, namely pH, scan rate, and temperature. Under the optimized conditions, square wave voltammetry was performed for simultaneous detection. Scan rates were varied from 5 mV/s to 100 mV/s, and it was found that at 25 mV/s all three ions were detected simultaneously, with well-defined peaks at their particular stripping potentials. The pH was varied from 3 to 8, and pH 5 was taken as the optimum, which holds good for all three ions. There was a decreasing trend at the start because of hydrogen gas evolution, and beyond pH 5 there was again a decreasing trend because of hydroxide formation on the surface of the working electrode (GCE).
The temperature was varied from 25˚C to 45˚C, and the optimum temperature for all three ions was taken as 35˚C. Deposition and stripping potentials were set at +1.5 V and -1.5 V, and a resting time of 150 seconds was given. The three ions were detected at stripping potentials of -0.84 V for Cd²⁺, -0.54 V for Pb²⁺, and -0.44 V for Co²⁺. The detection parameters were thus optimized on a glassy carbon electrode for the simultaneous detection of the ions at lower concentrations by square wave voltammetry.

Keywords: cadmium, cobalt, lead, glassy carbon electrode, square wave anodic stripping voltammetry

Procedia PDF Downloads 122
25140 Open Forging of Cylindrical Blanks Subjected to Lateral Instability

Authors: A. H. Elkholy, D. M. Almutairi

Abstract:

The successful and efficient execution of a forging process depends upon the correct analysis of the loading and metal flow of blanks. This paper investigates the Upper Bound Technique (UBT) and its application in the analysis of the open forging process when a possibility of blank bulging exists. The UBT is one of the energy rate minimization methods for the solution of metal forming processes, based on the upper bound theorem. In this regard, the kinematically admissible velocity field is obtained by minimizing the total forging energy rate. A computer program was developed in this research to implement the UBT. The significant advantages of this method are its speed of execution, while maintaining a fairly high degree of accuracy, and its wide prediction capability. The information from this analysis is useful for the design of forging processes and dies. Results for the prediction of forging loads and stresses, metal flow, and surface profiles, with the assured benefits in terms of press selection and blank preform design, are outlined in some detail. The obtained predictions are ready for comparison with both laboratory and industrial results.
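At its core, the UBT reduces to minimizing the total energy rate over the free parameters of a kinematically admissible velocity field. As a hedged illustration only (the one-parameter energy-rate function below is a made-up stand-in, not the paper's formulation), a derivative-free golden-section search suffices for a single bulge parameter:

```python
import math

def minimize_scalar(f, lo, hi, tol=1e-8):
    """Golden-section search for the minimizer of a unimodal f on [lo, hi]."""
    gr = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - gr * (b - a), a + gr * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c            # minimum lies in [a, d]; reuse c as new d
            c = b - gr * (b - a)
        else:
            a, c = c, d            # minimum lies in [c, b]; reuse d as new c
            d = a + gr * (b - a)
    return (a + b) / 2

# Hypothetical energy-rate functional of a single bulge parameter `beta`:
# internal deformation power rises with beta while friction power falls,
# so a minimum exists in between (coefficients purely illustrative).
def energy_rate(beta):
    return 3.0 * beta ** 2 - 2.0 * beta + 5.0

beta_opt = minimize_scalar(energy_rate, 0.0, 2.0)
```

With more field parameters, the same idea generalizes to a multivariate minimization of the total forging energy rate, whose minimum gives the upper bound on the forging load.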

Keywords: forging, upper bound technique, metal forming, forging energy, forging die/platen

Procedia PDF Downloads 295
25139 A Transient Coupled Numerical Analysis of the Flow of Magnetorheological Fluids in Closed Domains

Authors: Wael Elsaady, S. Olutunde Oyadiji, Adel Nasser

Abstract:

The non-linear flow characteristics of magnetorheological (MR) fluids in MR dampers are studied via a coupled numerical approach that incorporates a two-phase flow model. The approach couples the Finite Element (FE) modelling of the damper magnetic circuit with the Computational Fluid Dynamics (CFD) analysis of the flow field in the damper. The two-phase flow CFD model accounts for the effect of fluid compressibility due to the presence of liquid and gas in the closed domain of the damper. The dynamic mesh model included in the ANSYS/Fluent CFD solver is used to simulate the movement of the MR damper piston in order to perform the fluid excitation. The two-phase flow analysis is carried out with both the Volume-Of-Fluid (VOF) model and the mixture model included in ANSYS/Fluent. The CFD models show that the hysteretic behaviour of MR dampers is due to the effect of fluid compressibility. The flow field shows the distributions of pressure, velocity, and viscosity contours. In particular, it shows the high non-Newtonian viscosity in the fluid regions affected by the magnetic field and the low Newtonian viscosity elsewhere. Moreover, the dependence of the gas volume fraction on the liquid pressure inside the damper is predicted by the mixture model. The presented approach targets a better understanding of the complicated flow characteristics of viscoplastic fluids in a variety of applications.
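The field-dependent, non-Newtonian viscosity that such CFD models assign to the active fluid regions is commonly represented by a regularized viscoplastic law. The sketch below uses the Papanastasiou-regularized Bingham model as an illustration; the yield-stress values and their dependence on field strength are hypothetical, not taken from the paper.

```python
import math

def apparent_viscosity(shear_rate, tau_y, mu_p, m=1000.0):
    """Papanastasiou-regularized Bingham plastic:
    mu_app = mu_p + tau_y * (1 - exp(-m * |gamma_dot|)) / |gamma_dot|,
    which tends to mu_p + tau_y * m as shear rate -> 0 (no singularity)."""
    g = abs(shear_rate)
    if g == 0.0:
        return mu_p + tau_y * m
    return mu_p + tau_y * (1.0 - math.exp(-m * g)) / g

def yield_stress(B, tau0=50.0, k=2.0e4):
    """Hypothetical field dependence: yield stress grows with field B [T].
    Coefficients are illustrative only."""
    return tau0 + k * B ** 2   # Pa
```

In a user-defined function of this kind, regions inside the magnetic field receive a large yield stress (high apparent viscosity), while regions outside keep the plastic viscosity mu_p, reproducing the high/low viscosity contrast the contours show.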

Keywords: viscoplastic fluid, magnetic FE analysis, computational fluid dynamics, two-phase flow, dynamic mesh, user-defined functions

Procedia PDF Downloads 177