Search results for: fixed smeared crack model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17621

8081 Characterization of an Extrapolation Chamber for Dosimetry of Low Energy X-Ray Beams

Authors: Fernanda M. Bastos, Teógenes A. da Silva

Abstract:

Extrapolation chambers were designed to be used as primary standard dosimeters for measuring absorbed dose in a medium for beta radiation and low-energy x-rays. The International Organization for Standardization established series of reference x-radiations, to be reproduced in metrology laboratories, for calibrating dosimeters and determining their energy dependence. Standardization of low-energy x-ray beams with tube potentials lower than 30 kV may be affected by the instrument used for dosimetry. In this work, parameters of a PTW model 23392 extrapolation chamber were determined with the aim of using it as a reference instrument in low-energy x-ray beams.

Keywords: extrapolation chamber, low energy x-rays, x-ray dosimetry, X-ray metrology

Procedia PDF Downloads 371
8080 Heritage, Cultural Events and Promises for Better Future: Media Strategies for Attracting Tourism during the Arab Spring Uprisings

Authors: Eli Avraham

Abstract:

The Arab Spring was widely covered in the global media, and the number of Western tourists traveling to the area began to fall. The goal of this study was to analyze which media strategies marketers in Middle Eastern countries chose to employ in their attempts to repair the negative image of the area in the wake of the Arab Spring. Several studies were published concerning image-restoration strategies of destinations during crises around the globe; however, these strategies were not part of an overarching theory, conceptual framework or model from the fields of crisis communication and image repair. The conceptual framework used in the current study was the ‘multi-step model for altering place image’, which offers three types of strategies: source, message and audience. Three research questions were used: 1. What public relations crisis techniques and advertising campaign components were used? 2. What media policies and relationships with the international media were adopted by Arab officials? 3. Which marketing initiatives (such as cultural and sports events) were promoted? This study is based on qualitative content analysis of four types of data: (1) advertising components (slogans, visuals and text); (2) press interviews with Middle Eastern officials and marketers; (3) official media policy adopted by government decision-makers (e.g. boycotting or arresting newspeople); and (4) marketing initiatives (e.g. organizing heritage festivals and cultural events). The data was located in three channels from December 2010, when the events started, to September 30, 2013: (1) Internet and video-sharing websites: YouTube and Middle Eastern countries' national tourism board websites; (2) News reports from two international media outlets, The New York Times and Ha’aretz; these are considered quality newspapers that focus on foreign news and tend to criticize institutions; (3) Global tourism news websites: eTurbo news and ‘Cities and countries branding’. Using the ‘multi-step model for altering place image,’ the analysis reveals that Middle Eastern marketers and officials used three kinds of strategies to repair their countries' negative image: 1. Source (cooperation and media relations; complying with, threatening and blocking the media; and finding alternatives to the traditional media); 2. Message (ignoring, limiting, narrowing or reducing the scale of the crisis; acknowledging the negative effect of an event’s coverage and assuring a better future; promotion of multiple facets, exhibitions and softening of the ‘hard’ image; hosting spotlight sporting and cultural events; spinning liabilities into assets; geographic dissociation from the Middle East region; ridiculing the existing stereotype); and 3. Audience (changing the target audience by addressing others; emphasizing similarities and relevance to a specific target audience). It appears that dealing with their image problems will continue to be a challenge for officials and marketers of Middle Eastern countries until the region stabilizes and its regional conflicts are resolved.

Keywords: Arab spring, cultural events, image repair, Middle East, tourism marketing

Procedia PDF Downloads 259
8079 The Generalized Pareto Distribution as a Model for Sequential Order Statistics

Authors: Mahdy Esmailian, Mahdi Doostparast, Ahmad Parsian

Abstract:

In this article, sequential order statistics (SOS) samples under type II censoring, coming from the generalized Pareto distribution, are considered. Maximum likelihood (ML) estimators of the unknown parameters are derived on the basis of the available multiple SOS data. Necessary conditions for the existence and uniqueness of the derived ML estimates are given. Due to the complexity of the proposed likelihood function, a useful re-parametrization is suggested. For illustrative purposes, a Monte Carlo simulation study is conducted and an illustrative example is analysed.
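
As a point of reference for the estimation problem described above, the short sketch below fits a generalized Pareto distribution by maximum likelihood to an ordinary i.i.d. sample with SciPy; it does not reproduce the multiple-SOS likelihood or the re-parametrization derived in the paper, and the simulated shape and scale values are illustrative only.

```python
# Minimal sketch: ML fit of a generalized Pareto distribution to an i.i.d. sample.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
true_shape, true_scale = 0.2, 1.5
sample = genpareto.rvs(true_shape, loc=0.0, scale=true_scale, size=500, random_state=rng)

# MLE of shape and scale; the threshold (location) is held fixed at zero, which
# mirrors the usual re-parametrization that stabilises the likelihood.
shape_hat, _, scale_hat = genpareto.fit(sample, floc=0.0)
print(f"shape: true {true_shape}, ML {shape_hat:.3f}")
print(f"scale: true {true_scale}, ML {scale_hat:.3f}")
```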

Keywords: Bayesian estimation, generalized Pareto distribution, maximum likelihood estimation, sequential order statistics

Procedia PDF Downloads 477
8078 Decentralised Edge Authentication in the Industrial Enterprise IoT Space

Authors: C. P. Autry, A.W. Roscoe

Abstract:

Authentication protocols based on public key infrastructure (PKI) and trusted third party (TTP) are no longer adequate for industrial-scale IoT networks, owing to issues such as low compute and power availability, the use of widely distributed and commercial off-the-shelf (COTS) systems, and the increasingly sophisticated attackers and attacks we now have to counter. For example, there is increasing concern about nation-state-based interference and future quantum computing capability. We have examined this space from first principles and have developed several approaches to group and point-to-point authentication for IoT that do not depend on the use of a centralised client-server model. We emphasise the use of quantum-resistant primitives such as strong cryptographic hashing and the use of multi-factor authentication.
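
As an illustration of hash-based, PKI-free point-to-point authentication of the kind alluded to above, the sketch below shows a minimal HMAC challenge-response exchange between two devices that already share a symmetric key; the key-distribution step, device identifiers, and function names are assumptions for illustration, not the authors' protocol.

```python
# Minimal sketch: hash-based challenge-response authentication with a pre-shared key.
import hmac, hashlib, os

def make_challenge() -> bytes:
    """Verifier sends a fresh random nonce to prevent replay."""
    return os.urandom(32)

def respond(shared_key: bytes, challenge: bytes, device_id: bytes) -> bytes:
    """Prover binds its identity and the nonce under an HMAC (hash-based, no public keys)."""
    return hmac.new(shared_key, device_id + challenge, hashlib.sha256).digest()

def verify(shared_key: bytes, challenge: bytes, device_id: bytes, tag: bytes) -> bool:
    expected = hmac.new(shared_key, device_id + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)   # constant-time comparison

if __name__ == "__main__":
    k = os.urandom(32)                          # pre-shared key (distribution out of scope)
    nonce = make_challenge()
    tag = respond(k, nonce, b"sensor-17")       # "sensor-17" is an illustrative device id
    print("authenticated:", verify(k, nonce, b"sensor-17", tag))
```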

Keywords: authentication, enterprise IoT cybersecurity, PKI/TTP, IoT space

Procedia PDF Downloads 142
8077 Adaptive Approach Towards Comprehensive Urban Development Simulation in Coastal Regions: Case Study of New Alamein City, Egypt

Authors: Nada Mohamed, Abdel Aziz Mohamed

Abstract:

Climate change in coastal areas is a global issue that is felt at the local scale and will persist for decades and centuries to come; it also poses critical risks to a city’s economy, communities, and natural environment. One of these changes that poses a major risk to coastal cities is sea level rise (SLR), a consequence of the degradation of the global environmental system. Countries with a high development index (HDI), such as Japan and Germany, are the main contributors to climate change and global warming, while medium and low HDI countries, such as Egypt, do not have enough awareness or advanced tactics to adapt to the changes that destroy urban areas and cause losses in land and economy. This is why climate resilience is part of the UN Sustainable Development Goals 2030, which call for actions to strengthen climate change resilience through mitigation and adaptation. For many reasons, adaptation has received less attention than mitigation, and it is only recently that adaptation has become a global focal point. Adaptation can be achieved through actions such as upgrading the use and design of the land, adjusting the business and activities of people, and increasing community understanding of climate risks. To reach these adaptation goals, a strategic pathway to climate resilience has to be applied: the Urban Bioregionalism paradigm. Resiliency has been framed as persistence, adaptation, and transformation. A climate resilience decision support system includes a visualization platform where ecological, social, and economic information can be viewed alongside specific geographies; Urban Bioregionalism is accordingly a socio-ecological paradigm with the potential to move social attitudes toward environmental understanding and to deepen human-environment connections within ecological development. The research aim is to achieve an adaptive, integrated urban development model through the analysis of tactics and strategies that can be used to adapt urban areas and coastal communities to the challenges of climate change, especially SLR, and to develop a simulation model, using advanced software, for a coastal city corridor in order to elaborate the suitable strategy to apply.

Keywords: climate resilience, sea level rise, SLR, coastal resilience, adaptive development simulation

Procedia PDF Downloads 109
8076 Estimation of Maximum Earthquake for Gujarat Region, India

Authors: Ashutosh Saxena, Kumar Pallav, Ramji Dwivedi

Abstract:

The present study estimates the seismicity parameter 'b' and the maximum possible magnitude of an earthquake (Mmax) for the Gujarat region with three well-established methods, viz. the Kijko parametric model (KP), Kijko-Sellevoll-Bayes (KSB), and the Tapered Gutenberg-Richter (TGR) model, treating the region as a combined seismic source regime. The earthquake catalogue is prepared for the period 1330 to 2013 for the region extending from latitude 20° N to 25° N and longitude 68° E to 75° E, for earthquake moment magnitudes (Mw) ≥ 4.0. The 'a' and 'b' values estimated for the region are 4.68 and 0.58. Further, Mmax is estimated as 8.54 (± 0.29), 8.69 (± 0.48), and 8.12 with KP, KSB, and TGR, respectively.
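
For readers unfamiliar with how 'a' and 'b' values are obtained, the sketch below applies the standard Aki-Utsu maximum-likelihood estimator to a synthetic catalogue; it is background to, not a reproduction of, the KP, KSB, and TGR procedures used in the study, and the completeness magnitude and catalogue are invented.

```python
# Minimal sketch: maximum-likelihood Gutenberg-Richter 'b' and 'a' values.
import numpy as np

def aki_utsu_b(magnitudes, m_c, dm=0.0):
    """b = log10(e) / (mean(M) - (m_c - dm/2)); dm/2 corrects for binned magnitudes."""
    m = magnitudes[magnitudes >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

def gr_a_value(magnitudes, m_c, b):
    """a from the Gutenberg-Richter relation log10 N(M >= m_c) = a - b * m_c."""
    n = np.count_nonzero(magnitudes >= m_c)
    return np.log10(n) + b * m_c

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic catalogue with a true b-value of ~0.6 above Mw 4.0 (illustrative only).
    mags = 4.0 + rng.exponential(scale=np.log10(np.e) / 0.6, size=2000)
    b = aki_utsu_b(mags, m_c=4.0)
    print(f"b = {b:.2f}, a = {gr_a_value(mags, m_c=4.0, b=b):.2f}")
```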

Keywords: Mmax, seismicity parameter, Gujarat, Tapered Gutenberg-Richter

Procedia PDF Downloads 512
8075 Simulation of Focusing of Diamagnetic Particles in Ferrofluid Microflows with a Single Set of Overhead Permanent Magnets

Authors: Shuang Chen, Zongqian Shi, Jiajia Sun, Mingjia Li

Abstract:

Microfluidics is a technology in which small amounts of fluids are manipulated using channels with dimensions of tens to hundreds of micrometers. At present, this significant technology is required for several applications in fields including disease diagnostics, genetic engineering, and environmental monitoring. Among these fields, the manipulation of microparticles and cells in microfluidic devices, especially their separation, has aroused general concern. In a magnetic field, the separation methods include positive and negative magnetophoresis. By comparison, negative magnetophoresis is a label-free technology. It has many advantages, e.g., easy operation, low cost, and simple design. Before the separation of particles or cells, focusing them into a single tight stream is usually a necessary upstream operation. In this work, the focusing of diamagnetic particles in ferrofluid microflows with a single set of overhead permanent magnets is investigated numerically. The geometric model of the simulation is based on the configuration of previous experiments. The straight microchannel is 24 mm long and has a rectangular cross-section 100 μm in width and 50 μm in depth. Spherical diamagnetic particles of 10 μm in diameter are suspended in the ferrofluid. The initial concentration of the ferrofluid c₀ is 0.096%, and the flow rate of the ferrofluid is 1.8 mL/h. The magnetic field is induced by five identical rectangular neodymium-iron-boron permanent magnets (1/8 × 1/8 × 1/8 in.) and is calculated by the equivalent charge source (ECS) method. The flow of the ferrofluid is governed by the Navier-Stokes equations. The trajectories of particles are solved by the discrete phase model (DPM) in the ANSYS FLUENT program. The positions of diamagnetic particles are recorded by transient simulation. Compared with the results of the mentioned experiments, our simulation shows the consistent result that diamagnetic particles are gradually focused in the ferrofluid under the magnetic field. In addition, diamagnetic particle focusing is studied by varying the flow rate of the ferrofluid. In agreement with the experiment, the focusing improves as the flow rate increases. Furthermore, it is investigated how the focusing is affected by other factors, e.g., the width and depth of the microchannel, the concentration of the ferrofluid, and the diameter of the diamagnetic particles.
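
The driving mechanism of negative magnetophoresis can be illustrated with the usual dipole approximation for the force on a diamagnetic particle in a ferrofluid, F = V(χp − χf)∇(B²)/(2μ0). The sketch below evaluates this force along an assumed one-dimensional field profile; it is background to, not a substitute for, the ECS field computation and the Navier-Stokes/DPM simulation, and all property values are invented.

```python
# Minimal sketch: negative-magnetophoresis force in the dipole approximation.
import numpy as np

MU0 = 4e-7 * np.pi                               # vacuum permeability (T*m/A)

def magnetophoretic_force(B, z, radius, chi_p, chi_f):
    """Force (N) on a spherical particle along a 1D field-magnitude profile B(z)."""
    volume = 4.0 / 3.0 * np.pi * radius**3
    grad_B2 = np.gradient(B**2, z)               # d(B^2)/dz
    return volume * (chi_p - chi_f) * grad_B2 / (2.0 * MU0)

if __name__ == "__main__":
    z = np.linspace(0, 100e-6, 200)              # distance across the channel depth (m)
    B = 0.3 * np.exp(-z / 50e-6)                 # decaying field magnitude (T), illustrative
    F = magnetophoretic_force(B, z, radius=5e-6, # 10 um diameter particle
                              chi_p=-1e-5,       # diamagnetic particle (e.g. polystyrene)
                              chi_f=1e-3)        # dilute ferrofluid, illustrative value
    print("force at the wall (N):", F[0])        # negative (chi_p - chi_f) -> pushed toward weak field
```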

Keywords: diamagnetic particle, focusing, microfluidics, permanent magnet

Procedia PDF Downloads 109
8074 Value Co-Creation in Used-Car Auctions: A Service Scientific Perspective

Authors: Safdar Muhammad Usman, Youji Kohda, Katsuhiro Umemoto

Abstract:

The electronic marketplace plays an important intermediary role in connecting dealers and retail customers. The main aim of this paper is to design a value co-creation model for used-car auctions. More specifically, the study has been designed to describe the process of value co-creation in used-car auctions, to explore the values co-created in used-car auctions, and finally to conclude the paper by indicating future research directions. Our analysis shows that economic values as well as non-economic values are co-created in used-car auctions. In addition, this paper contributes to the academic community by broadening the view of value co-creation in service science.

Keywords: value co-creation, used-car auctions, non-financial values, service science

Procedia PDF Downloads 326
8073 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods in statistics, operations research, and machine learning to predict the future, or otherwise unknown events or outcomes on a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model forms with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, this study utilized a big data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics, such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training observations are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some are intentionally included to get predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. Then the rules are validated with an external testing dataset including 4,090 observations. Results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the hypotheses generated are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods utilizing a part of the observations, such as bootstrap resampling with an appropriate sample size.
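
The co-occurrence-graph idea can be illustrated in a few lines: items that appear together in a row become fully connected nodes, and fully connected groups (cliques) are ranked as candidate clusters. The sketch below uses networkx on invented transactions; it illustrates the concept only and does not reproduce the commercial analytics platform or its surprise metric.

```python
# Minimal sketch: a co-occurrence graph whose maximal cliques act as candidate clusters.
from itertools import combinations
from collections import Counter
import networkx as nx

transactions = [                                   # invented rows of co-occurring items
    {"high_bp", "abdominal_obesity", "low_hdl"},
    {"high_bp", "abdominal_obesity", "smoking"},
    {"high_bp", "abdominal_obesity", "low_hdl", "high_glucose"},
    {"smoking", "low_hdl"},
]

# Count pairwise co-occurrences and build the weighted graph.
pair_counts = Counter()
for row in transactions:
    pair_counts.update(combinations(sorted(row), 2))

g = nx.Graph()
for (a, b), freq in pair_counts.items():
    g.add_edge(a, b, weight=freq)

def clique_score(clique):
    """Total co-occurrence weight inside a fully connected group of items."""
    return sum(g[u][v]["weight"] for u, v in combinations(clique, 2))

# Candidate "clusters" = maximal cliques, ranked by size and then by weight.
cliques = sorted(nx.find_cliques(g), key=lambda c: (len(c), clique_score(c)), reverse=True)
for c in cliques[:3]:
    print(sorted(c), "score:", clique_score(c))    # plausible hypotheses to test later
```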

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 247
8072 Prediction of Alzheimer's Disease Based on Blood Biomarkers and Machine Learning Algorithms

Authors: Man-Yun Liu, Emily Chia-Yu Su

Abstract:

Alzheimer's disease (AD) is the public health crisis of the 21st century. AD is a degenerative brain disease and the most common cause of dementia, a costly disease for the healthcare system. Unfortunately, the cause of AD is poorly understood; furthermore, the treatments of AD so far can only alleviate symptoms rather than cure or stop the progression of the disease. Currently, there are several ways to diagnose AD: medical imaging can be used to distinguish between AD, other dementias, and early-onset AD, and cerebrospinal fluid (CSF) can be analysed. Compared with other diagnostic tools, a blood (plasma) test has advantages as an approach to population-based disease screening because it is simpler, less invasive, and also cost-effective. In our study, we used the blood biomarker dataset of the Alzheimer's Disease Neuroimaging Initiative (ADNI), which was funded by the National Institutes of Health (NIH), to do data analysis and develop a prediction model. We used independent analysis of datasets to identify plasma protein biomarkers predicting early-onset AD. Firstly, to compare the basic demographic statistics between the cohorts, we used SAS Enterprise Guide to do data preprocessing and statistical analysis. Secondly, we used logistic regression, a neural network, and a decision tree to validate biomarkers with SAS Enterprise Miner. This study used data from ADNI containing 146 blood biomarkers from 566 participants. Participants include cognitively normal (healthy) subjects, subjects with mild cognitive impairment (MCI), and patients suffering from Alzheimer's disease (AD). Participants' samples were separated into two group pairs: healthy and MCI, and healthy and AD, respectively. We used the two group pairs to compare important biomarkers of AD and MCI. In preprocessing, we used a t-test to filter 41/47 features between the two group pairs (healthy and AD, healthy and MCI) before using machine learning algorithms. Then we built models with four machine learning methods; the best AUCs of the two group pairs are 0.991 and 0.709, respectively. We want to stress that the simple, less invasive, common blood (plasma) test may also allow early diagnosis of AD. In our opinion, the result will provide evidence that blood-based biomarkers might be an alternative diagnostic tool before further examination with CSF and medical imaging. A comprehensive study on the differences in blood-based biomarkers between AD patients and healthy subjects is warranted. Early detection of AD progression will allow physicians the opportunity for early intervention and treatment.
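
A simplified version of the pipeline described above (t-test feature filtering followed by a classifier evaluated with AUC) can be sketched with scikit-learn standing in for SAS Enterprise Guide/Miner; since the ADNI data cannot be redistributed, the sketch below generates a synthetic stand-in, and the shapes and thresholds are illustrative.

```python
# Minimal sketch: t-test biomarker filtering followed by logistic regression with AUC.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n, p = 566, 146                                   # participants x blood biomarkers
X = rng.normal(size=(n, p))
y = rng.integers(0, 2, size=n)                    # 0 = healthy, 1 = AD (synthetic labels)
X[y == 1, :10] += 0.8                             # make the first 10 markers informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Filter: keep biomarkers whose group means differ (two-sample t-test, p < 0.05).
_, pvals = ttest_ind(X_tr[y_tr == 0], X_tr[y_tr == 1], axis=0)
keep = pvals < 0.05
print("biomarkers kept:", int(keep.sum()))

# Classify and report AUC, mirroring the logistic-regression arm of the study.
clf = LogisticRegression(max_iter=1000).fit(X_tr[:, keep], y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te[:, keep])[:, 1])
print(f"test AUC: {auc:.3f}")
```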

Keywords: Alzheimer's disease, blood-based biomarkers, diagnostics, early detection, machine learning

Procedia PDF Downloads 297
8071 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason behind using transformers is their innate ability (the attention mechanism) to take account of the semantic context present in long DNA chains that are heavily dependent on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from the vanishing gradient and low-efficiency problems when they sequentially process past states and compress contextual information into a bottleneck with long input sequences. In other words, these architectures are not equipped with the necessary attention mechanisms to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous works on genetic circuit design, the traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
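
To make the encoder-only (bidirectional) architecture concrete, the sketch below defines a small transformer classifier over k-mer token ids in PyTorch; it is a from-scratch toy model, not the pre-trained DNABERT variant used in the study, and the vocabulary size, sequence length, and dimensions are assumptions.

```python
# Minimal sketch: an encoder-only transformer classifier over tokenized DNA sequences.
import torch
import torch.nn as nn

class GeneExpressionClassifier(nn.Module):
    def __init__(self, vocab_size=4**6 + 1, d_model=128, nhead=4, num_layers=2, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=256, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)  # bidirectional self-attention
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):                  # (batch, seq_len) of k-mer ids
        h = self.encoder(self.embed(token_ids))    # every position attends to the whole sequence
        return self.head(h.mean(dim=1))            # pooled representation -> expression class

if __name__ == "__main__":
    model = GeneExpressionClassifier()
    fake_batch = torch.randint(1, 4**6, (8, 512))  # 8 sequences of 512 k-mer tokens (synthetic)
    logits = model(fake_batch)
    print(logits.shape)                            # torch.Size([8, 2])
```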

Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers

Procedia PDF Downloads 37
8070 Intracellular Sphingosine-1-Phosphate Receptor 3 Contributes to Lung Tumor Cell Proliferation

Authors: Michela Terlizzi, Chiara Colarusso, Aldo Pinto, Rosalinda Sorrentino

Abstract:

Sphingosine-1-phosphate (S1P) is a membrane-derived bioactive phospholipid exerting a multitude of effects on respiratory cell physiology and pathology through five S1P receptors (S1PR1-5). Higher levels of S1P have been registered in a broad range of respiratory diseases, including inflammatory disorders and cancer, although its exact role is still elusive. Based on our previous study, in which we found that S1P/S1PR3 is involved in an inflammatory pattern via the activation of Toll-like Receptor 9 (TLR9), highly expressed on lung cancer cells, the main goal of the current study was to better understand the involvement of the S1P/S1PR3 pathway/signaling during lung carcinogenesis, taking advantage of a mouse model of first-hand smoke exposure and of carcinogen-induced lung cancer. We used human samples of Non-Small Cell Lung Cancer (NSCLC), a mouse model of first-hand smoking, Benzo(a)pyrene (BaP)-induced tumor-bearing mice, and A549 lung adenocarcinoma cells. We found that the intranuclear, but not the membrane, localization of S1PR3 was associated with the proliferation of lung adenocarcinoma cells, a mechanism that correlated with human and mouse samples of smoke exposure and carcinogen-induced lung cancer, which were characterized by higher utilization of S1P. Indeed, the inhibition of the membrane S1PR3 did not alter tumor cell proliferation after TLR9 activation. Instead, in accordance with the nuclear localization of sphingosine kinase (SPHK) II, the enzyme responsible for the catalysis of the last step of S1P synthesis, the inhibition of the kinase completely blocked the endogenous S1P-induced tumor cell proliferation. These results prove that the endogenous TLR9-induced S1P can, on one side, favor pro-inflammatory mechanisms in the tumor microenvironment via the activation of cell surface receptors and, on the other, favor tumor progression via the nuclear S1PR3/SPHK II axis, highlighting a novel molecular mechanism that identifies S1P as one of the crucial mediators of lung carcinogenesis-associated inflammatory processes and that could provide differential therapeutic approaches, especially in non-responsive lung cancer patients.

Keywords: sphingosine-1-phosphate (S1P), S1P Receptor 3 (S1PR3), smoking-mice, lung inflammation, lung cancer

Procedia PDF Downloads 174
8069 Top Management Characteristics and Adoption of Internet Banking: Case Study of the Tunisian Banking Sector

Authors: Dorra Gherib

Abstract:

This article explores in depth the adoption of technological innovations by the top management of banks in the Tunisian banking sector. The framework of this research is based on an amalgamation of four theories related to the decision to adopt technological innovations: the Theory of Reasoned Action (TRA), the Theory of Planned Behaviour (TPB), the Technology Acceptance Model (TAM), and Diffusion of Innovation (DI). The results of our qualitative study highlight four variables which influence the attitude of top management towards the adoption of internet banking: relative advantage, perceived ease of use, compatibility, and perceived risk.

Keywords: top management, attitude, internet banking, TRA, TAM, TPB, DI

Procedia PDF Downloads 449
8068 The Impacts of Technology on Operations Costs: The Mediating Role of Operation Flexibility

Authors: Fazli Idris, Jihad Mohammad

Abstract:

The study aims to determine the impact of technology and service operations flexibility, which is divided into external flexibility and internal robustness, on operations costs. A mediation model is proposed that links technology to operations costs via operations flexibility. Drawing on a sample of 475 operations managers from various service sectors in Malaysia and South Africa, Structural Equation Modeling (SEM) was employed to test the relationships using Smart-PLS procedures. It was found that a significant relationship exists between technology and operations costs via both operations flexibility dimensions. Theoretical and managerial implications are offered to explain the results.
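
As a rough illustration of how a mediated effect (technology → flexibility → costs) can be quantified, the sketch below bootstraps the indirect effect a·b from simple regressions on synthetic data; it is a simplified stand-in for the Smart-PLS structural equation procedure used in the study, and all variable names and data are invented.

```python
# Minimal sketch: bootstrap test of an indirect (mediated) effect a*b.
import numpy as np

rng = np.random.default_rng(1)
n = 475
technology = rng.normal(size=n)
flexibility = 0.5 * technology + rng.normal(scale=0.8, size=n)       # mediator
costs = -0.4 * flexibility - 0.1 * technology + rng.normal(size=n)   # outcome

def ols(X, y):
    """OLS coefficients with an intercept prepended to X."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def indirect_effect(idx):
    a = ols(technology[idx], flexibility[idx])[1]                    # technology -> flexibility
    b = ols(np.column_stack([flexibility[idx], technology[idx]]),    # flexibility -> costs,
            costs[idx])[1]                                           # controlling for technology
    return a * b

boot = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {indirect_effect(np.arange(n)):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```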

Keywords: Operations flexibility, technology, costs, mediation

Procedia PDF Downloads 589
8067 An Experimental Study of Diffuser-Enhanced Propeller Hydrokinetic Turbines

Authors: Matheus Nunes, Rafael Mendes, Taygoara Felamingo Oliveira, Antonio Brasil Junior

Abstract:

Wind tunnel experiments on a horizontal-axis propeller hydrokinetic turbine model were carried out in order to determine the performance behavior for different configurations and operational ranges. The present experiments introduce the use of two different geometries of rear diffusers to enhance the performance of the free-flow machine. The present paper reports an increase in the power coefficient of about 50%-80%. It represents an important feature that has to be taken into account in the design of this kind of machine.
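
The performance metric behind the reported gain is the power coefficient, Cp = P / (0.5 ρ A V³), referenced to the bare-rotor swept area. The sketch below computes it for a bare and a diffuser-augmented case; all numerical values are invented for illustration and are not the measured data.

```python
# Minimal sketch: power coefficient of a bare versus a diffuser-augmented rotor.
import numpy as np

def power_coefficient(power_w, rho, rotor_diameter_m, flow_speed_ms):
    area = np.pi * (rotor_diameter_m / 2.0) ** 2          # swept area of the bare rotor
    return power_w / (0.5 * rho * area * flow_speed_ms**3)

if __name__ == "__main__":
    cp_bare = power_coefficient(5.0, rho=1.2, rotor_diameter_m=0.25, flow_speed_ms=8.0)
    cp_diff = power_coefficient(8.0, rho=1.2, rotor_diameter_m=0.25, flow_speed_ms=8.0)
    print(f"bare Cp = {cp_bare:.2f}, diffuser Cp = {cp_diff:.2f}, gain = {cp_diff/cp_bare - 1:.0%}")
```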

Keywords: diffuser-enhanced turbines, hydrokinetic turbine, wind tunnel experiments, micro hydro

Procedia PDF Downloads 241
8066 Realization and Characterization of TiN Coating and Metal Working Application

Authors: Nadjette Belhamra, Abdelouahed Chala, Ibrahim Guasmi

Abstract:

Titanium nitride coatings have been extensively used in industry, for example in cutting tools. TiN coatings were deposited by chemical vapour deposition (CVD) on carbide inserts at a temperature between 850°C and 1100°C, which often exceeds the hardening treatment temperature of the metals. The objective of this work is to produce and characterize a TiN coating and to apply it in the turning of 42CrMo4 steel under lubrication. Various experimental techniques were employed for the microstructural characterization of the coatings, e.g., X-ray diffraction (XRD) and a scanning electron microscope (SEM), model JEOL JSM-5900 LV, equipped with energy dispersive X-ray (EDX) analysis. The results show that the TiN-coated inserts demonstrate good wear resistance.

Keywords: hard coating TiN, carbide inserts, machining, turning, wear

Procedia PDF Downloads 529
8065 The Road Ahead: Merging Human Cyber Security Expertise with Generative AI

Authors: Brennan Lodge

Abstract:

Amidst a complex regulatory landscape, Retrieval Augmented Generation (RAG) emerges as a transformative tool for Governance Risk and Compliance (GRC) officers. This paper details the application of RAG in synthesizing Large Language Models (LLMs) with external knowledge bases, offering GRC professionals an advanced means to adapt to rapid changes in compliance requirements. While the development of standalone LLMs is exciting, such models do have their downsides: LLMs cannot easily expand or revise their memory, they cannot straightforwardly provide insight into their predictions, and they may produce “hallucinations.” Leveraging a pre-trained seq2seq transformer and a dense vector index of domain-specific data, this approach integrates real-time data retrieval into the generative process, enabling gap analysis and the dynamic generation of compliance and risk management content. We delve into the mechanics of RAG, focusing on its dual structure that pairs parametric knowledge contained within the transformer model with non-parametric data extracted from an updatable corpus. This hybrid model enhances decision-making through context-rich insights, drawing from the most current and relevant information, thereby enabling GRC officers to maintain a proactive compliance stance. Our methodology aligns with the latest advances in neural network fine-tuning, providing a granular, token-level application of retrieved information to inform and generate compliance narratives. By employing RAG, we exhibit a scalable solution that can adapt to novel regulatory challenges and cybersecurity threats, offering GRC officers a robust, predictive tool that augments their expertise. The granular application of RAG’s dual structure not only improves compliance and risk management protocols but also informs the development of compliance narratives with pinpoint accuracy. It underscores AI’s emerging role in strategic risk mitigation and proactive policy formation, positioning GRC officers to anticipate and navigate the complexities of regulatory evolution confidently.
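
The retrieve-then-generate flow described above can be sketched in a few lines: retrieve the most relevant compliance passages for a query, then condition the generator on them. In the sketch below, TF-IDF retrieval stands in for the dense vector index and a prompt-assembling placeholder stands in for the pre-trained seq2seq transformer; the policy documents are invented.

```python
# Minimal sketch: retrieval-augmented generation over a small compliance corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Access to production systems requires multi-factor authentication.",
    "Vendors processing personal data must sign a data processing agreement.",
    "Encryption keys must be rotated at least every twelve months.",
]

vectorizer = TfidfVectorizer().fit(corpus)
doc_vectors = vectorizer.transform(corpus)          # the (non-parametric) retrievable corpus

def retrieve(query: str, k: int = 2):
    sims = cosine_similarity(vectorizer.transform([query]), doc_vectors)[0]
    return [corpus[i] for i in sims.argsort()[::-1][:k]]

def generate(query: str, passages) -> str:
    # Placeholder for the parametric seq2seq model: here we only assemble the
    # grounded prompt that such a model would receive.
    context = "\n".join(f"- {p}" for p in passages)
    return f"Question: {query}\nRetrieved policy context:\n{context}\nDraft answer: <model output here>"

if __name__ == "__main__":
    q = "How often do we need to rotate encryption keys?"
    print(generate(q, retrieve(q)))
```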

Keywords: cybersecurity, gen AI, retrieval augmented generation, cybersecurity defense strategies

Procedia PDF Downloads 61
8064 Laser Cooling of Internal Degrees of Freedom of Molecules: Cesium Case

Authors: R. Horchani

Abstract:

The optical pumping technique with laser fields, combined with photo-association of ultra-cold atoms, makes it possible to control on demand the vibrational and/or rotational population of molecules. Here, we review the basic concepts and the main steps that should be followed, including the excitation schemes and detection techniques we use to achieve the ro-vibrational cooling of Cs2 molecules. We also discuss the extension of this technique to other molecules. In addition, we present a theoretical model used to support the experiment. These simulations can be widely used for the preparation of various experiments since they allow the optimization of several important experimental parameters.

Keywords: cold molecule, photo-association, optical pumping, vibrational and rotational cooling

Procedia PDF Downloads 271
8063 The Effects of Cultural Distance and Institutions on Foreign Direct Investment Choices: Evidence from Turkey and China

Authors: Nihal Kartaltepe Behram, Göksel Ataman, Dila Okçu

Abstract:

With the development of foreign direct investment, the social, cultural, political and economic interactions between countries and institutions have become visible, and they have become determining factors for strategic structuring and market goals. In this context, the purpose of this study is to investigate the effects of cultural distance and institutions on foreign direct investment choices in terms of location and investment model. For international establishments, the concept of culture, as well as the concept of cultural distance, is taken specifically into consideration, especially in the selection of methods for entering the market. In the research and empirical studies conducted, a direct relationship between cultural distance and foreign direct investment is established, and institutions and other effective variables are examined when defining the investment types. When detailed calculation strategies and empirical research and studies are taken into consideration, the most common methods for determining the direct investment model, considering cultural distances, are full-ownership enterprises and joint ventures. Also, when all of the factors affecting the investments are taken into consideration, it was seen that the effect of institutions such as government intervention, intellectual property rights, corruption, and contract enforcement is very important. Furthermore, agglomeration has a more intense and effective influence on investment compared to other factors. China has been selected as the target country due to its effectiveness in the world economy and its contributions to the developing countries with which it has commercial relationships. Qualitative research methods are used in this study to measure the effects of the determinative variables in the study's hypotheses on foreign direct investors and to evaluate the findings. In this study, in-depth interviews are used as the data collection method, and the data analysis is made through descriptive analysis. That foreign direct investments are highly reactive to institutions and cultural distance is identified by all interviews and analyses. On the other hand, agglomeration is the strongest determining factor for foreign direct investors in the Chinese market. That the other factors, which comprise the sectoral aggregate, are not as strong as agglomeration is the most important finding. We expect this study to become a beneficial guideline for developed and developing countries and for local and national institutions' strategic plans.

Keywords: China, cultural distance, Foreign Direct Investments, institutions

Procedia PDF Downloads 392
8062 Application of Shore Protective Structures in Optimum Land Using of Defense Sites Located in Coastal Cities

Authors: Mir Ahmad Lashteh Neshaei, Hamed Afsoos Biria, Ata Ghabraei, Mir Abdolhamid Mehrdad

Abstract:

Awareness of effective land use issues in coastal areas, including the protection of natural ecosystems and the coastal environment, is of great importance due to the increase of human settlement along the coast. There are numerous valuable structures and heritages which are located in defence sites and waterfront areas. Marine structures such as groins, sea walls and detached breakwaters are constructed along the coast to improve coastal stability against bed erosion due to changing wave and climate patterns. Marine mechanisms and their interaction with shore protection structures need to be intensively studied. Groins are one of the most prominent structures used in shore protection to create a safe environment for coastal areas by protecting the land against progressive coastal erosion. The main structural function of a groin is to control the longshore current and littoral sediment transport. This structure can be submerged and provide the necessary beach protection without negative environmental impact. However, for submerged structures adopted for beach protection, the shoreline response to these structures is not well understood at present. Nowadays, modelling and computer simulation are used to assess beach morphology in the vicinity of marine structures to reduce their environmental impact. The objective of this study is to predict the beach morphology in the vicinity of submerged groins and to compare it with that of non-submerged groins, with focus on a part of the coast located in Dahane Sar Sefidrood, Guilan province, Iran, where serious coastal erosion has occurred recently. The simulations were obtained using a one-line model, which can be used as a first approximation of shoreline prediction in the vicinity of groins. The results of the proposed model are compared with field measurements to determine the shape of the coast. Finally, the results of the present study show that using submerged groins can be effective in controlling beach erosion without causing severe environmental impact to the coast. The important outcomes of this study can be employed in the optimum design of defence sites in coastal cities to improve their efficiency in terms of re-using heritage lands.
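
For readers unfamiliar with the one-line approach, the sketch below implements a minimal explicit finite-difference version in which shoreline change follows a diffusion-type equation and an emerged groin is represented as a zero-transport boundary; the grid, transport, and diffusivity values are invented, and this first approximation does not reproduce the authors' calibrated model or the submerged-groin case.

```python
# Minimal sketch: explicit one-line shoreline model with a groin blocking longshore transport.
import numpy as np

def one_line_model(q_amb=0.002,    # ambient longshore transport per unit closure depth (m^2/s), invented
                   eps=0.05,       # shoreline diffusivity (m^2/s), invented
                   length=2000.0, nx=201, dt=600.0, t_end=365 * 24 * 3600.0):
    """Return the shoreline advance y(x) in metres after t_end seconds."""
    dx = length / (nx - 1)
    assert eps * dt / dx**2 < 0.5, "explicit scheme stability limit"
    y = np.zeros(nx)                                    # initially straight shoreline
    q = np.empty(nx + 1)                                # transport at cell interfaces
    for _ in range(int(t_end / dt)):
        q[1:-1] = q_amb - eps * np.diff(y) / dx         # transport reduced by shoreline reorientation
        q[0] = q_amb                                    # open updrift boundary supplies sediment
        q[-1] = 0.0                                     # emerged groin blocks all longshore transport
        y -= dt * np.diff(q) / dx                       # dy/dt = -dQ/dx
    return y

if __name__ == "__main__":
    y = one_line_model()
    print(f"accretion against the groin after one year: {y[-1]:.1f} m")
```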

Keywords: submerged structures, groin, shore protective structures, coastal cities

Procedia PDF Downloads 294
8061 MAOD Is Estimated by Sum of Contributions

Authors: David W. Hill, Linda W. Glass, Jakob L. Vingren

Abstract:

Maximal accumulated oxygen deficit (MAOD), the gold standard measure of anaerobic capacity, is the difference between the oxygen cost of exhaustive severe-intensity exercise and the accumulated oxygen consumption (O2; mL·kg–1). In theory, MAOD can be estimated as the sum of independent estimates of the phosphocreatine and glycolysis contributions, which we refer to as PCr+glycolysis. Purpose: The purpose was to test the hypothesis that PCr+glycolysis provides a valid measure of anaerobic capacity in cycling and running. Methods: The participants were 27 women (mean ± SD, age 22 ± 1 y, height 165 ± 7 cm, weight 63.4 ± 9.7 kg) and 25 men (age 22 ± 1 y, height 179 ± 6 cm, weight 80.8 ± 14.8 kg). They performed two exhaustive tests, one cycling and one running, at speeds and work rates that were tolerable for ~5 min. The rate of oxygen consumption (VO2; mL·kg–1·min–1) was measured in warmups, in the tests, and during 7 min of recovery. Fingerprick blood samples obtained after exercise were analysed to determine peak blood lactate concentration (PeakLac). The VO2 response in exercise was fitted to a model with a fast ‘primary’ phase followed by a delayed ‘slow’ component, from which the accumulated O2 and the excess O2 attributable to the slow component were calculated. The VO2 response in recovery was fitted to a model with a fast phase and a slow component sharing a common time delay. Oxygen demand (in mL·kg–1·min–1) was determined by extrapolation from steady-state VO2 in warmups; the total oxygen cost (in mL·kg–1) was determined by multiplying this demand by time to exhaustion and adding the excess O2; then, MAOD was calculated as total oxygen cost minus accumulated O2. The phosphocreatine contribution (area under the fast phase of the post-exercise VO2) and the glycolytic contribution (converted from PeakLac) were summed to give PCr+glycolysis. There was no interaction effect involving sex, so values for anaerobic capacity were examined using a two-way ANOVA, with repeated measures across method (PCr+glycolysis vs MAOD) and mode (cycling vs running). Results: There was a significant effect only for exercise mode. There was no difference between MAOD and PCr+glycolysis: values were 59 ± 6 mL·kg–1 and 61 ± 8 mL·kg–1 in cycling and 78 ± 7 mL·kg–1 and 75 ± 8 mL·kg–1 in running. Discussion: PCr+glycolysis is a valid measure of anaerobic capacity in cycling and running, and it is as valid for women as for men.
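
The two estimates being compared reduce to simple arithmetic once the model fits are in hand, as the sketch below illustrates: MAOD is total oxygen cost minus accumulated O2, and PCr+glycolysis sums the fast-phase recovery area (amplitude × tau) with a lactate-derived O2 equivalent. The 3.0 mL·kg–1 per mmol·L–1 lactate conversion and all numbers are illustrative assumptions, not the study's data.

```python
# Minimal sketch: MAOD versus the sum of PCr and glycolytic contributions.

def maod(demand_ml_kg_min, time_to_exhaustion_min, excess_o2_ml_kg, accumulated_o2_ml_kg):
    """MAOD = (demand * time + excess O2 from the slow component) - accumulated O2."""
    total_cost = demand_ml_kg_min * time_to_exhaustion_min + excess_o2_ml_kg
    return total_cost - accumulated_o2_ml_kg

def pcr_plus_glycolysis(fast_amplitude_ml_kg_min, fast_tau_min, peak_lactate_mmol_l,
                        o2_per_lactate=3.0):       # assumed O2 equivalent per mmol/L lactate
    pcr = fast_amplitude_ml_kg_min * fast_tau_min  # area under the fast recovery phase
    glycolysis = peak_lactate_mmol_l * o2_per_lactate
    return pcr + glycolysis

if __name__ == "__main__":
    print("MAOD (mL/kg):", maod(55.0, 5.0, 6.0, 220.0))
    print("PCr+glycolysis (mL/kg):", pcr_plus_glycolysis(35.0, 0.8, 11.0))
```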

Keywords: alactic, anaerobic, cycling, ergometer, glycolysis, lactic, lactate, oxygen deficit, phosphocreatine, running, treadmill

Procedia PDF Downloads 111
8060 Dense and Quality Urban Living: A Comparative Study on Architectural Solutions in the European City

Authors: Flavia Magliacani

Abstract:

The urbanization of the last decades and the resulting urban growth entail problems for both environmental and economic sustainability. From this perspective, sustainable settlement development requires a decrease in the horizontal spread of the existing urban structure in order to enhance its concentration. Hence, new stratifications of the city fabric and architectural strategies ensuring high-density settlement models are possible solutions. However, although increasing housing density is necessary, it is not sufficient. Guaranteeing the quality of living is, indeed, equally essential. In order to meet this objective, many other factors come to light, namely the relationship between private and public spaces, the proximity to services, the accessibility of public transport, local lifestyle habits, and social needs. Therefore, how can both quality and density be safeguarded in human habitats? The present paper attempts to answer this main research question by addressing several sub-questions: Which architectural types meet the dual need for urban density and housing quality? Which project criteria should be taken into consideration by good design practices? What principles are desirable for future planning? The research will analyse different architectural responses adopted in four European cities: Paris, Lyon, Rotterdam, and Amsterdam. In particular, it will develop a qualitative and comparative study of two specific architectural solutions which integrate housing density and quality living: on the one hand, the so-called 'self-contained city' model; on the other hand, the French 'Habitat Dense Individualisé' one. The structure of the paper will be as follows: the first part will develop a qualitative evaluation of some case studies, emblematic examples of the two above-mentioned architectural models. The second part will focus on the comparison among the chosen case studies. Finally, some conclusions will be drawn. The methodological approach, therefore, combines qualitative and comparative research. Parameters will be defined in order to highlight the potential and critical aspects of each model in light of an interdisciplinary view. In conclusion, the present paper aims at shedding light on design approaches which ensure the right balance between density and quality of urban living in contemporary European cities.

Keywords: density, future design, housing quality, human habitat

Procedia PDF Downloads 84
8059 Multicellular Cancer Spheroids as an in Vitro Model for Localized Hyperthermia Study

Authors: Kamila Dus-Szachniewicz, Artur Bednarkiewicz, Katarzyna Gdesz-Birula, Slawomir Drobczynski

Abstract:

In modern oncology, hyperthermia (HT) is defined as controlled tumor heating. HT treatment temperatures range between 40 and 48 °C and can selectively damage heat-sensitive cancer cells or limit their further growth, usually with minimal injury to healthy tissues. Despite many advantages, conventional whole-body and regional hyperthermia have clinically relevant side effects, including cardiac and vascular disorders. Additionally, the lack of accessibility of deep-seated tumor sites and the impaired targeting of micrometastases render HT less effective. It is believed that the above disadvantages can be significantly overcome by the application of biofunctionalized microparticles, which can specifically target tumor sites and become activated by an external stimulus to provide a sufficient cellular response. In our research, a unique optical tweezers system has enabled capturing silica microparticles, primary cells, and tumor spheroids in a highly controllable and reproducible environment to study the impact of localized heat stimulation on normal and pathological cells and within a multicellular tumor spheroid. A high-throughput spheroid model was introduced to better mimic the response of tumors in vivo to HT treatment. Additionally, local heating of tumor spheroids was applied under strictly controlled conditions resembling the tumor microenvironment (temperature, pH, hypoxia, etc.), in response to localized and non-homogeneous hyperthermia in the extracellular matrix, which promotes tumor progression and metastatic spread. The lack of precise control over these well-defined parameters in basic research leads to discrepancies in the response of tumor cells to new treatment strategies in preclinical animal testing. The developed approach also enables sorting out subclasses of cells which exhibit partial or total resistance to therapy, in order to understand fundamental aspects of the resistance shown by given tumor cells in response to a given therapy mode and conditions. This work was funded by the National Science Centre (NCN, Poland) under grant no. UMO-2017/27/B/ST7/01255.

Keywords: cancer spheroids, hyperthermia, microparticles, optical tweezers

Procedia PDF Downloads 109
8058 A Multilevel Approach for Stroke Prediction Combining Risk Factors and Retinal Images

Authors: Jeena R. S., Sukesh Kumar A.

Abstract:

Stroke is one of the major causes of adult disability and morbidity in many developing countries, like India. Early diagnosis of stroke is essential for timely prevention and cure. Various conventional statistical methods and computational intelligence models have been developed for predicting the risk and outcome of stroke. This research work focuses on a multilevel approach for predicting the occurrence of stroke based on various risk factors and non-invasive techniques like retinal imaging. This risk prediction model can aid in clinical decision making and help patients to have an improved and reliable risk prediction.

Keywords: prediction, retinal imaging, risk factors, stroke

Procedia PDF Downloads 276
8057 Two Kinds of Self-Oscillating Circuits Mechanically Demonstrated

Authors: Shiang-Hwua Yu, Po-Hsun Wu

Abstract:

This study introduces two types of self-oscillating circuits that are frequently found in power electronics applications. Special effort is made to relate the circuits to the analogous mechanical systems of some important scientific inventions: Galileo’s pendulum clock and Coulomb’s friction model. A little touch of related history and philosophy of science will hopefully encourage curiosity, advance the understanding of self-oscillating systems and satisfy the aspiration of some students for scientific literacy. Finally, the two self-oscillating circuits are applied to design a simple class-D audio amplifier.
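
One of the two circuit families, the sigma-delta/class-D loop, can be demonstrated with a few lines of simulation: an integrator driven by the error between the input and a hysteretic comparator settles into a limit cycle whose duty ratio encodes the input. The sketch below is an idealized illustration with invented component values, not the authors' circuit.

```python
# Minimal sketch: a first-order self-oscillating loop (integrator + hysteretic comparator).
import numpy as np

def simulate(u=0.3, hysteresis=0.05, gain=2000.0, dt=1e-6, n=20000):
    """Return the binary output of the loop for a constant input u in (-1, 1)."""
    integ, out = 0.0, 1.0
    y = np.empty(n)
    for k in range(n):
        integ += gain * (u - out) * dt            # integrator accumulates the error
        if integ > hysteresis:                    # hysteretic comparator flips the switch
            out = 1.0
        elif integ < -hysteresis:
            out = -1.0
        y[k] = out
    return y

if __name__ == "__main__":
    y = simulate()
    duty = (y > 0).mean()
    switches = np.count_nonzero(np.diff(y))
    print(f"duty ratio ~ {duty:.2f} (expected (1+u)/2 = 0.65), {switches} switching events")
```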

Keywords: self-oscillation, sigma-delta modulator, pendulum clock, Coulomb friction, class-D amplifier

Procedia PDF Downloads 331
8056 Induction Machine Bearing Failure Detection Using Advanced Signal Processing Methods

Authors: Abdelghani Chahmi

Abstract:

This article examines the detection and localization of faults in electrical systems, particularly those using asynchronous machines. First, the failure process will be characterized and the relevant symptoms will be defined; based on those processes and symptoms, a model of the malfunctions will be obtained. Second, the development of the diagnosis of the machine will be shown. As studies of malfunctions in electrical systems could only rely on a small amount of experimental data, it has been essential to provide ourselves with simulation tools which allowed us to characterize the faulty behavior. Fault detection uses signal processing techniques in known operating phases.
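
One standard signal-processing route for bearing-fault detection is envelope analysis: demodulate the vibration signal with the Hilbert transform and look for the bearing defect frequency in the envelope spectrum. The sketch below applies this to a synthetic signal; the defect and resonance frequencies are invented, and the technique is shown as a representative example rather than the authors' exact method.

```python
# Minimal sketch: envelope-spectrum detection of a bearing defect frequency.
import numpy as np
from scipy.signal import hilbert

fs = 20_000                                       # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
f_defect, f_carrier = 87.0, 3_000.0               # defect rate and excited structural resonance (Hz)

# Synthetic faulty-bearing vibration: a resonance carrier amplitude-modulated at
# the defect rate (a simplified stand-in for the impact train), plus noise.
signal = (1 + 0.8 * np.sin(2 * np.pi * f_defect * t)) * np.sin(2 * np.pi * f_carrier * t)
signal += 0.3 * np.random.default_rng(0).standard_normal(len(t))

envelope = np.abs(hilbert(signal))                # demodulate the high-frequency carrier
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)

low = freqs < 500                                 # bearing defect frequencies live in this band
peak = freqs[low][np.argmax(spectrum[low])]
print(f"dominant envelope line: {peak:.1f} Hz (simulated defect frequency: {f_defect} Hz)")
```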

Keywords: induction motor, modeling, bearing damage, airgap eccentricity, torque variation

Procedia PDF Downloads 115
8055 The Implementation of Inclusive Education in Collaboration between Teachers of Special Education Classes and Regular Classes in a Preschool

Authors: Chiou-Shiue Ko

Abstract:

As is explicitly stipulated in Article 7 of the Enforcement Rules of the Special Education Act as amended in 1998, "in principle, children with disabilities should be integrated with normal children for preschool education". Since then, all cities and counties have been committed to promoting preschool inclusive education. The Education Department, New Taipei City Government, has been actively recruiting advisory groups of professors to assist in the implementation of inclusive education in preschools since 2001. Since 2011, the author of this study has been guiding Preschool Rainbow in implementing inclusive education. Through field observations, meetings, and teaching demonstration seminars, this study explored how inclusive education has been successfully implemented through collaboration between teachers of special education classes and regular classes in Preschool Rainbow. The implementation phases for inclusive education in a single academic year include the following: 1) Preparatory stage. Prior to implementation, teachers in special education and regular classes discuss ways of conducting inclusive education and organize reading clubs to read books related to curriculum modifications that integrate the eight education strategies, early treatment and education, and early childhood education programs, in order to enhance their capacity to implement and compose teaching plans for inclusive education. In addition to the general objectives of inclusive education, the objective of inclusive education for special children is also embedded into the Individualized Education Program (IEP). 2) Implementation stage. Initially, a promotional program for special education is implemented for the children to allow all the children in the preschool to understand their own special qualities and those of special children. After the implementation of three weeks of reverse inclusion, the children in the special education classes are put into groups and enter the regular classes twice a week to implement adjustments to their inclusion in the learning areas and the curriculum. In 2013, further cooperation was carried out with adjacent hospitals to perform developmental screening activities for the early detection of children with developmental delays. 3) Review and reflection stage. After the implementation of inclusive education, all teachers in the preschool are divided into two groups to record their teaching plans and the lessons learned during implementation. The effectiveness of implementing the objectives of inclusive education is also reviewed. With the collaboration of all teachers, in 2015, Preschool Rainbow won New Taipei City’s “Preschool Light” award as an exceptional model for inclusive education. Its model of implementing inclusive education can be used as a reference for other preschools.

Keywords: collaboration, inclusive education, preschool, teachers, special education classes, regular classes

Procedia PDF Downloads 398
8054 Coffee Consumption Has No Acute Effects on Glucose Metabolism in Healthy Men: A Randomized Crossover Clinical Trial

Authors: Caio E. G. Reis, Sara Wassell, Adriana L. Porto, Angélica A. Amato, Leslie J. C. Bluck, Teresa H. M. da Costa

Abstract:

Background: Multiple epidemiologic studies have consistently reported an association between increased coffee consumption and a lowered risk of Type 2 Diabetes Mellitus. However, the mechanisms behind this finding have not been fully elucidated. Objective: We investigate the effect of coffee (caffeinated and decaffeinated) on glucose effectiveness and insulin sensitivity using the stable isotope minimal model protocol with oral glucose administration in healthy men. Design: Fifteen healthy men underwent a 5-arm, randomized, crossover, single-blind (researchers) clinical trial. They consumed decaffeinated coffee, caffeinated coffee (with and without sugar), and controls – water (with and without sugar) – followed 1 hour later by an oral glucose tolerance test (75 g of available carbohydrate) with intravenous labeled dosing, interpreted by the two-compartment minimal model (225 minutes). One-way ANOVA with Bonferroni adjustment was used to compare the effects of the tested beverages on glucose metabolism parameters. Results: Decaffeinated coffee resulted in 29% and 85% higher insulin sensitivity compared with caffeinated coffee and water, respectively, and caffeinated coffee showed 15% and 60% higher glucose effectiveness compared with decaffeinated coffee and water, respectively. However, these differences were not significant (p > 0.10). In the overall analysis (0–225 min) there were no significant differences in glucose effectiveness, insulin sensitivity, or the glucose and insulin areas under the curve between the groups. The beneficial effects of coffee did not seem to act in the short term (hours) on glucose metabolism parameters, mainly insulin sensitivity indices. The benefits of coffee consumption occur in the long term (years), as has been shown by the reduction of Type 2 Diabetes Mellitus risk in epidemiological studies. The clinical relevance of the present findings is that there is no need to avoid coffee as the drink of choice for healthy people. Conclusions: The findings of this study demonstrate that the consumption of caffeinated and decaffeinated coffee, with or without sugar, has no acute effects on glucose metabolism in healthy men. Further research, including long-term interventional studies, is needed to fully elucidate the mechanisms behind the effects of coffee on reduced risk for Type 2 Diabetes Mellitus.

Keywords: coffee, diabetes mellitus type 2, glucose, insulin

Procedia PDF Downloads 409
8053 Universality and Synchronization in Complex Quadratic Networks

Authors: Anca Radulescu, Danae Evans

Abstract:

The relationship between a network’s hardwiring and its emergent dynamics is central to neuroscience. We study the principles of this correspondence in a canonical setup (in which network nodes exhibit well-studied complex quadratic dynamics) and then test their universality in biological networks. By extending methods from discrete dynamics, we study the effects of network connectivity on temporal patterns, encapsulating long-term behavior into the rich topology of network Mandelbrot sets. Elements of fractal geometry can then be used to predict and classify network behavior.
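
One plausible way to make the 'network Mandelbrot set' idea concrete is sketched below: every node iterates z → z² + c on a weighted combination of its neighbours' states, and a parameter c is kept when all orbits stay bounded. The 3-node adjacency matrix, escape radius, and iteration count are illustrative assumptions, not the authors' exact definition.

```python
# Minimal sketch: bounded-orbit test for a network of coupled complex quadratic maps.
import numpy as np

A = np.array([[0.0, 1.0, 0.0],        # row-normalised adjacency (who listens to whom)
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

def orbit_bounded(c, adjacency, n_iter=100, escape=2.0):
    z = np.zeros(adjacency.shape[0], dtype=complex)
    for _ in range(n_iter):
        z = (adjacency @ z) ** 2 + c              # networked quadratic iteration
        if np.any(np.abs(z) > escape):
            return False
    return True

if __name__ == "__main__":
    # Coarse scan of the parameter plane: the set of bounded c values plays the
    # role of the network Mandelbrot set. (A single node with A = [[1]] recovers
    # the classical Mandelbrot set.)
    re = np.linspace(-2.0, 0.75, 60)
    im = np.linspace(-1.25, 1.25, 40)
    inside = sum(orbit_bounded(complex(x, y), A) for y in im for x in re)
    print(f"{inside} of {re.size * im.size} sampled c values stay bounded")
```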

Keywords: canonical model, complex dynamics, dynamic networks, fractals, Mandelbrot set, network connectivity

Procedia PDF Downloads 284
8052 Modelling Strategy Planning in Multi Business Companies

Authors: Gelareh Changizi, Mahsa Khajavi, Ladan Shahhosseini

Abstract:

Corporate-level strategy, or simply ‘parent strategy’, is a topic that has received much attention since the very early days of the strategic planning field. Since multi-level enterprises have different sub-enterprises which deal with different business environments, the same strategic perspective cannot be defined for all of them. Therefore, the determination of a perspective to manage and deal with the affiliates of such enterprises is the main challenge. The parent strategy at the mother-enterprise level has been analyzed in this research. A case study has been carried out to comprehensively describe the proposed model.

Keywords: parent strategy, multi-business companies, performance evaluation, lifecycle

Procedia PDF Downloads 341