Search results for: convergence and smoothness
162 The Theory of the Mystery: Unifying the Quantum and Cosmic Worlds
Authors: Md. Najiur Rahman
Abstract:
This hypothesis, named ‘The Theory of the Mystery’, proposes a profound and symmetrical connection that crosses the boundaries of quantum physics and cosmology and seeks to revise our understanding of the fundamental building blocks of the cosmos. The theory rests on an elegantly simple equation, R = ∆r / √∆m, which establishes a relationship between the radius (R) of an elementary particle or galaxy, the relative change in radius (∆r), and the mass difference (∆m) between related entities. The formula is presented as describing a synchronization in which every basic particle and every celestial entity falls into alignment with its respective mass and radius. In addition, a supporting equation defines the mass-radius connection of an entity as R = √m / N, where N is an empirically established constant, determined to be approximately 42.86 kg/m, representing the proportionality between mass and radius. The hypothesis offers predictions, gathers empirical evidence, and explores its consequences for theories such as General Relativity. This symmetry points to a proposed fundamental principle underpinning the cosmos: each component, whether small or large, follows a precise mass-radius relationship and exerts gravity according to a universal law. The hypothesis is put forward as a step towards a unified theory of physics, and the pursuit of experimental verification is intended to show that each particle and galaxy is bound by gravity and plays a unique but harmonious role in shaping the universe. It promises to reveal the great symphony of the cosmos, and its predictive power invites the exploration of entities at the farthest reaches of the universe, providing a bridge between the known and the unknown.
Keywords: unified theory, quantum gravity, mass-radius relationship, dark matter, uniform gravity
Procedia PDF Downloads 103
161 Commercial Management vs. Quantity Surveying: Hoax or Harmonization
Authors: Zelda Jansen Van Rensburg
Abstract:
Purpose: This study investigates the perceived disparities between Quantity Surveying and Commercial Management in the construction industry, questioning if these differences are substantive or merely semantic. It aims to challenge the conventional notion of Commercial Managers’ superiority by critically evaluating QS and CM roles, exploring CM integration possibilities, examining qualifications for aspiring Commercial Managers, assessing regulatory frameworks, and considering terminology redefinition for global QS professional enhancement. Design: Utilizing mixed methods like literature reviews, surveys, interviews, and document analyses, this research examines the QS-CM relationship. Insights from industry professionals, academics, and regulatory bodies inform the investigation into changing QS roles. Findings: Empirical data highlight evolving roles, showcasing areas of convergence and divergence between QSs and CM. Potential CM integration into QS practice and qualifications for aspiring Commercial Managers are identified. Limitations/Implications: Limitations include potential bias in self-reported data and findings. Nevertheless, the research informs future practices and educational approaches in QS and CM, reflecting the changing roles and responsibilities of Quantity Surveyors. Practical Implications: Findings inform industry practitioners, educators, and regulators, stressing the need to adapt to changing QS roles and integrate CM principles where applicable. Value to the Conference Theme: Aligned with ‘Evolving roles and responsibilities of Quantity Surveyors,’ this research offers insights crucial for understanding the changing dynamics within the QS profession and informs strategies to navigate these shifts effectively.
Keywords: quantity surveying, commercial management, cost engineering, quantity survey
Procedia PDF Downloads 38
160 Human-Computer Interaction: Strategies for Ensuring the Design of User-Centered Web Interfaces for Smartphones
Authors: Byron Joseph A. Hallar, Annjeannette Alain D. Galang, Maria Visitacion N. Gumabay
Abstract:
The widespread adoption and increasing proliferation of smartphones that started during the first decade of the twenty-first century have enabled their users to communicate and access information in ways that were merely thought of as possibilities in the few years before the smartphone revolution. A product of the convergence of the cellular phone and the portable computer, the smartphone provides an additional important function that used to be the exclusive domain of desktop-bound and portable computers: web browsing. For increasing numbers of users, the smartphone and allied devices such as tablet computers have become their first and often their only means of accessing the World Wide Web. This has led to the development of websites that cater to the needs of the new breed of smartphone-carrying web users. The smaller size of smartphones as compared with conventional computers has posed unique challenges to web interface designers. The smaller screen size and touchscreen interface have made it much more difficult to read and navigate through web pages that were for the most part designed for traditional desktop and portable computers. Although increasing numbers of websites now provide an alternative version formatted for smartphones, problems with ease of use, reliability and usability still remain. This study focuses on the identification of the problems associated with smartphone web interfaces, the compliance with accepted standards of user-oriented web interface design, the strategies that could be utilized to ensure the design of user-centric web interfaces for smartphones, and the identification of the current trends and developments related to user-centric web interface design intended for the consumption of smartphone users.
Keywords: human-computer interaction, user-centered design, web interface, mobile, smartphone
Procedia PDF Downloads 356
159 [Keynote Talk]: Some Underlying Factors and Partial Solutions to the Global Water Crisis
Authors: Emery Jr. Coppola
Abstract:
Water resources are being depleted and degraded at an alarming and non-sustainable rate worldwide. In some areas, this is progressing slowly; in others, irreversible damage has already occurred, rendering regions largely unsuitable for human existence, with destruction of the environment and the economy. Today, 2.5 billion people, or 36 percent of the world population, live in water-stressed areas. The convergence of factors that created this global water crisis includes local, regional, and global failures. In this paper, a survey of some of these factors is presented. They include abuse of political power and regulatory acquiescence, improper planning and design, ignoring good science and models, systemic failures, and division between the powerful and the powerless. Increasing water demand imposed by exploding human populations and growing economies, with shortfalls exacerbated by climate change and continuing water quality degradation, will accelerate this growing water crisis in many areas. Without regional measures to improve water efficiencies and protect dwindling and vulnerable water resources, environmental and economic displacement of populations and conflict over water resources will only grow. Perhaps more challenging, a global commitment is necessary to curtail, if not reverse, the devastating effects of climate change. The factors are illustrated by real-world examples, followed by some partial solutions offered by water experts to help mitigate the growing water crisis. These solutions include more water-efficient technologies, education and incentivization for water conservation, wastewater treatment for reuse, and improved data collection and utilization.
Keywords: climate change, water conservation, water crisis, water technologies
Procedia PDF Downloads 233
158 Lexical Collocations in Medical Articles of Non-Native vs Native English-Speaking Researchers
Authors: Waleed Mandour
Abstract:
This study presents a multidimensional scrutiny of Benson et al.’s seven-category taxonomy of lexical collocations as used by Egyptian medical authors and their peers who are native speakers of English. It investigates 212 medical papers, all published during a span of six years (from 2013 to 2018). The comparison is made against medical research articles by native speakers of English (25,238 articles in total, with over 103 million words) derived from the Directory of Open Access Journals (a 2.7 billion-word corpus). The corpus compiled from the non-native speakers was annotated and marked up manually by the researcher according to the standards of Weisser. In terms of statistical comparisons, the conventional frequency-based analysis was deployed alongside the relevant criteria, such as association measures (AMs), for which LogDice was used as recommended by Kilgarriff et al. when comparing large corpora. Despite the terminological convergence in the subject corpora, the comparison results confirm the previous literature in that the non-native speakers’ compositions reveal limited ranges of lexical collocations in terms of their distribution. However, there is a ubiquitous tendency to overuse the multi-words that are high-frequency for native speakers in all lexical categories investigated. Furthermore, Egyptian authors, conversely to their English-speaking peers, tend to embrace more collocations denoting quantitative rather than qualitative analyses in their papers. This empirical work contributes to English for Academic Purposes (EAP) and English as a Lingua Franca in Academic settings (ELFA). In addition, there are pedagogical implications that would promote a better quality of medical research papers published in Egyptian universities.
Keywords: corpus linguistics, EAP, ELFA, lexical collocations, medical discourse
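For readers unfamiliar with the LogDice association measure mentioned above, a minimal sketch of the calculation from raw corpus frequencies is given below; the frequency figures are hypothetical and do not come from the study's corpora.

```python
import math

def log_dice(f_xy: int, f_x: int, f_y: int) -> float:
    """LogDice association score: 14 + log2(2*f(x,y) / (f(x) + f(y))).
    The score is largely independent of corpus size and is capped at a theoretical maximum of 14."""
    return 14 + math.log2(2 * f_xy / (f_x + f_y))

# Hypothetical frequencies for a collocation such as "blood pressure" in a medical corpus
print(round(log_dice(f_xy=4200, f_x=15000, f_y=9800), 2))
```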
Procedia PDF Downloads 129
157 Modelling of Pipe Jacked Twin Tunnels in a Very Soft Clay
Authors: Hojjat Mohammadi, Randall Divito, Gary J. E. Kramer
Abstract:
Tunnelling and pipe jacking in very soft soils (fat clays), even with an Earth Pressure Balance tunnel boring machine (EPBM), can cause large ground displacements. In this study, the short-term and long-term ground and tunnel response is predicted for twin, pipe-jacked EPBM 3-meter-diameter tunnels with a narrow pillar width. Initial modelling indicated complete closure of the annulus gap at the tail shield onto the centrifugally cast, glass-fiber-reinforced, polymer mortar jacking pipe (FRP). Numerical modelling was employed to simulate the excavation and support installation sequence, examine the ground response during excavation, confirm the adequacy of the pillar width and check the structural adequacy of the installed pipe. In the numerical models, a Mohr-Coulomb constitutive model including the effect of unloading was adopted for the fat clays, while the generalized Hoek-Brown criterion was employed for the bedrock layer. The numerical models considered explicit excavation sequences and different levels of ground convergence prior to support installation. These carefully defined excavation sequences made the analysis possible for this very soft clay; without them, obtaining convergence in the numerical analysis would have been impossible. The predicted results indicate that the ground displacements around the tunnel and their effect on the pipe would be acceptable, despite predictions of large zones of plastic behaviour around the tunnels and within the entire pillar between them due to excavation-induced ground movements.
Keywords: finite element modeling (FEM), pipe-jacked tunneling, very soft clay, EPBM
Procedia PDF Downloads 80
156 Development and Validation of Family Outcome Survey – Revised Taiwan Version
Authors: Shih-Heng Sun, Hsiu-Yu Chang
Abstract:
The “family centered service model” has become mainstream in early intervention, so family outcomes should be evaluated in addition to child improvement when evaluating outcomes in early intervention. The purpose of this study is to develop a survey to evaluate family outcomes in early intervention. Method: The “Family Outcomes Survey – Revised Taiwan Version” (FOS-RT) was developed through translation, back-translation, and review by the original author. An expert meeting was held to determine the content validity. Two hundred and eighty-six parent-child dyads recruited from 10 local Early Intervention Resource Centers (EIRC) participated in the study after they signed informed consent. The results showed that both parts of the FOS-RT exhibit good internal consistency and test-retest reliability. The results of confirmatory factor analysis indicated a moderate fit of the five-factor structure of Part A and the three-factor structure of Part B of the FOS-RT. The moderate-to-high correlations between different sections reveal that some sections measure a similar latent trait of family outcomes. The correlation between the FOS-RT and the Parents‘ Perceived Parenting Skills Questionnaire was calculated to determine the convergent validity. The moderate correlation indicates that the two assessments measure different parts of early intervention outcomes, although both have similar sub-scales. The results of this study support the FOS-RT as a valid and reliable tool to evaluate family outcomes after families and children with developmental disability receive early intervention services.
Keywords: early intervention, family service, outcome evaluation, parenting skills, family centered
Procedia PDF Downloads 505
155 Digitizing Masterpieces in Italian Museums: Techniques, Challenges and Consequences from Giotto to Caravaggio
Authors: Ginevra Addis
Abstract:
The possibility of reproducing physical artifacts in a digital format is one of the opportunities offered by technological advancements in information and communication that museums most frequently promote. Indeed, the study and conservation of our cultural heritage have seen significant advancement due to three-dimensional acquisition and modeling technology. A variety of laser scanning systems has been developed, based either on optical triangulation or on time-of-flight measurement, capable of producing digital 3D images of complex structures with high resolution and accuracy. It is necessary, however, to explore the challenges and opportunities that this practice brings within museums. The purpose of this paper is to understand what change is introduced by digital techniques in those museums that are hosting digital masterpieces. The methodology investigates three distinguished Italian exhibitions, related to the territory of Milan, analyzing the following issues about museum practices: 1) how digitizing art masterpieces increases the number of visitors; 2) what needs call for the digitization of artworks; 3) which techniques are most used; 4) what the setting is; 5) the consequences of not publishing hard copies of catalogues; 6) how these practices are envisioned in the future. Findings will show how interconnection plays an important role in rebuilding a collection spread all over the world; secondly, how digital artwork duplication and the extension of reality entail new forms of accessibility; thirdly, that collection and preservation through digitization of images have both a social and an educational mission; and fourthly, that the convergence of the properties of different media (such as web and radio) is key to encouraging people to get actively involved in digital exhibitions. The present analysis will suggest further research that should create museum models and interaction spaces that act as catalysts for innovation.
Keywords: digital masterpieces, education, interconnection, Italian museums, preservation
Procedia PDF Downloads 174
154 An Exploration of Cross-culture Consumer Behaviour - The Characteristics of Chinese Consumers’ Decision Making in Europe
Authors: Yongsheng Guo, Xiaoxian Zhu, Mandella Osei-Assibey Bonsu
Abstract:
This study explores the effects of national culture on consumer behaviour by identifying the characteristics of Chinese consumers’ decision making in Europe. It offers a better understanding of how cultural factors affect consumers’ behaviour and how consumers make decisions in other nations with a different culture. It adopted a grounded theory approach and conducted twenty-four in-depth interviews. Grounded theory models are developed to link the causal conditions, process and consequences. Results reveal that cultural factors including conservatism, emotionality, acquaintance community, long-term orientation and principles affect Chinese consumers when making purchase decisions in Europe. Most Chinese consumers plan and prepare their expenditure, stay in Europe as cultural learners, purchase durable products or assets as investments, and share their experiences within a community. This study identified potential problems such as the political and social environment, complex procedures, and restrictions. It found that external factors influence internal factors, and that internal characteristics then determine consumer behaviour. The study proposes that cultural traits developed in convergent evolution through social selection, and that Chinese consumers retain most of these characteristics but adapt some perceptions and actions over time in other countries. It suggests that cultural marketing could be adopted by companies to reflect consumers’ preferences, and that agencies, shops, and the authorities could take action to reduce the complexity and restrictions.
Keywords: national culture, consumer behaviour, decision making, cultural marketing
Procedia PDF Downloads 93
153 Acceptability of the Carers-ID Intervention for Family Carers of People with Intellectual Disabilities
Authors: Mark Linden, Michael Brown, Lynne Marsh, Maria Truesdale, Stuart Todd, Nathan Hughes, Trisha Forbes, Rachel Leonard
Abstract:
Background: Family carers of people with intellectual disabilities (ID) face ongoing challenges in accessing services and often experience poor mental health. Online support programmes may prove effective in addressing the mental health and well-being needs of family carers. This study sought to test the acceptability of a newly developed online support programme for carers of people with intellectual disabilities called Carers-ID. Methods: A sequential mixed-methods explanatory design was utilised. An adapted version of the Acceptability of Health Apps among Adolescents (AHAA) Scale was distributed to family carers who had viewed the Carers-ID.com intervention. Following this, participants were invited to take part in an online interview. Interview questions focused on participants’ experiences of using the programme and its acceptability. Qualitative and quantitative data were analysed separately and then brought together through the triangulation protocol developed by Farmer et al. (2006). Findings: Seventy family carers responded to the acceptability survey, whilst 10 took part in interviews. Six themes were generated from interviews with family carers. Based on our triangulation, four areas of convergence were identified; these included programme usability and ease, attitudes towards the programme, perceptions of effectiveness, and programme relatability. Conclusions: To be acceptable, online interventions for carers of people with ID need to be accessible, understandable and easy to use, as carers’ time is precious. Further research is needed to investigate the effectiveness of online interventions for family carers, specifically considering which carers the intervention works for, and for whom it may not.
Keywords: intellectual disability, family carer, acceptability study, online intervention
Procedia PDF Downloads 90
152 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model
Authors: Can Huang, Xiaoliang Wang, Qingquan Liu
Abstract:
Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, accompanied by large deformation of soil, strong interface coupling and three-dimensional effects. A meshless particle method, smooth particle hydrodynamics (SPH), has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open source code DualSPHysics (v4.0). To address the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration technology is implemented in this code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of this model. Then the Huangtian LGIW, a real large-scale LGIW problem, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle distance of 5.0 m can provide a converged landslide deposit and surge wave for this example. Numerical simulation results are in good agreement with the limited field survey data. The application example of the Huangtian LGIW provides a typical reference for large-scale LGIW assessments, which can provide reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.
Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH
Procedia PDF Downloads 63
151 Morpho-Syntactic Pattern in Maithili Urdu
Authors: Mohammad Jahangeer Warsi
Abstract:
This is, perhaps, the first linguistic study of Maithili Urdu, a dialect of the Urdu language of the Indo-Aryan family, spoken by around four million speakers in the Darbhanga, Samastipur, Begusarai, Madhubani, and Muzaffarpur districts of Bihar. It has subject–verb–object (SOV) word order and lacks a script and literature. Needless to say, this work is an attempt to document this dialect so that it may contribute to the field of descriptive linguistics. Besides, it is also spoken by the majority of the Maithili diaspora community. Maithili Urdu does not have its own script or literature, yet it has maintained an oral history over many centuries. It has contributed very profoundly to enriching the Maithili, Hindi and Urdu languages and literatures. Dialects are the contact languages of particular regions, and they have a deep impact on their cultural heritage. Slowly, with time, these dialects begin to take the shape of languages. The convergence of a dialect into a language is a symbol and pride of the people who speak it. Although confined to five districts of northern Bihar, it is highly popular among the natives and is the primary mode of communication of the local Muslims. The paper will focus on the structure of expressions in Maithili Urdu, including the structure of words, phrases, clauses, and sentences. There are clear differences in the linguistic features of Maithili Urdu vis-à-vis Urdu, Maithili and Hindi. Though a dialect of Urdu, it interestingly has only one second-person pronoun, tu, and lacks the agentive marker –ne. Although spoken in the vicinity of Hindi, Urdu and Maithili, it undoubtedly has its own linguistic features, among which verb conjugation is remarkably unique. Because of the oral tradition of this link language, intonation has become significantly prominent. This paper will discuss the morpho-syntactic pattern of Maithili Urdu and will go through a sample text to authenticate the findings.
Keywords: cultural heritage, morpho-syntactic pattern, Maithili Urdu, verb conjugation
Procedia PDF Downloads 213
150 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features
Authors: Bo Wang
Abstract:
The geometric processing of multi-source remote sensing data using control data of different scales and different accuracies is an important research direction for multi-platform earth observation systems. In existing block bundle adjustment methods, the control information in the adjustment system is handled with a single observation scale and precision; this approach cannot screen the control information or assign reasonable and effective weights, which reduces the convergence and the reliability of the adjustment results. Drawing on the theory and techniques of quotient space, several subjects are researched in this project. A multi-layer quotient space of multi-geometric features is constructed to describe and filter control data. A normalized granularity merging mechanism for multi-layer control information is studied, and, based on the normalized scale factor, a strategy to optimize the weight selection of control data that is less relevant to the adjustment system can be realized. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiclass control data to verify the theoretical research results. This research is expected to move beyond the convention of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry. Thus the problem of processing multi-source remote sensing data will be addressed both theoretically and practically.
Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection
Procedia PDF Downloads 283
149 Screening Diversity: Artificial Intelligence and Virtual Reality Strategies for Elevating Endangered African Languages in the Film and Television Industry
Authors: Samuel Ntsanwisi
Abstract:
This study investigates the transformative role of Artificial Intelligence (AI) and Virtual Reality (VR) in the preservation of endangered African languages. The study is contextualized within the film and television industry, highlighting disparities in screen representation for certain languages in South Africa, underscoring the need for increased visibility and preservation efforts; with globalization and cultural shifts posing significant threats to linguistic diversity, this research explores approaches to language preservation. By leveraging AI technologies, such as speech recognition, translation, and adaptive learning applications, and integrating VR for immersive and interactive experiences, the study aims to create a framework for teaching and passing on endangered African languages. Through digital documentation, interactive language learning applications, storytelling, and community engagement, the research demonstrates how these technologies can empower communities to revitalize their linguistic heritage. This study employs a dual-method approach, combining a rigorous literature review to analyse existing research on the convergence of AI, VR, and language preservation with primary data collection through interviews and surveys with ten filmmakers. The literature review establishes a solid foundation for understanding the current landscape, while interviews with filmmakers provide crucial real-world insights, enriching the study's depth. This balanced methodology ensures a comprehensive exploration of the intersection between AI, VR, and language preservation, offering both theoretical insights and practical perspectives from industry professionals.
Keywords: language preservation, endangered languages, artificial intelligence, virtual reality, interactive learning
Procedia PDF Downloads 59
148 Real Fictions: Converging Landscapes and Imagination in an English Village
Authors: Edoardo Lomi
Abstract:
A problem of central interest in anthropology concerns the ethnographic displacement of modernity’s conceptual sovereignty over that of native collectives worldwide. Part of this critical project has been the association of Western modernity with a dualist, naturalist ontology. Despite its demonstrated value for comparative work, this association often comes at the cost of reproducing ideas that lack an empirical ethnographic basis. This paper proposes a way forward by bringing to bear some of the results produced by an ethnographic study of a village in Wiltshire, South England. Due to its picturesque qualities, this village has served for decades as a ready-made set for fantasy movies and a backdrop to fictional stories. These forms of mediation have in turn generated some apparent paradoxes, such as fictitious characters that effect actual material changes, films that become more real than history, and animated stories that, while requiring material grounds to unfold, inhabit a time and space in other respects distinct from that of material processes. Drawing on ongoing fieldwork and interviews with locals and tourists, this paper considers the ways villagers engage with fiction as part of their everyday lives. The resulting image is one of convergence, in the same landscape, of people and things having different ontological status. This study invites reflection on the implications of this image for diversifying our imagery of Western lifeworlds. To this end, the notion of ‘real fictions’ is put forth, connecting the ethnographic blurring of modernist distinctions, such as sign and signified, mind and matter, materiality and immateriality, with discussions on anthropology’s own reliance on fictions for critical comparative work.
Keywords: England, ethnography, landscape, modernity, mediation, ontology, post-structural theory
Procedia PDF Downloads 121
147 IoT-Based Interactive Patient Identification and Safety Management System
Authors: Jonghoon Chun, Insung Kim, Jonghyun Lim, Gun Ro
Abstract:
We believe that it is possible to provide a solution that reduces patient safety accidents by displaying correct medical records and prescription information through interactive patient identification. Our system is based on smart bands worn by patients; these bands communicate with hybrid gateways that understand both BLE and Wi-Fi communication protocols. Through the convergence of Bluetooth Low Energy (BLE), one of the short-range wireless communication technologies, and hybrid gateway technology, we implement an ‘Intelligent Patient Identification and Location Tracking System’ to prevent the medical errors that frequently occur in medical institutions. Based on big data and IoT technology using MongoDB, smart bands (with BLE and NFC functions) and hybrid gateways, we develop a system that enables two-way communication between medical staff and hospitalized patients and stores the patients’ location information at minute intervals. Based on the precise information provided by the big data system, such as location tracking and the movement of in-hospital patients wearing smart bands, our findings include the fact that a patient-specific location tracking algorithm can operate HIS (Hospital Information System) and other related systems more efficiently. Through the system, we can always correctly identify patients using identification tags. In addition, the system automatically determines whether the patient is scheduled for a medical service in the system used at the medical institution, and presents the appropriateness of the medical treatment and the medical information (medical record and prescription information) on screen and by voice. This work was supported in part by the Korea Technology and Information Promotion Agency for SMEs (TIPA) grant funded by the Korean Small and Medium Business Administration (No. S2410390).
Keywords: BLE, hybrid gateway, patient identification, IoT, safety management, smart band
Procedia PDF Downloads 309
146 Applications of Evolutionary Optimization Methods in Reinforcement Learning
Authors: Rahul Paul, Kedar Nath Das
Abstract:
The paradigm of Reinforcement Learning (RL) has become prominent in training intelligent agents to make decisions in environments that are both dynamic and uncertain. The primary objective of RL is to optimize the policy of an agent in order to maximize the cumulative reward it receives throughout a given period. Nevertheless, the process of optimization presents notable difficulties as a result of the inherent trade-off between exploration and exploitation, the presence of extensive state-action spaces, and the intricate nature of the dynamics involved. Evolutionary Optimization Methods (EOMs) have garnered considerable attention as a supplementary approach to tackle these challenges, providing distinct capabilities for optimizing RL policies and value functions. The ongoing advancement of research in both RL and EOMs presents an opportunity for significant advancements in autonomous decision-making systems. The convergence of these two fields has the potential to have a transformative impact on various domains of artificial intelligence (AI) applications. This article highlights the considerable influence of EOMs in enhancing the capabilities of RL. Taking advantage of evolutionary principles enables RL algorithms to effectively traverse extensive action spaces and discover optimal solutions within intricate environments. Moreover, this paper emphasizes the practical implementations of EOMs in the field of RL, specifically in areas such as robotic control, autonomous systems, inventory problems, and multi-agent scenarios. The article highlights the utilization of EOMs in facilitating RL agents to effectively adapt, evolve, and uncover proficient strategies for complex tasks that may pose challenges for conventional RL approaches.
Keywords: machine learning, reinforcement learning, loss function, optimization techniques, evolutionary optimization methods
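As a concrete illustration of how evolutionary principles can optimise a policy without gradient information, the sketch below applies a simple (μ, λ)-style evolution strategy to the parameters of a linear policy on a toy point-mass control task. The task, policy form and hyperparameters are assumptions made for illustration only; they are not methods taken from the article.

```python
import numpy as np

def episode_return(theta, steps=50):
    """Roll out a linear policy u = -theta[0]*x - theta[1]*v on a toy point-mass task.
    The return penalises distance from the origin and control effort."""
    x, v, total = 2.0, 0.0, 0.0
    for _ in range(steps):
        u = -theta[0] * x - theta[1] * v
        v += 0.1 * u
        x += 0.1 * v
        total += -(x ** 2 + 0.01 * u ** 2)
    return total

def evolve(pop_size=30, generations=60, sigma=0.2, elite_frac=0.2, seed=0):
    """(mu, lambda)-style evolution strategy: sample perturbed policies around the mean,
    keep the elite fraction by fitness, and recombine them into the next mean."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(2)
    n_elite = max(1, int(pop_size * elite_frac))
    for _ in range(generations):
        pop = mean + sigma * rng.standard_normal((pop_size, 2))
        fitness = np.array([episode_return(theta) for theta in pop])
        elite = pop[np.argsort(fitness)[-n_elite:]]
        mean = elite.mean(axis=0)
    return mean, episode_return(mean)

theta, ret = evolve()
print("best policy gains:", theta, "return:", round(ret, 2))
```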
Procedia PDF Downloads 79
145 Numerical Modeling of Air Shock Wave Generated by Explosive Detonation and Dynamic Response of Structures
Authors: Michał Lidner, Zbigniew SzcześNiak
Abstract:
The ability to estimate blast load overpressure properly plays an important role in the safety design of buildings. The issue of blast loading on structural elements has been explored for many years. However, in many literature reports the shock wave overpressure is estimated with a simplified triangular or exponential distribution in time, which introduces some error when comparing the real and numerical reactions of elements. Nonetheless, it is possible to approximate the real blast load overpressure function versus time more closely. The paper presents a method of numerical analysis of the phenomenon of air shock wave propagation. It uses the Finite Volume Method and takes into account energy losses due to heat transfer with respect to an adiabatic process rule. A system of three equations (conservation of mass, momentum and energy) describes the flow of a volume of gaseous medium in the area remote from building compartments, which can inhibit the movement of gas. For validation, three cases of shock wave flow were analyzed: a free-field explosion, an explosion inside a rigid steel tube (the 1D case) and an explosion inside a rigid cube (the 3D case). The results of the numerical analysis were compared with literature reports. Values of impulse, pressure, and its duration were studied. Finally, an overall good convergence of numerical results with experiments was achieved, and the most important parameters were well reflected. Additionally, analyses of the dynamic response of one of the considered structural elements were made.
Keywords: adiabatic process, air shock wave, explosive, finite volume method
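The "system of three equations" referred to above is the set of compressible-flow conservation laws; written for a one-dimensional inviscid gas, before any heat-loss correction is applied, they take the standard Euler form below. The notation is supplied here for orientation and is not copied from the paper.

\[
\frac{\partial}{\partial t}\begin{pmatrix}\rho\\ \rho u\\ E\end{pmatrix}
+\frac{\partial}{\partial x}\begin{pmatrix}\rho u\\ \rho u^{2}+p\\ u\,(E+p)\end{pmatrix}=\mathbf{0},
\qquad
E=\frac{p}{\gamma-1}+\tfrac{1}{2}\rho u^{2},
\]

where \(\rho\) is density, \(u\) velocity, \(p\) pressure, \(E\) total energy per unit volume and \(\gamma\) the ratio of specific heats; the heat-transfer losses mentioned above would enter as an additional sink term in the energy equation.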
Procedia PDF Downloads 190
144 A Geometrical Multiscale Approach to Blood Flow Simulation: Coupling 2-D Navier-Stokes and 0-D Lumped Parameter Models
Authors: Azadeh Jafari, Robert G. Owens
Abstract:
In this study, a geometrical multiscale approach, which means coupling together the 2-D Navier-Stokes equations, constitutive equations and 0-D lumped parameter models, is investigated. A multiscale approach suggests a natural way of coupling detailed local models (in the flow domain) with coarser models able to describe the dynamics over a large part, or even the whole, of the cardiovascular system at acceptable computational cost. In this study we introduce a new velocity correction scheme to decouple the velocity computation from the pressure computation. To evaluate the capability of our new scheme, a comparison has been performed between the results obtained with Neumann outflow boundary conditions on the velocity and Dirichlet outflow boundary conditions on the pressure, and those obtained using coupling with the lumped parameter model. Comprehensive studies have been carried out on the sensitivity of the numerical scheme to the initial conditions, the elasticity and the number of spectral modes. Improvement of the computational algorithm with stable convergence has been demonstrated for at least moderate Weissenberg number. We comment on the mathematical properties of the reduced model, its limitations in yielding realistic and accurate numerical simulations, and its contribution to a better understanding of microvascular blood flow. We discuss the sophistication and reliability of multiscale models for computing correct boundary conditions at the outflow boundaries of a section of the cardiovascular system of interest. In this respect the geometrical multiscale approach can be regarded as a new method for solving a class of biofluids problems whose application goes significantly beyond the one addressed in this work.
Keywords: geometrical multiscale models, haemorheology model, coupled 2-D Navier-Stokes 0-D lumped parameter modeling, computational fluid dynamics
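A common way to realise the 0-D lumped parameter component at an outflow boundary is a three-element (RCR) Windkessel model, sketched below; the abstract does not specify the lumped model actually used, so this is an illustrative assumption rather than the study's formulation.

\[
C\,\frac{dP_{d}}{dt} + \frac{P_{d}}{R_{d}} = Q(t),
\qquad
P_{\text{out}}(t) = P_{d}(t) + R_{p}\,Q(t),
\]

where \(Q(t)\) is the flow rate leaving the 2-D Navier-Stokes domain, \(P_{\text{out}}\) the pressure imposed back on that boundary, \(R_{p}\) and \(R_{d}\) the proximal and distal resistances, and \(C\) the compliance of the downstream vasculature.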
Procedia PDF Downloads 359
143 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation
Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner
Abstract:
This article discusses the impact of digitization on business valuation. In order to become and remain ‘digital’, investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is problematic for a valuation that relies on predictable cash flows, fixed capital structures and the steady state. However, digitisation does not make a company valuation impossible; rather, traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition - In the future, company valuation will be neither art nor science, but craft. This does not require intuition, but experience and good tools. Digital evaluation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline - At present, company valuations are always carried out on a case-by-case basis and for a specific key date. This will change with digitalization and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently, but can also be offered more frequently. Instead of calculating the value for a previous key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past - Past data will also be needed in the future, but its use will not be limited to monovalent time series or key figure analyses. Images of ‘black swans’ and the ‘turkey illusion’ have made clear that we build forecasts on too few data points from the past and underestimate the power of chance. Predictive planning can help here. (4) Convergence instead of residual value - Digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.
Keywords: business valuation, corporate finance, digitisation, disruption
Procedia PDF Downloads 132
142 English Language Proficiency and Use as Determinants of Transactional Success in Gbagi Market, Ibadan, Nigeria
Authors: A. Robbin
Abstract:
Language selection can be an efficient negotiation strategy employed by both service or product providers and their customers to achieve transactional success. The transactional scenario in Gbagi Market, Ibadan, Nigeria provides an appropriate setting for exploring the Nigerian multilingual situation, with its own interesting linguistic peculiarities which question the functionality of the ‘lingua franca’ in trade situations. This study examined English language proficiency among Yoruba traders in Gbagi Market, Ibadan, and its use as a determinant of transactional success during service encounters. Randomly selected Yoruba-English bilingual traders and customers were administered questionnaires, and the data were subjected to statistical and descriptive analysis using Giles’ Communication Accommodation Theory. Findings reveal that only fifty percent of the traders used for the study were proficient in speaking English. Traders with minimal proficiency in Standard English, however, resorted to the use of Nigerian Pidgin English. Both traders and customers select the mother tongue, which is the Yoruba language, during service encounters, but are quick to converge to the other’s preferred language as the transactional exchange demands. The selection of English is not so much for the prestige or lingua franca status of the language as for its functions, which include ease of communication, negotiation, and increased sales. The use of English during service encounters is mostly determined by the customer’s linguistic preference, which the trader accommodates for better negotiation, and never as a first choice. This convergence is found to be beneficial, as it ensures sales and return patronage. Although the English language is not a preferred code choice in Gbagi Market, it serves as a functional trade strategy for transactional success during service encounters in the market.
Keywords: communication accommodation theory, language selection, proficiency, service encounter, transaction
Procedia PDF Downloads 157
141 Disparities Versus Similarities; WHO Good Practices for Pharmaceutical Quality Control Laboratories and ISO/IEC 17025:2017: International Standards for Quality Management Systems in Pharmaceutical Laboratories
Authors: Mercy Okezue, Kari Clase, Stephen Byrn, Paddy Shivanand
Abstract:
Medicines regulatory authorities expect pharmaceutical companies and contract research organizations to seek ways to certify that their laboratory control measurements are reliable. Establishing and maintaining laboratory quality standards are essential to ensuring the accuracy of test results. ‘ISO/IEC 17025:2017’ and the ‘WHO Good Practices for Pharmaceutical Quality Control Laboratories (GPPQCL)’ are two quality standards commonly employed in developing laboratory quality systems. A review was conducted on the two standards to elaborate on areas of convergence and divergence. The goal was to understand how differences in each standard's requirements may influence laboratories' choices as to which document is easier to adopt for quality systems. A qualitative review method compared similar items in the two standards while mapping out areas where there were specific differences in the requirements of the two documents. The review also provided a detailed description of the clauses and parts covering management and technical requirements in these laboratory standards. The review showed that both documents share requirements for over ten critical areas covering objectives, infrastructure, management systems, and laboratory processes. There were, however, differences in expectations: the GPPQCL emphasizes system procedures for planning and future budgets that will ensure continuity, whereas ISO 17025 is more focused on a risk management approach to establishing laboratory quality systems. Elements in the two documents form common standard requirements to assure the validity of laboratory test results and promote mutual recognition. The ISO standard currently has more global patronage than the GPPQCL.
Keywords: ISO/IEC 17025:2017, laboratory standards, quality control, WHO GPPQCL
Procedia PDF Downloads 195
140 The Structure of Southern Tunisian Atlas Deformation Front: Integrated Geological and Geophysical Interpretation
Authors: D. Manai, J. Alvarez-Marron, M. Inoubli
Abstract:
The southern Tunisian Atlas is part of the wide Cenozoic intracontinental deformation that affected North Africa as a result of convergence between the African and Eurasian plates. The Southern Tunisian Atlas Front (STAF) corresponds to the chotts area, which covers several hundred square kilometres and represents a 60 km wide transition between the deformed Tunisian Atlas to the north and the undeformed Saharan platform to the south. It includes three morphostructural alignments: a fold and thrust range in the north, a wide depression in the middle, and a monocline to horizontal zone in the south. Four cross-sections have been constructed across the chotts area to illustrate the structure of the Southern Tunisian Atlas Front, based on integrated geological and geophysical data including geological maps, petroleum wells, and seismic data. The fold and thrust zone of the northern chotts is interpreted as related to a detachment level near the Triassic-Jurassic contact. The displacement of the basal thrust seems to die out progressively under the Fejej antiform and is responsible for the southward dip of the southern chotts range. The restoration of the cross-sections indicates that the Southern Tunisian Atlas Front is a weakly deformed wide zone developed during the Cenozoic inversion, with a maximum calculated shortening in the order of 1000 m. The wide structure of the STAF has been influenced by a pre-existing large thickness of Upper Jurassic-Aptian sediments related to the rifting episodes associated with the evolution of Tethys in the Maghreb. During the Jurassic to Aptian period, the chotts area corresponded to a highly subsiding basin.
Keywords: Southern Tunisian Atlas Front, subsident sub-basin, wide deformation, balanced cross-sections
Procedia PDF Downloads 147
139 Optimizing Super Resolution Generative Adversarial Networks for Resource-Efficient Single-Image Super-Resolution via Knowledge Distillation and Weight Pruning
Authors: Hussain Sajid, Jung-Hun Shin, Kum-Won Cho
Abstract:
Image super-resolution is a common computer vision problem with many important applications. Generative adversarial networks (GANs) have driven remarkable advances in single-image super-resolution (SR) by recovering photo-realistic images. However, the high memory requirements of GAN-based SR models (mainly the generators) lead to performance degradation and increased energy consumption, making it difficult to deploy them on resource-constrained devices. To relieve this problem, this paper introduces an optimized and highly efficient architecture for the SR-GAN generator model by utilizing model compression techniques, namely knowledge distillation and pruning, which work together to reduce the storage requirement of the model while also improving its performance. Our method begins by distilling knowledge from a large pre-trained model to a lightweight model using different loss functions. Then, iterative weight pruning is applied to the distilled model to remove less significant weights based on their magnitude, resulting in a sparser network. Knowledge distillation reduces the model size by 40%; pruning then reduces it further by 18%. To accelerate the learning process, we employ the Horovod framework for distributed training on a cluster of 2 nodes, each with 8 GPUs, resulting in improved training performance and faster convergence. Experimental results on various benchmarks demonstrate that the proposed compressed model significantly outperforms state-of-the-art methods in terms of peak signal-to-noise ratio (PSNR), structural similarity index measure (SSIM), and image quality for x4 super-resolution tasks.
Keywords: single-image super-resolution, generative adversarial networks, knowledge distillation, pruning
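A minimal sketch of the two compression stages described above, response-based knowledge distillation from a large generator to a lightweight student followed by magnitude-based weight pruning, is given below in PyTorch. The toy network, loss weights and pruning amount are illustrative assumptions; they do not reproduce the paper's architecture, its Horovod setup, or the reported 40%/18% reductions.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

class TinyGenerator(nn.Module):
    """Stand-in SR generator: a few conv layers plus pixel-shuffle x4 upsampling."""
    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.PReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.PReLU(),
            nn.Conv2d(channels, 3 * 16, 3, padding=1), nn.PixelShuffle(4))
    def forward(self, x):
        return self.body(x)

teacher = TinyGenerator(channels=64).eval()   # stands in for a pretrained large generator
student = TinyGenerator(channels=16)
opt = torch.optim.Adam(student.parameters(), lr=1e-4)
l1 = nn.L1Loss()

def distill_step(lr_batch, hr_batch, alpha=0.5):
    """One distillation step: pixel loss to ground truth plus imitation loss to the teacher."""
    with torch.no_grad():
        teacher_sr = teacher(lr_batch)
    student_sr = student(lr_batch)
    loss = (1 - alpha) * l1(student_sr, hr_batch) + alpha * l1(student_sr, teacher_sr)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

def prune_round(model, amount=0.18):
    """Remove the smallest-magnitude weights from every conv layer, then make the mask permanent."""
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            prune.l1_unstructured(m, name="weight", amount=amount)
            prune.remove(m, "weight")

# Illustrative loop on random tensors standing in for LR/HR image batches
for _ in range(3):
    lr_imgs = torch.rand(2, 3, 24, 24)
    hr_imgs = torch.rand(2, 3, 96, 96)
    distill_step(lr_imgs, hr_imgs)
prune_round(student)
```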
Procedia PDF Downloads 95
138 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data
Authors: K. Sathishkumar, V. Thiagarasu
Abstract:
Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians to understand pathophysiological mechanisms, make diagnoses and prognoses, and choose treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. Existing algorithms such as the Support Vector Machine (SVM), the K-means algorithm and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations. A performance evaluation of the existing algorithms is carried out to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior and processing time, a hybrid clustering-based optimization approach has been proposed.
Keywords: microarray technology, gene expression data, clustering, gene selection
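To make such a pipeline concrete, the sketch below combines ANOVA-based gene selection, K-means clustering of samples and an SVM classifier with scikit-learn on synthetic data shaped like a gene expression matrix. It is an illustrative baseline only, not the hybrid clustering-based optimization approach proposed here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))          # 60 samples x 2000 genes (synthetic expression matrix)
y = rng.integers(0, 2, size=60)          # binary phenotype labels
X[y == 1, :50] += 1.5                    # make the first 50 genes informative

# Unsupervised structure discovery: cluster samples on their expression profiles
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Supervised pipeline: select top genes, scale, then classify with an SVM
clf = make_pipeline(SelectKBest(f_classif, k=50), StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print("cluster sizes:", np.bincount(clusters), "CV accuracy:", round(acc, 3))
```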
Procedia PDF Downloads 323
137 Necessary Condition to Utilize Adaptive Control in Wind Turbine Systems to Improve Power System Stability
Authors: Javad Taherahmadi, Mohammad Jafarian, Mohammad Naser Asefi
Abstract:
The global capacity of wind power has increased dramatically in recent years. Therefore, improving wind turbine technology to exploit this enormous potential in the power grid is an interesting subject for scientists. The doubly-fed induction generator (DFIG) wind turbine is a popular system due to its many advantages, such as improved power quality, high energy efficiency and controllability. With the increase in wind power penetration in the network and the flexible control of wind turbines, the use of wind turbine systems to improve the dynamic stability of power systems has been of significant importance for researchers. Subsynchronous oscillations are one of the important issues in the stability of power systems. Damping subsynchronous oscillations by using wind turbines has been studied in various research efforts, mainly by adding an auxiliary control loop to the control structure of the wind turbine. In most of the studies, this control loop is composed of linear blocks. In this paper, simple adaptive control is used for this purpose. In order to use an adaptive controller, the convergence of the controller should be verified. Since adaptive control parameters tend towards their optimum values in order to obtain optimum control performance, this controller will help the wind turbines to contribute positively to damping the network subsynchronous oscillations at different wind speeds and system operating points. In this paper, the application of simple adaptive control in DFIG wind turbine systems to improve the dynamic stability of power systems is studied, and the essential condition for using this controller is considered. It is also shown that this controller has an insignificant effect on the dynamic stability of the wind turbine itself.
Keywords: almost strictly positive real (ASPR), doubly-fed induction generator (DFIG), simple adaptive control (SAC), subsynchronous oscillations, wind turbine
Procedia PDF Downloads 375
136 Punishment on top of Punishment - Impact of Inmate Misconduct
Authors: Nazirah Hassan, Andrew Kendrick
Abstract:
Punishment inside the penal institution has always been practiced in order to maintain discipline and keep order. Nonetheless, criminologists have long argued that enforcing discipline by punishing inmates is often ineffective and has a detrimental impact on inmates’ conduct. This paper uses data from a sample of 289 incarcerated young offenders to investigate the prevalence of institutional misconduct. It explores punitive cultural practices inside institutions and how this culture affects the inmates’ conduct during confinement. The project focused on male and female young offenders aged 12 to 21 years old in eight juvenile justice institutions. The research collected quantitative and qualitative data using a mixed-method approach. All participants completed the Direct and Indirect Prisoner Behaviour Checklist - Scaled Version Revised (DIPC-SCALED-R). In addition, exploratory interviews were carried out with sixteen inmates and eight institutional staff. Results of the questionnaire survey show that almost half of the inmates reported a higher level of involvement in perpetration, demonstrating a remarkable convergence on direct, rather than indirect, perpetration. Inmates also reported a higher level of tobacco use and of behaviour associated with negative attitudes towards staff and institutional rules. In addition, the qualitative data suggest that the punitive culture encourages the onset of misconduct by increasing the stressful and oppressive conditions within the institution. In general, physical exercise and locking up inmates were two forms of punishment that were ubiquitous throughout the institutions. Interestingly, physical exercise is enforced not only by institutional staff but also by inmates. These findings are discussed in terms of the existing literature, and their practical implications are considered.
Keywords: institutional punishment, incarcerated young offenders, punitive culture, institutional misconduct
Procedia PDF Downloads 240
135 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm
Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei
Abstract:
In this research work, a sophisticated yield criterion known as BBC2003, capable of describing the planar anisotropic behavior of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve the nonlinear differential and algebraic equations, a line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of the anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed excellent agreement. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes
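For orientation, a fully implicit (backward Euler) stress update of the kind implemented in such a UMAT solves, at each increment, a return-mapping system of the generic elastoplastic form below; the notation is supplied for clarity and is not taken from the paper, with \(\phi\) standing for the BBC2003 equivalent stress.

\[
\boldsymbol{\sigma}_{n+1} = \boldsymbol{\sigma}^{\mathrm{trial}}_{n+1} - \Delta\lambda\,\mathbf{C}:\left.\frac{\partial \phi}{\partial \boldsymbol{\sigma}}\right|_{n+1},
\qquad
\phi\!\left(\boldsymbol{\sigma}_{n+1}\right) - \sigma_{y}\!\left(\bar{\varepsilon}^{p}_{n} + \Delta\lambda\right) = 0,
\]

solved for the stress \(\boldsymbol{\sigma}_{n+1}\) and the plastic multiplier increment \(\Delta\lambda\) by Newton-Raphson iteration; the line search mentioned above scales each Newton correction whenever a full step fails to reduce the residual norm, which is what widens the convergence domain.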
Procedia PDF Downloads 73
134 The Impact of Land Cover Change on Stream Discharges and Water Resources in Luvuvhu River Catchment, Vhembe District, Limpopo Province, South Africa
Authors: P. M. Kundu, L. R. Singo, J. O. Odiyo
Abstract:
The Luvuvhu River catchment in South Africa experiences floods resulting from heavy rainfall with intensities exceeding 15 mm per hour, associated with the Inter-tropical Convergence Zone (ITCZ). The generation of runoff is triggered by the rainfall intensity and the soil moisture status. In this study, remote sensing and GIS techniques were used to analyze the hydrologic response to land cover changes. Runoff was calculated as the product of the net precipitation and a curve number coefficient. It was then routed using the Muskingum-Cunge method with a diffusive wave transfer model that enabled the calculation of response functions between the start and end points. Flood frequency analysis was carried out using theoretical probability distributions. Spatial data on land cover were obtained from multi-temporal Landsat images, while data on rainfall, soil type, runoff and stream discharges were obtained by direct measurements in the field and from the Department of Water. A digital elevation model was generated from contour maps available at http://www.ngi.gov.za. The results showed that land cover changes had impacted negatively on the hydrology of the catchment. Peak discharges in the whole catchment were noted to have increased by at least 17% over the period, while flood volumes were noted to have increased by at least 11% over the same period. The flood time to peak showed a decreasing trend, in the range of 0.5 to 1 hour, over the years. The synergism between remotely sensed digital data and GIS for land surface analysis and modeling was realized, and it was therefore concluded that hydrologic modeling has potential for determining the influence of changes in land cover on the hydrologic response of the catchment.
Keywords: catchment, digital elevation model, hydrological model, routing, runoff
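The runoff and routing steps described above follow well-established formulations; a minimal sketch of the SCS curve number runoff depth and a Muskingum-type routing pass is given below. The parameter values are placeholders for illustration and are not the calibrated values for the Luvuvhu catchment (a full Muskingum-Cunge implementation would derive K and X from channel geometry and wave celerity rather than fix them).

```python
def scs_runoff_mm(p_mm: float, cn: float) -> float:
    """SCS curve number runoff depth (mm): Q = (P - 0.2S)^2 / (P + 0.8S), with S = 25400/CN - 254."""
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s                       # initial abstraction
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

def muskingum_route(inflow, k_hr: float, x: float, dt_hr: float):
    """Muskingum routing O_{t+1} = C1*I_{t+1} + C2*I_t + C3*O_t with assumed constant K and X."""
    denom = 2 * k_hr * (1 - x) + dt_hr
    c1 = (dt_hr - 2 * k_hr * x) / denom
    c2 = (dt_hr + 2 * k_hr * x) / denom
    c3 = (2 * k_hr * (1 - x) - dt_hr) / denom
    outflow = [inflow[0]]
    for t in range(1, len(inflow)):
        outflow.append(c1 * inflow[t] + c2 * inflow[t - 1] + c3 * outflow[-1])
    return outflow

print(round(scs_runoff_mm(p_mm=45.0, cn=78.0), 1))             # event runoff depth in mm
print([round(q, 1) for q in muskingum_route([5, 40, 90, 60, 25, 10], k_hr=2.0, x=0.2, dt_hr=1.0)])
```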
Procedia PDF Downloads 565
133 A Hybrid Block Multistep Method for Direct Numerical Integration of Fourth Order Initial Value Problems
Authors: Adamu S. Salawu, Ibrahim O. Isah
Abstract:
A direct solution to several forms of fourth-order ordinary differential equations is not easily obtained without first reducing them to a system of first-order equations. Thus, numerical methods are being developed, building on techniques in the literature, which seek to approximate some classes of fourth-order initial value problems within admissible error bounds. Multistep methods present the great advantage of ease of implementation, but with the setback of several function evaluations at every stage of implementation. However, hybrid methods conventionally show a slightly higher order of truncation than any k-step linear multistep method, with the possibility of obtaining solutions at off-mesh points within the interval of solution. In light of the foregoing, we propose the continuous form of a hybrid multistep method with a Chebyshev polynomial as basis function for the numerical integration of fourth-order initial value problems of ordinary differential equations. The basis function is interpolated and collocated at some points on the interval [0, 2] to yield a system of equations, which is solved to obtain the unknowns of the approximating polynomial. The continuous form obtained and its first and second derivatives are evaluated at carefully chosen points to obtain the proposed block method needed to directly approximate fourth-order initial value problems. The method is analyzed for convergence. Implementation of the method is carried out by conducting numerical experiments on some test problems. The outcome suggests that the method performs well on problems with oscillatory or trigonometric terms, since the approximations at several points on the solution domain did not deviate too far from the theoretical solutions. The method also shows better performance compared with an existing hybrid method when implemented on a larger interval of solution.
Keywords: Chebyshev polynomial, collocation, hybrid multistep method, initial value problems, interpolation
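For reference, the class of problems targeted and the Chebyshev trial solution used in derivations of this kind can be written as below; the exact numbers of interpolation and collocation points are not fixed in the abstract, so they are left symbolic here.

\[
y^{(iv)}(x) = f\!\left(x,\, y,\, y',\, y'',\, y'''\right), \qquad
y(x_0)=\alpha_0,\; y'(x_0)=\alpha_1,\; y''(x_0)=\alpha_2,\; y'''(x_0)=\alpha_3,
\]
\[
y(x) \;\approx\; \sum_{j=0}^{r+s-1} a_j\,T_j(x),
\]

where the trial solution is interpolated at r points and its fourth derivative is collocated (set equal to f) at s points in [0, 2]; solving the resulting algebraic system for the coefficients a_j gives the continuous scheme whose evaluation at mesh and off-mesh points yields the block method.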
Procedia PDF Downloads 121