Search results for: convergence and smoothness
Paper Count: 642


522 Efficacy Testing of a Product in Reducing Facial Hyperpigmentation and Photoaging after a 12-Week Use

Authors: Nalini Kaul, Barrie Drewitt, Elsie Kohoot

Abstract:

Hyperpigmentation is the third most common pigmentary disorder for which dermatologic treatment is sought. It affects all ages, resulting in skin darkening because of melanin accumulation. An uneven skin tone caused by sun exposure (solar lentigos/age spots/sun spots), by skin disruption following acne or rashes (post-inflammatory hyperpigmentation, PIH), or by hormonal changes (melasma) can lead to significant psychosocial impairment. Dyschromia is a result of various alterations in the biochemical processes regulating melanogenesis. Treatments include the daily use of sunscreen with lightening, brightening, and exfoliating products. Depigmentation is achieved by various depigmenting agents: common examples are hydroquinone, arbutin, azelaic acid, aloesin, mulberry, licorice extracts, kojic acid, niacinamide, ellagic acid, green tea, turmeric, soy, ascorbic acid, and tranexamic acid. These agents affect pigmentation by interfering with mechanisms before, during, and after melanin synthesis. While immediate correction is much sought after, patience and diligence are key. Our objective was to assess the effects of a facial product with pigmentation treatment and UV protection in 35 healthy females (35-65 y) meeting the study criteria. Subjects with mild to moderate hyperpigmentation and fine lines, with no use of skin-lightening products in the last six months and no dermatological procedures in the last twelve months before the study started, were included. Efficacy parameters included expert clinical grading for hyperpigmentation, radiance, skin tone and smoothness, and fine lines and wrinkles; bioinstrumentation (Corneometer®, Colorimeter®); digital photography and imaging (Visia-CR®); and self-assessment questionnaires. Safety assessments included grading for erythema, edema, dryness, and peeling, and self-assessments for itching, stinging, tingling, and burning. Our results showed statistically significant improvement in clinical grading scores, bioinstrumentation, and digital photos for hyperpigmentation (brown spots), fine lines/wrinkles, skin tone, radiance, pores, skin smoothness, and overall appearance compared to baseline. The product was also well tolerated and liked by subjects. Conclusion: Facial hyperpigmentation is of great concern, and treatment strategies are increasingly sought. Clinical trials with both subjective and objective assessments, imaging analyses, and self-perception are essential to distinguish evidence-based products. The multifunctional cosmetic product tested in this clinical study showed efficacy, tolerability, and subject satisfaction in reducing hyperpigmentation and global photoaging.

Keywords: hyperpigmentation, photoaging, clinical testing, expert visual evaluations, bio-instruments

Procedia PDF Downloads 77
521 The Application of Creative Economy in National R&D Programs of Health Technology (HT) Area in Korea

Authors: Hong Bum Kim

Abstract:

The health technology (HT) area has high growth potential because of global trends such as population ageing and economic development. Because of its high employment effect and capacity for creating new business, HT is considered one of the major next-generation growth engines. In particular, convergence technologies that emerge from the fusion of HT with other technological areas are emphasized for new industry creation in Korea as part of the Creative Economy. In this study, the current status of the HT area in Korea is analyzed. The transition in the technological areas emphasized by HT-related national R&D programs is statistically reviewed. The current level of HT-related technologies such as BT, IT, and NT is investigated in this context. The existing research system for HT-convergence technology development, such as the establishment of research centers, is also analyzed. Finally, the proposed research support system, such as legislation for developing the HT area as one of the main components of the Creative Economy in Korea, is analyzed. This analysis of technology trends and policy will help to draw a new direction for the progression of national R&D programs in the HT area. Policy improvements, such as legal system reorganization and measures for social agreement on cost burdens, can be deduced from these results.

Keywords: HT, creative economy, policy, national R&D programs

Procedia PDF Downloads 387
520 Smartphone Photography in Urban China

Authors: Wen Zhang

Abstract:

The smartphone plays a significant role in media convergence, and smartphone photography is reconstructing the way we communicate and think. This article explores the smartphone photography practices of urban Chinese smartphone users and the images produced by smartphones from a techno-cultural perspective. The analysis draws on two types of data: semi-structured interviews with 21 participants and the images created by the participants. The findings are organised in two parts. The first part summarises current tendencies in capturing, editing, sharing, and archiving digital images via smartphones. The second part shows that food and the selfie/anti-selfie are the preferred subjects of smartphone photographic images from a technical and multi-purpose perspective, and demonstrates that screenshots and image-texts are new genres of non-photographic images frequently made with smartphones, which contribute to improving operational efficiency, disseminating information, and sharing knowledge. The analyses illustrate the positive interplay between smartphones and photography enthusiasm and practice, drawing on the diffusion of innovations theory, and prompt us to rethink the value of photographs and the practice of ‘photographic seeing’ from the screen itself.

Keywords: digital photography, image-text, media convergence, photographic-seeing, selfie/anti-selfie, smartphone, technological innovation

Procedia PDF Downloads 354
519 Modelling of Structures by Advanced Finite Elements Based on the Strain Approach

Authors: Sifeddine Abderrahmani, Sonia Bouafia

Abstract:

The finite element method is the most practical tool for the analysis of structures, whatever their geometrical shape and behavior. It is extensively used in many high-tech industries, such as civil or military engineering, for the modeling of bridges, motor bodies, fuselages, and airplane wings. Experience also shows that engineers prefer to model their structures using the simplest finite elements. Numerous finite element models may be used in numerical analysis depending on the interpolation field that is selected, and it is well known that convergence to the proper value occurs considerably more quickly with a good displacement pattern than with a poor one, saving computation time. This work presents the method for creating finite elements using the strain approach (S.B.A.). Application and validation tests using recently developed membrane elements, plate bending elements, and flat shell elements show that excellent convergence can be obtained when the results are compared with those of equivalent displacement-based elements having the same total number of degrees of freedom. The findings for deflections and stresses demonstrate the effectiveness and performance of the strain-based finite elements in modeling structures.

Keywords: finite elements, plate bending, strain approach, displacement formulation, shell element

Procedia PDF Downloads 99
518 Particle Filter State Estimation Algorithm Based on Improved Artificial Bee Colony Algorithm

Authors: Guangyuan Zhao, Nan Huang, Xuesong Han, Xu Huang

Abstract:

In order to solve the problem of sample dilution in the traditional particle filter algorithm and achieve accurate state estimation in a nonlinear system, a particle filter method based on an improved artificial bee colony (ABC) algorithm is proposed. The algorithm simulates the bee foraging and optimization process and moves particles toward the high-likelihood region of the posterior probability to improve the rationality of the particle distribution. An opposition-based learning (OBL) strategy is introduced to optimize the initial population of the artificial bee colony algorithm. A convergence factor is introduced into the neighborhood search strategy to limit the search range and improve the convergence speed. Finally, the crossover and mutation operations of the genetic algorithm are introduced into the search mechanism of the onlooker bee, which enables the algorithm to quickly escape local extrema and continue searching for the global extremum, improving its optimization ability. The simulation results show that the improved method can improve the estimation accuracy of the particle filter, ensure the diversity of particles, and improve the rationality of the particle distribution.
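As an illustration of one of the ingredients described above, the following minimal Python sketch shows how an opposition-based learning (OBL) strategy can seed a bee-colony population: for every random food source its opposite point is also evaluated, and the better half of the combined pool is kept. The function names and the sphere test objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def obl_initialize(objective, n_food, dim, lower, upper, rng=None):
    """Opposition-based learning initialization for an ABC-style population.

    Generates n_food random candidates plus their opposites
    x_opp = lower + upper - x, then keeps the best n_food of the 2*n_food.
    """
    rng = np.random.default_rng(rng)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)

    x = lower + rng.random((n_food, dim)) * (upper - lower)   # random candidates
    x_opp = lower + upper - x                                 # opposite candidates

    pool = np.vstack([x, x_opp])
    fitness = np.array([objective(p) for p in pool])
    best = np.argsort(fitness)[:n_food]                       # keep the best half
    return pool[best], fitness[best]

# Example: initialize 20 food sources for a 4-D sphere objective.
sphere = lambda p: float(np.sum(p**2))
population, fit = obl_initialize(sphere, n_food=20, dim=4,
                                 lower=[-5]*4, upper=[5]*4, rng=0)
print(population.shape, fit.min())
```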

Keywords: particle filter, impoverishment, state estimation, artificial bee colony algorithm

Procedia PDF Downloads 152
517 The Role of Metaheuristic Approaches in Engineering Problems

Authors: Ferzat Anka

Abstract:

Many types of problems can be solved using traditional analytical methods. However, these methods take a long time and cause inefficient use of resources. In particular, different approaches may be required for solving the complex and global engineering problems that we frequently encounter in real life. The bigger and more complex a problem, the harder it is to solve. Such problems are called NP-hard (non-deterministic polynomial-time hard) in the literature. The main reasons for recommending metaheuristic algorithms for such problems are their use of simple concepts, simple mathematical equations and structures, and derivative-free mechanisms, their ability to avoid local optima, and their fast convergence. They are also flexible, as they can be applied to different problems without very specific modifications. Thanks to these features, they can easily be embedded in many hardware devices. Accordingly, this approach can also be used in trending application areas such as IoT, big data, and parallel architectures. Indeed, metaheuristic approaches are algorithms that return near-optimal results for large-scale optimization problems. This study focuses on a new metaheuristic method merged with a chaotic approach. It is based on chaos theory and helps the underlying algorithm improve population diversity and convergence speed. The approach builds on the Chimp Optimization Algorithm (ChOA), a recently introduced nature-inspired metaheuristic. ChOA identifies four types of chimpanzee groups (attacker, barrier, chaser, and driver) and proposes a suitable mathematical model for them based on the various intelligence and sexual motivations of chimpanzees. However, this algorithm is less successful in terms of convergence rate and escaping local-optimum traps when solving high-dimensional problems. Although ChOA and some of its variants use strategies to overcome these problems, they are observed to be insufficient. Therefore, a newly expanded variant is described in this study. In the algorithm, called Ex-ChOA, hybrid models are proposed for the position updates of search agents, and a dynamic switching mechanism is provided for the transition phases. This flexible structure solves the slow-convergence problem of ChOA and improves its accuracy in multidimensional problems, aiming at success in solving global, complex, and constrained problems. The main contributions of this study are: 1) it improves the accuracy and solves the slow-convergence problem of ChOA; 2) it proposes new hybrid movement strategy models for the position updates of search agents; 3) it achieves success in solving global, complex, and constrained problems; 4) it provides a dynamic switching mechanism between phases. The performance of the Ex-ChOA algorithm is analyzed on a total of 8 benchmark functions, as well as 2 classical constrained engineering problems. The proposed algorithm is compared with ChOA and several well-known variants (Weighted-ChOA, Enhanced-ChOA). In addition, the Improved Grey Wolf Optimizer (I-GWO) is chosen for comparison since its working model is similar. The results obtained show that the proposed algorithm performs better than or equivalently to the compared algorithms.
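As a hedged illustration of the chaotic ingredient mentioned above, the sketch below replaces uniform random coefficients in a generic position update with a logistic-map chaotic sequence, which is a common way to improve population diversity. The update rule shown is a generic drift toward the best agent, not the actual Ex-ChOA equations, which the abstract does not specify.

```python
import numpy as np

def logistic_map(n, x0=0.7, r=4.0):
    """Generate a chaotic sequence in (0, 1) with the logistic map."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def chaotic_perturbation_step(positions, best, step=0.1):
    """Move agents toward the current best using chaotic coefficients
    instead of uniform random numbers (generic chaos-enhanced update)."""
    n, dim = positions.shape
    c = logistic_map(n * dim).reshape(n, dim)         # chaotic coefficients
    return positions + step * c * (best - positions)  # drift toward the best agent

# Example usage on a random 10-agent, 5-dimensional population.
rng = np.random.default_rng(1)
pop = rng.uniform(-1, 1, size=(10, 5))
best = pop[np.argmin(np.sum(pop**2, axis=1))]
pop = chaotic_perturbation_step(pop, best)
print(np.round(pop[0], 3))
```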

Keywords: optimization, metaheuristic, chimp optimization algorithm, engineering constrained problems

Procedia PDF Downloads 77
516 Federated Knowledge Distillation with Collaborative Model Compression for Privacy-Preserving Distributed Learning

Authors: Shayan Mohajer Hamidi

Abstract:

Federated learning has emerged as a promising approach for distributed model training while preserving data privacy. However, the challenges of communication overhead, limited network resources, and slow convergence hinder its widespread adoption. On the other hand, knowledge distillation has shown great potential in compressing large models into smaller ones without significant loss in performance. In this paper, we propose an innovative framework that combines federated learning and knowledge distillation to address these challenges and enhance the efficiency of distributed learning. Our approach, called Federated Knowledge Distillation (FKD), enables multiple clients in a federated learning setting to collaboratively distill knowledge from a teacher model. By leveraging the collaborative nature of federated learning, FKD aims to improve model compression while maintaining privacy. The proposed framework utilizes a coded teacher model that acts as a reference for distilling knowledge to the client models. To demonstrate the effectiveness of FKD, we conduct extensive experiments on various datasets and models. We compare FKD with baseline federated learning methods and standalone knowledge distillation techniques. The results show that FKD achieves superior model compression, faster convergence, and improved performance compared to traditional federated learning approaches. Furthermore, FKD effectively preserves privacy by ensuring that sensitive data remains on the client devices and only distilled knowledge is shared during the training process. In our experiments, we explore different knowledge transfer methods within the FKD framework, including Fine-Tuning (FT), FitNet, Correlation Congruence (CC), Similarity-Preserving (SP), and Relational Knowledge Distillation (RKD). We analyze the impact of these methods on model compression and convergence speed, shedding light on the trade-offs between size reduction and performance. Moreover, we address the challenges of communication efficiency and network resource utilization in federated learning by leveraging the knowledge distillation process. FKD reduces the amount of data transmitted across the network, minimizing communication overhead and improving resource utilization. This makes FKD particularly suitable for resource-constrained environments such as edge computing and IoT devices. The proposed FKD framework opens up new avenues for collaborative and privacy-preserving distributed learning. By combining the strengths of federated learning and knowledge distillation, it offers an efficient solution for model compression and convergence speed enhancement. Future research can explore further extensions and optimizations of FKD, as well as its applications in domains such as healthcare, finance, and smart cities, where privacy and distributed learning are of paramount importance.
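For context, the sketch below shows a standard softened-logit (Hinton-style) distillation loss and a local client update against a shared teacher, which is the generic mechanism such a framework builds on. The temperature, weighting, and model/loader objects are illustrative assumptions, not the FKD specification.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Standard softened-logit knowledge distillation loss.

    Combines a KL term between temperature-softened teacher and student
    distributions with an ordinary cross-entropy term on the hard labels.
    """
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

def client_update(student, teacher, loader, epochs=1, lr=1e-3, device="cpu"):
    """One local round: the client trains its student against the shared teacher,
    and only the (small) student weights ever leave the device."""
    opt = torch.optim.SGD(student.parameters(), lr=lr)
    teacher.eval()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                t_logits = teacher(x)
            loss = distillation_loss(student(x), t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student.state_dict()
```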

Keywords: federated learning, knowledge distillation, knowledge transfer, deep learning

Procedia PDF Downloads 75
515 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation

Authors: Rizwan Rizwan

Abstract:

This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL utilizes three distinct architectures to ensure privacy is never compromised. It involves training individual edge devices and aggregating their models on a server without sharing raw data. This approach not only provides secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. To address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme that combines an adaptive gradient descent strategy with a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background knowledge attacks.
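As a minimal sketch of the differential privacy ingredient, assuming a DP-SGD-style mechanism, the code below clips per-example gradients and adds Gaussian noise before an update leaves the client; the clip norm and noise multiplier are illustrative values, not parameters specified in the paper.

```python
import numpy as np

def privatize_update(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip each per-example gradient to clip_norm, average, and add
    Gaussian noise calibrated to the clip norm (DP-SGD style mechanism)."""
    rng = np.random.default_rng(rng)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))  # norm clipping
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=mean_grad.shape)
    return mean_grad + noise

# Example: privatize a batch of 32 gradients of a 10-parameter model.
rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 10))
noisy_update = privatize_update(list(grads), clip_norm=1.0, noise_multiplier=1.1, rng=1)
print(np.linalg.norm(noisy_update))
```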

Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats

Procedia PDF Downloads 30
514 The Association of Southeast Asian Nations (ASEAN) and the Dynamics of Resistance to Sovereignty Violation: The Case of East Timor (1975-1999)

Authors: Laura Southgate

Abstract:

The Association of Southeast Asian Nations (ASEAN), as well as much of the scholarship on the organisation, celebrates its ability to uphold the principle of regional autonomy, understood as upholding the norm of non-intervention by external powers in regional affairs. Yet, in practice, this has been repeatedly violated. This dichotomy between rhetoric and practice suggests an interesting avenue for further study. The East Timor crisis (1975-1999) has been selected as a case-study to test the dynamics of ASEAN state resistance to sovereignty violation in two distinct timeframes: Indonesia’s initial invasion of the territory in 1975, and the ensuing humanitarian crisis in 1999 which resulted in a UN-mandated, Australian-led peacekeeping intervention force. These time-periods demonstrate variation on the dependent variable. It is necessary to observe covariation in order to derive observations in support of a causal theory. To establish covariation, my independent variable is therefore a continuous variable characterised by variation in convergence of interest. Change of this variable should change the value of the dependent variable, thus establishing causal direction. This paper investigates the history of ASEAN’s relationship to the norm of non-intervention. It offers an alternative understanding of ASEAN’s history, written in terms of the relationship between a key ASEAN state, which I call a ‘vanguard state’, and selected external powers. This paper will consider when ASEAN resistance to sovereignty violation has succeeded, and when it has failed. It will contend that variation in outcomes associated with vanguard state resistance to sovereignty violation can be best explained by levels of interest convergence between the ASEAN vanguard state and designated external actors. Evidence will be provided to support the hypothesis that in 1999, ASEAN’s failure to resist violations to the sovereignty of Indonesia was a consequence of low interest convergence between Indonesia and the external powers. Conversely, in 1975, ASEAN’s ability to resist violations to the sovereignty of Indonesia was a consequence of high interest convergence between Indonesia and the external powers. As the vanguard state, Indonesia was able to apply pressure on the ASEAN states and obtain unanimous support for Indonesia’s East Timor policy in 1975 and 1999. However, the key factor explaining the variance in outcomes in both time periods resides in the critical role played by external actors. This view represents a serious challenge to much of the existing scholarship that emphasises ASEAN’s ability to defend regional autonomy. As these cases attempt to show, ASEAN autonomy is much more contingent than portrayed in the existing literature.

Keywords: ASEAN, East Timor, intervention, sovereignty

Procedia PDF Downloads 358
513 Design of 3-Step Skew BLAC Motor for Better Performance in Electric Power Steering System

Authors: Subrato Saha, Yun-Hyun Cho

Abstract:

In electric power steering (EPS), spoke-type brushless AC (BLAC) motors offer distinct advantages over other electric motor types in terms of torque smoothness, reliability, and efficiency. This paper deals with the shape optimization of a spoke-type BLAC motor in order to reduce cogging torque. It examines 3-step rotor skewing, rotor core edge optimization, and rotor overlap length for reducing cogging torque in the spoke-type BLAC motor. The methods were applied to existing machine designs, and their performance was calculated using finite-element analysis (FEA). Prototypes of the machine designs were constructed and experimental results obtained. It is shown that the FEA predicted the cogging torque to be considerably reduced using these methods.

Keywords: EPS, 3-Step skewing, spoke type BLAC, cogging torque, FEA, optimization

Procedia PDF Downloads 491
512 Upgraded Cuckoo Search Algorithm to Solve Optimisation Problems Using Gaussian Selection Operator and Neighbour Strategy Approach

Authors: Mukesh Kumar Shah, Tushar Gupta

Abstract:

An Upgraded Cuckoo Search Algorithm is proposed here to solve optimization problems, based on improvements to earlier versions of the Cuckoo Search Algorithm. Shortcomings of the earlier versions, such as slow convergence and trapping in local optima, are addressed in the proposed version by random initialization of solutions through an Improved Lambda Iteration Relaxation method, a Random Gaussian Distribution Walk to improve local search, Greedy Selection to accelerate convergence to the optimized solution, and a "Study Nearby Strategy" to improve global search performance by avoiding local optima. Better solutions are further generated by a crossover operation. The proposed strategy shows superiority in convergence speed over several classical algorithms. Three standard algorithms were tested on a 6-generator standard test system, and the presented results clearly demonstrate the superiority of the proposed algorithm over other established algorithms. The algorithm is also capable of handling larger unit systems.
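The sketch below illustrates two of the listed ingredients, a Gaussian random-walk local search followed by greedy selection, in a generic form; the step size and the sphere objective are assumptions, and the Improved Lambda Iteration Relaxation and crossover operators are not shown.

```python
import numpy as np

def gaussian_walk_with_greedy_selection(nests, objective, sigma=0.05, rng=None):
    """Perturb each nest with a Gaussian random walk and keep the move
    only if it improves the objective (greedy selection)."""
    rng = np.random.default_rng(rng)
    new_nests = nests.copy()
    for i, nest in enumerate(nests):
        candidate = nest + sigma * rng.standard_normal(nest.shape)  # local Gaussian step
        if objective(candidate) < objective(nest):                  # greedy acceptance
            new_nests[i] = candidate
    return new_nests

# Example: one local-search pass on 15 nests for a 3-D sphere function.
rng = np.random.default_rng(2)
nests = rng.uniform(-2, 2, size=(15, 3))
nests = gaussian_walk_with_greedy_selection(nests, lambda x: float(np.sum(x**2)), rng=3)
print(np.round(nests[0], 3))
```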

Keywords: economic dispatch, gaussian selection operator, prohibited operating zones, ramp rate limits

Procedia PDF Downloads 130
511 A Combined High Gain-Higher Order Sliding Mode Controller for a Class of Uncertain Nonlinear Systems

Authors: Abderraouf Gaaloul, Faouzi Msahli

Abstract:

The use of a standard sliding mode controller usually leads to the appearance of an undesirable chattering phenomenon affecting the control signal. Such a problem can be overcome using a higher-order sliding mode controller (HOSMC), which preserves the main properties of the standard sliding mode and deliberately increases the control smoothness. In this paper, we propose a new HOSMC for a class of uncertain multi-input multi-output nonlinear systems. Based on high-gain and integral sliding mode paradigms, the established control scheme theoretically removes the chattering phenomenon and provides the stability of the control system. Numerical simulations are developed to show the effectiveness of the proposed controller when applied to a control problem of two water levels in a quadruple-tank process.
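For orientation, the following sketch implements the classical super-twisting algorithm, a standard second-order sliding mode law that yields a continuous control signal and therefore attenuates chattering; it is shown as a generic HOSMC example and is not the controller proposed in this paper.

```python
import numpy as np

def super_twisting(s, v, k1=1.5, k2=1.1, dt=1e-3):
    """One step of the classical super-twisting second-order sliding mode law:
        u  = -k1 * sqrt(|s|) * sign(s) + v,    v_dot = -k2 * sign(s).
    The resulting control u is continuous, which attenuates chattering."""
    u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
    v_next = v - k2 * np.sign(s) * dt
    return u, v_next

# Example: regulate a scalar plant x_dot = u + d with a slowly varying disturbance.
x, v, dt = 1.0, 0.0, 1e-3
for k in range(5000):
    u, v = super_twisting(x, v, dt=dt)
    d = 0.3 * np.sin(0.5 * k * dt)     # matched disturbance, slowly varying
    x += (u + d) * dt                  # plant: x_dot = u + d
print(round(x, 4))                     # x is driven close to zero
```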

Keywords: nonlinear systems, sliding mode control, high gain, higher order

Procedia PDF Downloads 327
510 A New Approach for Solving Fractional Coupled PDEs

Authors: Prashant Pandey

Abstract:

In the present article, an effective Laguerre collocation method is used to obtain the approximate solution of a system of coupled fractional-order non-linear reaction-advection-diffusion equations with prescribed initial and boundary conditions. In the proposed scheme, Laguerre polynomials are used together with an operational matrix and a collocation method, so that the coupled system is converted into a system of algebraic equations which can be solved by the Newton method. The solution profiles of the coupled system are presented graphically for different particular cases. The salient features of the present article are the stability analysis of the proposed method and the demonstration of the lower variation of solute concentrations with respect to the column length in the fractional-order system compared to the integer-order system. To show the efficiency, reliability, and accuracy of the proposed scheme, a comparison between the numerical results for the coupled Burgers system and its existing analytical result is reported. There is high compatibility and consistency between the approximate solution and the exact solution to a high order of accuracy. The error analysis for each case, presented through tables and graphs, confirms the super-linear convergence rate of the proposed method.
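As a small sketch of the collocation machinery, assuming plain (integer-order) derivatives and NumPy's Laguerre utilities, the code below builds the matrices of Laguerre polynomial values and derivatives at Laguerre-Gauss nodes; in an operational-matrix scheme, such matrices turn a differential problem into an algebraic system. The fractional operators and the coupled system itself are beyond this fragment.

```python
import numpy as np
from numpy.polynomial import laguerre as L

def laguerre_collocation_matrices(n):
    """Return Laguerre-Gauss nodes x, the matrix V with V[i, j] = L_j(x_i),
    and the derivative matrix D with D[i, j] = L_j'(x_i), for j = 0..n-1."""
    x, _ = L.laggauss(n)                      # Gauss-Laguerre collocation nodes
    V = np.zeros((n, n))
    D = np.zeros((n, n))
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0                            # coefficients selecting L_j
        V[:, j] = L.lagval(x, c)
        D[:, j] = L.lagval(x, L.lagder(c))    # derivative of L_j at the nodes
    return x, V, D

# Example: the matrices differentiate low-degree polynomials; check u(x) = x**2 - 3x.
x, V, D = laguerre_collocation_matrices(6)
coeffs = np.linalg.solve(V, x**2 - 3*x)        # expand u in the Laguerre basis
print(np.max(np.abs(D @ coeffs - (2*x - 3))))  # residual limited only by roundoff
```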

Keywords: fractional coupled PDE, stability and convergence analysis, diffusion equation, Laguerre polynomials, spectral method

Procedia PDF Downloads 145
509 A Study of Using Multiple Subproblems in Dantzig-Wolfe Decomposition of Linear Programming

Authors: William Chung

Abstract:

This paper studies the use of multiple subproblems in Dantzig-Wolfe decomposition of linear programming (DW-LP). Traditionally, the decomposed LP consists of one LP master problem and one LP subproblem. The master problem and the subproblem are solved alternately by exchanging the dual prices of the master problem and the proposals of the subproblem until the LP is solved. It is well known that convergence is slow, with a long tail of near-optimal solutions (asymptotic convergence). Hence, the performance of DW-LP highly depends upon the number of decomposition steps. If the number of decomposition steps can be greatly reduced, the performance of DW-LP can be improved significantly. One way to reduce the number of decomposition steps is to increase the number of proposals passed from the subproblem to the master problem. To do so, we propose to add a quadratic approximation function to the LP subproblem in order to develop a set of approximate-LP subproblems (multiple subproblems). Consequently, in each decomposition step, multiple subproblems are solved to provide multiple proposals to the master problem, and the number of decomposition steps can be reduced greatly. Note that each approximate subproblem is a nonlinear program, and solving the LP subproblem is faster than solving the nonlinear multiple subproblems. Hence, using multiple subproblems in DW-LP involves a tradeoff between the number of approximate-LP subproblems being formed and the number of decomposition steps. In this paper, we derive the corresponding algorithms and provide some simple computational results. Some properties of the resulting algorithms are also given.

Keywords: approximate subproblem, Dantzig-Wolfe decomposition, large-scale models, multiple subproblems

Procedia PDF Downloads 166
508 Transition Dynamics Analysis of Urban Disparity in Iran (Case Study: Iranian Provincial Centers)

Authors: Marzieh Ahmadi, Ruhullah Alikhan Gorgani

Abstract:

The usual methods of measuring regional inequality cannot reflect internal changes within a country in terms of the movement of regions between different development groups, and aggregate inequality indicators are not effective in demonstrating the dynamics of the distribution of inequality. For this purpose, this paper examines the dynamics of the urban inequality transition in Iran during the period 2006-2016 using the CIRD multidimensional index and a stochastic kernel density method. First, 25 indicators are selected in five dimensions, including macroeconomic conditions, science and innovation, environmental sustainability, human capital, and public facilities, and a two-stage Principal Component Analysis methodology is developed to create a composite index of inequality. In the second stage, using a nonparametric analytical approach to internal distribution dynamics and the stochastic kernel density method, the convergence hypothesis of the CIRD index for the centers of Iran's provinces is tested, and the long-run equilibrium is then shown on the basis of the ergodic density. At this stage, for the purpose of adopting accurate regional policies, the distribution dynamics and the process of convergence or divergence of the Iranian provinces are also examined for each of the five dimensions. According to the results of the first stage, in both 2006 and 2016 the highest level of development belongs to Tehran, while Zahedan is at the lowest level of development. The results show that the central cities of the country are at the highest level of development, owing to the effects of Tehran's knowledge spillover, while the country's peripheral cities are at the lowest level; the main reason for this may be the lack of access to markets in the border provinces. Based on the results of the second stage, which examines the dynamics of regional inequality transmission during 2006-2016, the distribution in the first year (2006) is not multimodal: according to the kernel density graph, the CIRD index of about 70% of the cities lies between -1.1 and -0.1, and the rest of the distribution on the right lies above -0.1. A convergence process is observed in the kernel distribution, and the graph points to a single peak; there is a small peak at about 3, but the main peak is at about -0.6. In the final year (2016), the multidimensional pattern remains and there is no mobility in the lower-level groups, but at the higher level the CIRD index of about 45% of the provinces takes values around -0.4. This year clearly shows a twin-peak density pattern, which indicates that cities tend to cluster into groups in terms of development, with many cities remaining at low development levels. According to the distribution dynamics results, the provinces of Iran therefore follow a single-peak density pattern in 2006 and a double-peak density pattern in 2016, at low and moderate levels of the inequality index and also in the development index, indicating that the country diverged during the years 2006 to 2016.
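As a schematic sketch of the two generic tools named above, a PCA-based composite index and kernel density estimates for two years, the following code uses synthetic placeholder data rather than the 25 CIRD indicators.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.stats import gaussian_kde

def composite_index(indicators):
    """Build a PCA-based composite index: standardize the indicator matrix
    (provinces x indicators) and score each province on the first component."""
    z = StandardScaler().fit_transform(indicators)
    return PCA(n_components=1).fit_transform(z).ravel()

# Synthetic placeholder data: 31 provinces, 25 indicators, two years.
rng = np.random.default_rng(0)
index_2006 = composite_index(rng.normal(size=(31, 25)))
index_2016 = composite_index(rng.normal(size=(31, 25)))

# Kernel density estimates of the index distribution in each year;
# a shift from one peak to two peaks would suggest club convergence/divergence.
grid = np.linspace(-4, 4, 200)
density_2006 = gaussian_kde(index_2006)(grid)
density_2016 = gaussian_kde(index_2016)(grid)
print(grid[np.argmax(density_2006)], grid[np.argmax(density_2016)])
```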

Keywords: urban disparity, CIRD index, convergence, distribution dynamics, stochastic kernel density

Procedia PDF Downloads 124
507 Implementation of a Lattice Boltzmann Method for Pulsatile Flow with Moment Based Boundary Condition

Authors: Zainab A. Bu Sinnah, David I. Graham

Abstract:

The Lattice Boltzmann Method has been developed and used to simulate both steady and unsteady fluid flow problems such as turbulent flows, multiphase flow and flows in the vascular system. As an example, the study of blood flow and its properties can give a greater understanding of atherosclerosis and the flow parameters which influence this phenomenon. The blood flow in the vascular system is driven by a pulsating pressure gradient which is produced by the heart. As a very simple model of this, we simulate plane channel flow under periodic forcing. This pulsatile flow is essentially the standard Poiseuille flow except that the flow is driven by the periodic forcing term. Moment boundary conditions, where various moments of the particle distribution function are specified, are applied at solid walls. We used a second-order single relaxation time model and investigated grid convergence using two distinct approaches. In the first approach, we fixed both Reynolds and Womersley numbers and varied relaxation time with grid size. In the second approach, we fixed the Womersley number and relaxation time. The expected second-order convergence was obtained for the second approach. For the first approach, however, the numerical method converged, but not necessarily to the appropriate analytical result. An explanation is given for these observations.
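For illustration, the fragment below shows a generic D2Q9 single-relaxation-time collision step with a pulsatile body force applied through the common equilibrium-velocity-shift scheme; it is a minimal sketch and does not implement the moment-based boundary conditions studied in the paper.

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities.
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def equilibrium(rho, u):
    """BGK equilibrium distributions for density rho (ny,nx) and velocity u (ny,nx,2)."""
    cu = np.einsum("qd,yxd->qyx", c, u)              # c_q . u
    usq = np.einsum("yxd,yxd->yx", u, u)
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def collide(f, tau, force, t, omega_f=0.1):
    """Single-relaxation-time collision with a pulsatile body force
    F(t) = force * cos(omega_f * t), applied by shifting the equilibrium velocity."""
    rho = f.sum(axis=0)
    u = np.einsum("qd,qyx->yxd", c, f) / rho[..., None]
    F = np.array([force * np.cos(omega_f * t), 0.0])
    u_eq = u + tau * F / rho[..., None]              # velocity-shift forcing
    return f - (f - equilibrium(rho, u_eq)) / tau

# Example: one collision step on a 32 x 64 lattice initially at rest.
ny, nx, tau = 32, 64, 0.8
f = equilibrium(np.ones((ny, nx)), np.zeros((ny, nx, 2)))
f = collide(f, tau, force=1e-5, t=0)
```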

Keywords: Lattice Boltzmann method, single relaxation time, pulsatile flow, moment based boundary condition

Procedia PDF Downloads 231
506 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with 5 parameters) as a theoretical reference model. The number and quality of its parameters indicate that this distribution may be an appropriate choice for the interpolation of hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena producing heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of moments weighted with probabilities (probability-weighted moments, PWM), although this has often shown difficulty of convergence or, rather, convergence to an inappropriate parameter configuration. In this paper, we analyze the problem of the likelihood estimation of a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation settings. The reasons for choosing it lie in the sampling and asymptotic properties of the maximum likelihood estimators, which improve the estimates obtained with indications of their variability and, therefore, their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
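As a minimal sketch, the code below implements the usual five-parameter Wakeby quantile function and draws a sample by inverse transform; the parameter values are illustrative, and the PWM or maximum likelihood fitting discussed above is not shown.

```python
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """Wakeby quantile function (five-parameter form):

        x(F) = xi + (alpha/beta) * (1 - (1-F)**beta)
                  - (gamma/delta) * (1 - (1-F)**(-delta))
    """
    F = np.asarray(F, dtype=float)
    return (xi
            + (alpha / beta) * (1.0 - (1.0 - F)**beta)
            - (gamma / delta) * (1.0 - (1.0 - F)**(-delta)))

# Inverse-transform sampling: the heavy upper tail is controlled mainly by delta.
rng = np.random.default_rng(0)
u = rng.random(10_000)
sample = wakeby_quantile(u, xi=0.0, alpha=5.0, beta=2.0, gamma=1.0, delta=0.2)
print(np.percentile(sample, [50, 90, 99]))
```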

Keywords: generalized extreme values, likelihood estimation, precipitation data, Wakeby distribution

Procedia PDF Downloads 139
505 A Convergent Interacting Particle Method for Computing KPP Front Speeds in Random Flows

Authors: Tan Zhang, Zhongjian Wang, Jack Xin, Zhiwen Zhang

Abstract:

We aim to efficiently compute the spreading speeds of reaction-diffusion-advection (RDA) fronts in divergence-free random flows under the Kolmogorov-Petrovsky-Piskunov (KPP) nonlinearity. We study a stochastic interacting particle method (IPM) for the reduced principal eigenvalue (Lyapunov exponent) problem of an associated linear advection-diffusion operator with spatially random coefficients. The Fourier representation of the random advection field and the Feynman-Kac (FK) formula of the principal eigenvalue (Lyapunov exponent) form the foundation of our method, implemented as a genetic evolution algorithm. The particles undergo advection-diffusion and mutation/selection through a fitness function originating in the FK semigroup. We analyze the convergence of the algorithm based on operator splitting and present numerical results on representative flows such as 2D cellular flow and 3D Arnold-Beltrami-Childress (ABC) flow under random perturbations. The 2D examples serve as a consistency check with semi-Lagrangian computation. The 3D results demonstrate that IPM, being mesh-free and self-adaptive, is simple to implement and efficient for computing front spreading speeds in the advection-dominated regime for high-dimensional random flows on unbounded domains where no truncation is needed.
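As a sketch of the advection-diffusion part of such a particle method, the code below takes Euler-Maruyama steps for particles in a 2D cellular flow; the Feynman-Kac mutation/selection step is omitted because its exact fitness form is not given in the abstract.

```python
import numpy as np

def cellular_velocity(x, A=1.0):
    """Divergence-free 2D cellular flow with stream function psi = A*sin(x)*sin(y):
       v = (-d(psi)/dy, d(psi)/dx)."""
    return np.stack([-A * np.sin(x[:, 0]) * np.cos(x[:, 1]),
                      A * np.cos(x[:, 0]) * np.sin(x[:, 1])], axis=1)

def advect_diffuse(x, dt, kappa, rng):
    """One Euler-Maruyama step of dX = v(X) dt + sqrt(2*kappa) dW."""
    return (x + cellular_velocity(x) * dt
            + np.sqrt(2.0 * kappa * dt) * rng.standard_normal(x.shape))

# Example: evolve 5000 particles; selection/resampling driven by a Feynman-Kac
# fitness would be interleaved with these steps in a full IPM.
rng = np.random.default_rng(0)
particles = rng.uniform(0, 2*np.pi, size=(5000, 2))
for _ in range(200):
    particles = advect_diffuse(particles, dt=0.01, kappa=0.1, rng=rng)
print(particles.mean(axis=0))
```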

Keywords: KPP front speeds, random flows, Feynman-Kac semigroups, interacting particle method, convergence analysis

Procedia PDF Downloads 46
504 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization

Authors: Hebberly Ahatlan

Abstract:

The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability — particularly in the context of modern energy business models, such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, IT/OT convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper also proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.

Keywords: digitalization, IT/OT convergence, semantic interoperability, VPP, energy blockchain

Procedia PDF Downloads 183
503 The Influence of the Form of Grain on the Mechanical Behaviour of Sand

Authors: Mohamed Boualem Salah

Abstract:

The size and shape of soil particles reflect the formation history of the grains. In turn, the macro-scale behavior of the soil mass results from particle-level interactions which are affected by particle shape. Sphericity, roundness, and smoothness characterize different scales associated with particle shape. New experimental data and data from previously published studies are gathered into two databases to explore the effects of particle shape on packing as well as on the small- and large-strain properties of sandy soils. Data analysis shows that increased particle irregularity (angularity and/or eccentricity) leads to: an increase in emax and emin, a decrease in stiffness yet with increased sensitivity to the state of stress, an increase in compressibility under zero-lateral-strain loading, and an increase in the critical state friction angle φcs and intercept Γ with a weak effect on the slope λ. Therefore, particle shape emerges as a significant soil index property that needs to be properly characterized and documented, particularly in clean sands and gravels. The systematic assessment of particle shape will lead to a better understanding of sand behavior.

Keywords: angularity, eccentricity, particle shape, behavior of soil

Procedia PDF Downloads 414
502 Deposition of Diamond-Like Carbon Thin Film by Pulsed Laser Deposition for Surgical Instruments

Authors: M. Khalid Alamgir, Javed Ahsan Bhatti, M. Zafarullah Khan

Abstract:

A thin film of diamond-like amorphous carbon (DLC) was deposited on 316 steel using an Nd:YAG laser with a pulse energy of 300 mJ. Pure graphite was used as the target. A vacuum in the range of 10⁻⁶ mbar was generated in the deposition chamber by a turbomolecular pump. The ratio of sp³ to sp² content shows the amorphous nature of the film. This was confirmed by Raman spectra having two peaks, around 1300 cm⁻¹ (D-band) and around 1700 cm⁻¹ (G-band). If the sp³ bonding ratio is high, the films are diamond-like, whereas with high sp² content, the films are graphite-like. The ratio of sp³ to sp² content in the film depends upon the deposition method, hydrogen content, and system parameters. The structural study of the film was carried out by XRD. The hardness of the film, as measured by a Vickers hardness tester, was found to be 28 GPa. The EDX results show a high carbon content on the surface, and optical microscopy shows the smoothness of the film on the substrate. The film possesses good adhesion and can be used to coat surgical instruments.

Keywords: DLC, thin film, Raman spectroscopy, XRD, EDX

Procedia PDF Downloads 564
501 A Uniformly Convergent Numerical Scheme for a Singularly Perturbed Volterra Integrodifferential Equation

Authors: Nana Adjoah Mbroh, Suares Clovis Oukouomi Noutchie

Abstract:

Singularly perturbed problems are parameter dependent problems, and they play major roles in the modelling of real-life situational problems in applied sciences. Thus, designing efficient numerical schemes to solve these problems is of much interest since the exact solutions of such problems may not even exist. Generally, singularly perturbed problems are identified by a small parameter multiplying at least the highest derivative in the equation. The presence of this parameter causes the solution of these problems to be characterized by rapid oscillations. This unique feature renders classical numerical schemes inefficient since they are unable to capture the behaviour of the exact solution in the part of the domain where the rapid oscillations are present. In this paper, a numerical scheme is proposed to solve a singularly perturbed Volterra Integro-differential equation. The scheme is based on the midpoint rule and employs the non-standard finite difference scheme to solve the differential part whilst the composite trapezoidal rule is used for the integral part. A fully fledged error estimate is performed, and Richardson extrapolation is applied to accelerate the convergence of the scheme. Numerical simulations are conducted to confirm the theoretical findings before and after extrapolation.
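As a minimal sketch of the acceleration step, assuming a first-order uniformly convergent base scheme on nested grids with mesh ratio 2, the code below performs the Richardson extrapolation; the singularly perturbed solver itself is mimicked by a placeholder with a known O(1/N) error.

```python
import numpy as np

def richardson_extrapolate(u_coarse, u_fine, order=1):
    """Richardson extrapolation on nested grids with ratio 2.

    u_coarse: solution on N+1 points, u_fine: solution on 2N+1 points.
    Returns the extrapolated solution at the coarse-grid points:
        u_ext = (2**order * u_fine - u_coarse) / (2**order - 1).
    """
    u_fine_on_coarse = u_fine[::2]            # fine-grid values at coarse nodes
    return (2.0**order * u_fine_on_coarse - u_coarse) / (2.0**order - 1.0)

# Example with a known function: first-order "solutions" u_N = f(x) + C/N.
f = np.sin
N = 16
x_c, x_f = np.linspace(0, 1, N + 1), np.linspace(0, 1, 2*N + 1)
u_c = f(x_c) + 1.0 / N                        # mimics an O(1/N) error term
u_f = f(x_f) + 1.0 / (2*N)
u_ext = richardson_extrapolate(u_c, u_f, order=1)
print(np.max(np.abs(u_ext - f(x_c))))         # the leading error term cancels
```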

Keywords: midpoint rule, non-standard finite difference schemes, Richardson extrapolation, singularly perturbed problems, trapezoidal rule, uniform convergence

Procedia PDF Downloads 125
500 Predictive Output Feedback Linearization for Safe Control of Collaborative Robots

Authors: Aliasghar Arab

Abstract:

Autonomous robots interacting with humans, as safety-critical nonlinear control systems, are complex closed-loop cyber-physical dynamical machines. Keeping these intelligent yet complicated systems safe and smooth during their operation is challenging. The aim of the safe predictive output feedback linearization control synthesis is to design a novel controller for smooth trajectory following while unsafe situations are avoided. The controller design should obtain a linearized output for smoothness and guarantee invariance of a safety subset. Inspired by finite-horizon nonlinear model predictive control, the problem is formulated as constrained nonlinear dynamic programming, and the safety constraints are defined as control barrier functions. Avoiding unsafe maneuvers and performing smooth motions increases the predictability of the robot's movement for humans when robots and people are working together. Our results demonstrate that the proposed output linearization method obeys the safety constraints and, compared to existing safety-guaranteed methods, is smoother and performs better.
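As a pointwise illustration of the "safety constraints as control barrier functions" idea, the sketch below filters a desired control through a small quadratic program for a single integrator; it is not the predictive output feedback linearization scheme itself, and the dynamics, barrier, and cvxpy solver choice are assumptions.

```python
import cvxpy as cp

def cbf_safety_filter(x, u_des, alpha=2.0, x_max=1.0):
    """Minimally modify u_des so a single integrator x_dot = u stays in
    the safe set {x <= x_max}, using h(x) = x_max - x as a barrier:
        h_dot + alpha * h >= 0   =>   -u + alpha * (x_max - x) >= 0.
    """
    u = cp.Variable()
    objective = cp.Minimize(cp.square(u - u_des))
    constraints = [-u + alpha * (x_max - x) >= 0]
    cp.Problem(objective, constraints).solve()
    return float(u.value)

# Example: the filter passes safe commands through and clips unsafe ones.
print(cbf_safety_filter(x=0.2, u_des=0.5))   # far from the boundary: ~0.5
print(cbf_safety_filter(x=0.95, u_des=2.0))  # near the boundary: reduced to ~0.1
```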

Keywords: robotics, collaborative robots, safety, autonomous robots

Procedia PDF Downloads 97
499 Hybrid Intelligent Optimization Methods for Optimal Design of Horizontal-Axis Wind Turbine Blades

Authors: E. Tandis, E. Assareh

Abstract:

The optimal shape of MW wind turbine blades is designed in a number of cases through evolutionary algorithms combined with mathematical modeling (Blade Element Momentum theory). Among optimization methods, evolutionary algorithms enjoy many advantages, particularly in stability. However, they usually need a large number of function evaluations. Since there are a large number of local extrema, the optimization method has to find the global extremum accurately. The present paper introduces a new population-based hybrid algorithm called the Genetic-Based Bees Algorithm (GBBA). This algorithm is meant to design the optimal shape of MW wind turbine blades. The current method employs crossover and neighborhood-searching operators taken from the Genetic Algorithm (GA) and the Bees Algorithm (BA), respectively, to provide a method with good performance in accuracy and convergence speed. Twenty-one different blade designs were considered based on the chord length, twist angle, and tip speed ratio using GA results. They were compared with the BA and GBBA optimum design results, targeting the power coefficient and solidity. The results suggest that the final shape obtained by the proposed hybrid algorithm performs better than that of either BA or GA. Furthermore, the accuracy and convergence speed increase when the GBBA is employed.

Keywords: blade design, optimization, genetic algorithm, bees algorithm, genetic-based bees algorithm, large wind turbine

Procedia PDF Downloads 316
498 Convergence of Media in the New Era

Authors: Mohamad Reza Asariha

Abstract:

The development and expansion of modern communication technologies at an extraordinary speed has caused crucial changes in all economic, social, cultural, and political areas of the world. The development of cable and related distribution technologies, in addition to meeting the growing production and distribution needs of global programming, made the economic justification for media expansion more appealing. The shift from an industrial economy to an information and service economy in developed countries brought about unprecedented developments in the norms of world trade and, as a result, caused the expansion of media organizations into international dimensions. The growth of economic investment in many Asian countries, together with the worldwide demand for media goods, created new markets, and the media, both in the domestic arena of individual countries and in the international field, are of great significance and have an effective and influential presence in the equation of gaining, maintaining, and increasing economic power and wealth in the world. Moreover, technological advances and technological integration are critical components of structural change in the media. This structural change took place under the influence of digitalization, that is, the process that broke down the boundaries between electronic media services. Until now, the regulation of mass media was entirely dependent on the particular modes of data transmission that were generally used. Digitization made it possible for any content to be easily transmitted through different electronic transmission modes, and this media convergence has had clear impacts on media policies and the way mass media are regulated.

Keywords: media, digital era, digital ages, media convergence

Procedia PDF Downloads 74
497 The Impact of Vertical Velocity Parameter Conditions and Its Relationship with Weather Parameters in the Hail Event

Authors: Nadine Ayasha

Abstract:

Hail fell in Sukabumi (August 23, 2020), Sekadau (August 22, 2020), and Bogor (September 23, 2020); in each case this extreme weather phenomenon occurred in the dry season. This study uses ERA5 reanalysis data to examine the impact of vertical velocity on hail occurrence in the dry season, as well as its relation to other weather parameters such as relative humidity, streamlines, and wind velocity. Moreover, HCAI satellite product data are used as supporting data for the convective cloud development analysis. Based on graphs, contours, and Hovmöller vertical cross-sections from the ERA5 data, the vertical velocity values in the 925-300 mb layer in Sukabumi, Sekadau, and Bogor before the hail events ranged between -1.2 and -0.2, -1.5 and -0.2, and -1 and 0 Pa/s, respectively. A negative value indicates upward motion of the air mass, which triggers the convective cloud growth that produces hail. This is evidenced by the presence of cumulonimbus cloud in the HCAI product when the hail falls. Therefore, the vertical velocity has a significant effect on the hail event. In addition, the relative humidity in the 850-700 mb layer is quite moist, ranging from 80-90%. Meanwhile, the streamlines and wind velocities in the three regions show convergence, with wind speeds slowing to 2-4 knots. These results show that the upward vertical motion is sufficient to maintain the moist atmosphere and the convergence needed for the growth of the convective clouds that produce hail in the dry season.

Keywords: hail, extreme weather, vertical velocity, relative humidity, streamline

Procedia PDF Downloads 159
496 Relation between Pavement Roughness and Distress Parameters for Highways

Authors: Suryapeta Harini

Abstract:

Road surface roughness is one of the essential aspects of a road's functional condition, indicating riding comfort in both the transverse and longitudinal directions. The Government of India has made maintaining good surface evenness a prerequisite for all highway projects. Pavement distress data were collected with a Network Survey Vehicle (NSV) on a national highway; the survey determines the smoothness and frictional qualities of the pavement surface, which are related to driving safety and ease. Based on the data obtained in the field, a regression equation relating the IRI value to the visual distresses was created. The suggested system can use wireless acceleration sensors and GPS to gather vehicle status and location data, as well as calculate the International Roughness Index (IRI). According to the current study, potholes, raveling, rut depth, cracked area, and repair work are all related to pavement roughness. The study was carried out at one location. Data collected using a bump integrator were used for validation; the bump integrator (BI) value obtained from deflection measurements of the network survey vehicle was correlated with the distress parameters to establish an equation.
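A minimal sketch of the regression step, using ordinary least squares on synthetic placeholder data rather than the NSV measurements, is given below; the choice and order of distress parameters are assumptions.

```python
import numpy as np

def fit_iri_regression(distress, iri):
    """Ordinary least-squares fit of IRI on distress parameters.

    distress: (n_sections, n_parameters) matrix, e.g. columns for pothole area,
    raveling, rut depth, cracked area, patch/repair area (placeholder order).
    Returns (intercept, coefficients, R^2).
    """
    X = np.column_stack([np.ones(len(iri)), distress])
    beta, *_ = np.linalg.lstsq(X, iri, rcond=None)
    pred = X @ beta
    ss_res = np.sum((iri - pred)**2)
    ss_tot = np.sum((iri - iri.mean())**2)
    return beta[0], beta[1:], 1.0 - ss_res / ss_tot

# Synthetic example: 100 road sections with 5 distress parameters.
rng = np.random.default_rng(0)
distress = rng.random((100, 5))
iri = 1.5 + distress @ np.array([0.8, 0.5, 1.2, 0.9, 0.3]) + 0.05 * rng.normal(size=100)
intercept, coefs, r2 = fit_iri_regression(distress, iri)
print(round(intercept, 2), np.round(coefs, 2), round(r2, 3))
```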

Keywords: roughness index, network survey vehicle, regression, correlation

Procedia PDF Downloads 176
495 Performance Analysis and Multi-Objective Optimization of a Kalina Cycle for Low-Temperature Applications

Authors: Sadegh Sadeghi, Negar Shabani

Abstract:

From a thermal point of view, zeotropic mixtures are likely to be more efficient than azeotropic fluids in low-temperature thermodynamic cycles due to their suitable boiling characteristics. In this study, the performance of a low-temperature Kalina cycle with an R717/water working fluid, used in several existing power plants, is mathematically investigated. To analyze the behavior of the cycle, mass conservation, energy conservation, and exergy balance equations are presented. Owing to the similarity in molar mass of R717 (17.03 g/mol) and water (18.01 g/mol), there is no need to alter the size of Kalina system components such as the turbine and pump. To optimize the cycle energy and exergy efficiencies simultaneously, a constrained multi-objective optimization is carried out applying an Artificial Bee Colony algorithm. The main motivation behind using this algorithm lies in its robustness, reliability, remarkable precision, and high-speed convergence rate in dealing with complicated constrained multi-objective problems. Convergence rates of the algorithm for calculating the optimal energy and exergy efficiencies are presented. Subsequently, due to the importance of the exergy concept in Kalina cycles, the exergy destruction occurring in the components is computed. Finally, the impacts of pressure, temperature, mass fraction, and mass flow rate on the energy and exergy efficiencies are elaborately studied.

Keywords: artificial bee colony algorithm, binary zeotropic mixture, constrained multi-objective optimization, energy efficiency, exergy efficiency, Kalina cycle

Procedia PDF Downloads 153
494 Confluence of Relations: An Auto-Ethnographic Account of Field Recording in the Anthropocene Age

Authors: Freya Zinovieff

Abstract:

In the age of the Anthropocene, every ecosystem, no matter how remote, is influenced by the relations between humans and technology. These influences are evidenced by current extinction rates, changes in species diversity, and species adaptation to pollution. Field recording is a tool through which we are able to document the extent to which the life forms associated with a place are entangled with human-technology relationships. This paper documents the convergence of interactions between technologies, species, and landscape via an auto-ethnographic account of a field recording taken from a cell phone tower in Bali, Indonesia. In the recording, we hear a confluence of relations where critter and technology meet. The electrical hum of the tower merges with frogs and the amaranthine throb of crickets in such a way that it is hard to tell where technology begins and the voice of creatures ends. The outcome of this venture is a framework for evaluating the sensorial relations within field recording. The framework calls for the soundscape to be understood as a multilayered ontology through which there is a convergence of multispecies relationships, or entanglements, across time and geographic location. These entanglements are not necessarily obvious: sometimes quiet, sometimes elusive, sometimes only audible through the mediated conduit of digital technology. The paper argues that to be aware of these entanglements is to open ourselves to a type of beauty that is firmly rooted in the present paradigm of extinction and loss. By virtue of this understanding, we are offered an opportunity to embrace the grave reality of the current sixth mass extinction and move forward with what activist Joanna Macy calls compassionate action.

Keywords: anthropocene, human-technology relationships, multispecies ethnography, field recording

Procedia PDF Downloads 150
493 Hypercomplex Dynamics and Turbulent Flows in Sobolev and Besov Functional Spaces

Authors: Romulo Damasclin Chaves dos Santos, Jorge Henrique de Oliveira Sales

Abstract:

This paper presents a rigorous study of advanced functional spaces, with a focus on Sobolev and Besov spaces, to investigate key aspects of fluid dynamics, including the regularity of solutions to the Navier-Stokes equations, hypercomplex bifurcations, and turbulence. We offer a comprehensive analysis of Sobolev embedding theorems in fractional spaces and apply bifurcation theory within quaternionic dynamical systems to better understand the complex behaviors in fluid systems. Additionally, the research delves into energy dissipation mechanisms in turbulent flows through the framework of Besov spaces. Key mathematical tools, such as interpolation theory, Littlewood-Paley decomposition, and energy cascade models, are integrated to develop a robust theoretical approach to these problems. By addressing challenges related to the existence and smoothness of solutions, this work contributes to the ongoing exploration of the open Navier-Stokes problem, providing new insights into the intricate relationship between fluid dynamics and functional spaces.
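For reference, two textbook statements that underlie this kind of analysis, the fractional Sobolev embedding and the Littlewood-Paley characterization of the Besov norm, are recalled below; they are standard facts, not results of the paper.

```latex
% Fractional Sobolev embedding (0 < s < 1, 1 <= p < infinity, sp < n):
\[
  W^{s,p}(\mathbb{R}^n) \hookrightarrow L^{q}(\mathbb{R}^n),
  \qquad q = \frac{np}{n - sp}.
\]

% Inhomogeneous Besov norm via Littlewood-Paley blocks \Delta_j f
% (\Delta_{-1} denotes the low-frequency block):
\[
  \|f\|_{B^{s}_{p,q}}
  = \Bigl( \sum_{j \ge -1} 2^{jsq}\, \|\Delta_j f\|_{L^p}^{q} \Bigr)^{1/q},
  \qquad 1 \le p,\, q < \infty .
\]
```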

Keywords: Navier-Stokes equations, hypercomplex bifurcations, turbulence, Sobolev and Besov spaces

Procedia PDF Downloads 14