Search results for: exponential matrix method
17008 Zeolite 4A-confined Ni-Co Nanocluster: An Efficient and Durable Electrocatalyst for Alkaline Methanol Oxidation Reaction
Authors: Sarmistha Baruah, Akshai Kumar, Nageswara Rao Peela
Abstract:
The global energy crisis caused by dependence on fossil fuels with limited reserves, together with environmental pollution, is a key concern for the research community. The implementation of alcohol-based fuel cells, such as direct methanol fuel cells, is anticipated as a reliable future energy technology owing to their high energy density, environmental friendliness, and ease of storage and transportation. To drive the anodic methanol oxidation reaction (MOR) in direct methanol fuel cells (DMFCs), an active and long-lasting catalyst is necessary for efficient energy conversion from methanol. Recently, transition metal-zeolite-based materials have been considered versatile catalysts for a variety of industrial and lab-scale processes. Large specific surface area, well-organized micropores, and adjustable acidity/basicity are characteristics of zeolites that make them excellent supports for immobilizing small, highly dispersed metal species. Significant advances in the production and characterization of well-defined metal clusters encapsulated within zeolite matrices have substantially expanded the library of available materials and, consequently, their catalytic efficacy. In this context, we developed bimetallic Ni-Co catalysts encapsulated within LTA (also known as 4A) zeolite via in-situ encapsulation of the metal species during hydrothermal treatment followed by chemical reduction. The prepared catalyst was characterized using advanced techniques such as X-ray diffraction (XRD), field emission transmission electron microscopy (FETEM), field emission scanning electron microscopy (FESEM), energy dispersive X-ray spectroscopy (EDX), and X-ray photoelectron spectroscopy (XPS). The electrocatalytic activity of the catalyst for the MOR was evaluated in an alkaline medium at room temperature using cyclic voltammetry (CV) and chronoamperometry (CA). The resulting catalyst exhibited a catalytic activity of 12.1 mA cm⁻² at 1.12 V vs. Ag/AgCl and retained remarkable stability (~77%) even after a 1000-cycle CV test for the electro-oxidation of methanol in alkaline media, without any significant microstructural changes. The high surface area, good integration of the Ni-Co species in the zeolite, and the ample surface hydroxyl groups provide highly dispersed active sites and fast analyte diffusion, which account for the notable MOR kinetics. Thus, this study opens up new possibilities for developing noble metal-free zeolite-based electrocatalysts for DMFC applications, owing to the simple synthesis steps, potential for large-scale fabrication, improved stability, and efficient activity.
Keywords: alkaline media, bimetallic, encapsulation, methanol oxidation reaction, LTA zeolite
Procedia PDF Downloads 65
17007 Image Analysis for Obturator Foramen Based on Marker-controlled Watershed Segmentation and Zernike Moments
Authors: Seda Sahin, Emin Akata
Abstract:
The obturator foramen is a specific structure in pelvic bone images, and its recognition is a new concept in medical image processing. Moreover, segmentation of bone structures such as the obturator foramen plays an essential role in clinical research in orthopedics. In this paper, we present a novel method to analyze the similarity between substructures of the imaged region and a hand-drawn template on hip radiographs, in order to detect the obturator foramen accurately through the integrated use of marker-controlled watershed segmentation and the Zernike moment feature descriptor. Marker-controlled watershed segmentation is applied to separate the obturator foramen from the background effectively. The Zernike moment feature descriptor is used to match the binary template image against the segmented binary image of the obturator foramen for the final extraction. The proposed method is tested on 100 randomly selected hip radiographs. The experimental results show that our method is able to segment obturator foramina with 96% accuracy.
Keywords: medical image analysis, segmentation of bone structures on hip radiographs, marker-controlled watershed segmentation, Zernike moment feature descriptor
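The detection pipeline described above can be prototyped with standard open-source tools. The sketch below is a minimal illustration, not the authors' implementation: it assumes scikit-image for the marker-controlled watershed, mahotas for the Zernike moments, and hand-drawn template and radiograph arrays supplied by the caller; the threshold values used to place the markers are placeholders.

```python
# Minimal sketch: marker-controlled watershed + Zernike-moment template matching.
# Assumes scikit-image, mahotas, numpy; `radiograph` and `template` are 2-D grayscale arrays.
import numpy as np
import mahotas
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

def segment_candidates(radiograph, low=0.2, high=0.8):
    """Marker-controlled watershed: markers from conservative thresholds (placeholder values)."""
    elevation = sobel(radiograph)                 # gradient magnitude as elevation map
    markers = np.zeros_like(radiograph, dtype=int)
    markers[radiograph < low] = 1                 # sure background
    markers[radiograph > high] = 2                # sure foreground (candidate foramen)
    labels = watershed(elevation, markers)
    return ndi.label(labels == 2)[0]              # connected candidate regions

def zernike_distance(region_mask, template_mask, radius=64, degree=8):
    """Compare a segmented binary region with the hand-drawn binary template via Zernike moments."""
    z_region = mahotas.features.zernike_moments(region_mask.astype(np.uint8), radius, degree)
    z_template = mahotas.features.zernike_moments(template_mask.astype(np.uint8), radius, degree)
    return np.linalg.norm(z_region - z_template)  # smaller distance = better match
```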
Procedia PDF Downloads 434
17006 Whole Body Cooling Hypothermia Treatment Modelling Using a Finite Element Thermoregulation Model
Authors: Ana Beatriz C. G. Silva, Luiz Carlos Wrobel, Fernando Luiz B. Ribeiro
Abstract:
This paper presents a thermoregulation model using the finite element method to perform numerical analyses of brain cooling procedures, as a contribution to the investigation of the use of therapeutic hypothermia after ischemia in adults. The use of computational methods can help clinicians observe body temperature under different cooling methods without the need for invasive techniques, and can thus be a valuable tool to assist clinical trials by simulating the different cooling options that can be used for treatment. In this work, we developed an FEM package applied to the solution of the continuum Pennes bioheat equation. Blood temperature changes were considered using a blood pool approach and a lumped analysis for the intravascular catheter method of blood cooling. Some analyses are performed using a three-dimensional mesh based on a complex geometry obtained from computed tomography medical images, considering a cooling blanket and an intravascular catheter. A comparison is made between the results obtained and the effects of each case on brain temperature reduction within a required time, maintenance of body temperature at moderate hypothermia levels, and gradual rewarming.
Keywords: brain cooling, finite element method, hypothermia treatment, thermoregulation
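For reference, the continuum Pennes bioheat equation solved by the finite element package has the standard form below, with the usual notation assumed here (ρ, c: tissue density and specific heat; k: thermal conductivity; ω_b, ρ_b, c_b: blood perfusion rate, density and specific heat; T_a: arterial blood temperature; q_m: metabolic heat generation):

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \, \nabla T \right)
  + \omega_b \, \rho_b \, c_b \left( T_a - T \right)
  + q_m
```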
Procedia PDF Downloads 311
17005 K-Means Based Matching Algorithm for Multi-Resolution Feature Descriptors
Authors: Shao-Tzu Huang, Chen-Chien Hsu, Wei-Yen Wang
Abstract:
Matching high-dimensional features between images is computationally expensive for exhaustive search approaches in computer vision. Although the dimension of the feature can be reduced by simplifying the prior knowledge of the homography, matching accuracy may degrade as a tradeoff. In this paper, we present a feature matching method based on the k-means algorithm that reduces the matching cost and matches the features between images without resorting to a simplified geometric assumption. Experimental results show that the proposed method outperforms previous linear exhaustive search approaches in terms of the inlier ratio of matched pairs.
Keywords: feature matching, k-means clustering, SIFT, RANSAC
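A minimal sketch of the general idea, not the authors' exact algorithm, is given below: the descriptors of one image are grouped by k-means, and each query descriptor is compared exhaustively only against the members of its nearest cluster, which reduces the cost of matching high-dimensional features. The cluster count and ratio-test threshold are placeholder values.

```python
# Sketch: cluster-accelerated descriptor matching (assumes scikit-learn and numpy).
import numpy as np
from sklearn.cluster import KMeans

def kmeans_match(desc_a, desc_b, n_clusters=16, ratio=0.8):
    """Match rows of desc_a (N_a x D) to rows of desc_b (N_b x D) via k-means bucketing."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(desc_b)
    buckets = {c: np.where(km.labels_ == c)[0] for c in range(n_clusters)}
    matches = []
    for i, d in enumerate(desc_a):
        cluster = km.predict(d[None, :])[0]          # nearest cluster centre for the query
        cand = buckets[cluster]
        if len(cand) < 2:
            continue
        dists = np.linalg.norm(desc_b[cand] - d, axis=1)
        best, second = np.argsort(dists)[:2]
        if dists[best] < ratio * dists[second]:      # Lowe-style ratio test within the bucket
            matches.append((i, cand[best]))
    return matches
```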
Procedia PDF Downloads 357
17004 Short-Term Load Forecasting Based on Variational Mode Decomposition and Least Square Support Vector Machine
Authors: Jiangyong Liu, Xiangxiang Xu, Bote Luo, Xiaoxue Luo, Jiang Zhu, Lingzhi Yi
Abstract:
To address the non-linearity and high randomness of the original power load sequence, which degrade power load forecasting accuracy, a short-term load forecasting method is proposed. The method is based on a Least Square Support Vector Machine optimized by an Improved Sparrow Search Algorithm, combined with the Variational Mode Decomposition proposed in this paper. The variational mode decomposition technique decomposes the raw power load data into a series of Intrinsic Mode Function components, which reduces the complexity and instability of the raw data while overcoming mode mixing; the proposed improved sparrow search algorithm solves the problem of the difficult selection of learning parameters in the Least Square Support Vector Machine. Finally, comparison experiments show that the method can effectively improve prediction accuracy.
Keywords: load forecasting, variational mode decomposition, improved sparrow search algorithm, least square support vector machine
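The decompose-forecast-aggregate structure of such a method can be sketched as follows. This is only an illustration under assumptions: the intrinsic mode function array `imfs` is taken as already produced by a VMD routine, scikit-learn's kernel SVR is used as a stand-in for the least square support vector machine, and the lag length and hyperparameters, which the paper tunes with an improved sparrow search algorithm, are fixed placeholders.

```python
# Sketch: forecast each VMD mode separately, then sum the per-mode forecasts.
# Assumes numpy and scikit-learn; `imfs` has shape (K, T) from a VMD decomposition.
import numpy as np
from sklearn.svm import SVR

def lagged(series, n_lags=24):
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

def forecast_load(imfs, n_lags=24, horizon=1):
    total = 0.0
    for mode in imfs:
        X, y = lagged(mode, n_lags)
        model = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X, y)   # placeholder hyperparameters
        window = mode[-n_lags:].copy()
        for _ in range(horizon):                                      # recursive one-step forecasts
            nxt = model.predict(window[None, :])[0]
            window = np.roll(window, -1)
            window[-1] = nxt
        total += nxt
    return total  # aggregated load forecast at the requested horizon
```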
Procedia PDF Downloads 108
17003 Analysis of the Influence of Frequency Variation on the Characterization of Nanoparticles in the Pretreatment of Oil Palm Stem (Elaeis guineensis Jacq.) for Bioethanol Using a Sonication Method with Alkaline Peroxide Activators to Improve Cellulose Content
Authors: Luristya Nur Mahfut, Nada Mawarda Rilek, Ameiga Cautsarina Putri, Mujaroh Khotimah
Abstract:
The use of bioethanol from lignocellulosic material has begun to be developed. In Indonesia, the most abundant lignocellulosic material is oil palm stem, which contains 32.22% cellulose; Indonesia produces approximately 300,375,000 tons of palm stem each year. To produce bioethanol from lignocellulosic material, the first process step is pretreatment. Until now, however, lignocellulosic pretreatment methods have been ineffective. This is related to particle size and to sub-optimal pretreatment methods, which lead to insufficient removal of lignin; consequently, the increase in cellulose content is not significant, resulting in a low bioethanol yield. To address this problem, this research applied an ultrasonication pretreatment in order to produce a higher pulp yield with nano-sized particles and thus obtain a higher ethanol yield from palm stem. The study used a randomized block design composed of one factor, the ultrasonic wave frequency, with three levels (30 kHz, 40 kHz, and 50 kHz), while the NaOH concentration was kept constant. The analysis investigated the influence of the frequency on the increase in cellulose content and on the particle-size reduction to the nanometer scale during pretreatment, using Particle Size Analyzer (PSA) measurements and the Chesson method. The results, data, and best treatment were analyzed using ANOVA and the least significant difference (BNT) test at a 5% confidence level. The best treatment was obtained with combination X3 (sonication frequency of 50 kHz), giving 19.6% lignin, 59.49% cellulose, and 11.8% hemicellulose, with a particle size of 385.2 nm (18.8%).
Keywords: bioethanol, pretreatment, palm stem, cellulose
Procedia PDF Downloads 327
17002 Matlab Method for Exclusive-or Nodes in Fuzzy GERT Networks
Authors: Roland Lachmayer, Mahtab Afsari
Abstract:
Research is a cornerstone of the advancement of human communities, and it is therefore one of the indexes used to evaluate the advancement of countries. Research projects are usually costly and time-consuming and do not yield results in the short term. Project scheduling is an integral part of project management. The present article offers a new method, implemented with C# and Matlab, to solve fuzzy GERT networks with Exclusive-OR nodes in order to schedule the network. In this article, we concentrate on the flowcharts used in Matlab to show how Matlab is applied to schedule Exclusive-OR nodes.
Keywords: research projects, fuzzy GERT, fuzzy CPM, CPM, α-cuts, scheduling
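As a rough illustration of the kind of computation involved, and not the authors' C#/Matlab code, the sketch below combines the parallel branches entering an Exclusive-OR node: the branch probabilities add up, and the fuzzy branch durations, represented here as triangular fuzzy numbers, are combined probability-weighted at each α-cut with interval arithmetic. Both the triangular representation and the weighting scheme are assumptions made for the example.

```python
# Sketch: equivalent duration of an Exclusive-OR GERT node with fuzzy (triangular) branch times.
import numpy as np

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def xor_node_equivalent(probs, durations, alphas=np.linspace(0.0, 1.0, 11)):
    """probs: branch probabilities; durations: triangular fuzzy branch times (one per branch)."""
    p_total = sum(probs)
    cuts = []
    for alpha in alphas:
        lo = sum(p * alpha_cut(d, alpha)[0] for p, d in zip(probs, durations)) / p_total
        hi = sum(p * alpha_cut(d, alpha)[1] for p, d in zip(probs, durations)) / p_total
        cuts.append((alpha, lo, hi))
    return p_total, cuts   # equivalent probability and fuzzy duration given by its alpha-cuts

# Example: two branches with p = 0.6 / 0.4 and durations (4, 6, 9) and (2, 3, 5) time units.
p_eq, profile = xor_node_equivalent([0.6, 0.4], [(4, 6, 9), (2, 3, 5)])
```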
Procedia PDF Downloads 398
17001 Comparative Study of the Distribution of Seismic Loads in Buildings with Plan Asymmetries
Authors: Ahmed Hamza Yache
Abstract:
The main purpose of this study is to estimate the distribution of shear forces in building structures with plan asymmetries subjected to seismic forces, which can in this case cause simultaneous translational and torsional deformations. To this end, the distribution of shear forces is obtained from seismic forces calculated with the equivalent static method of the Algerian earthquake code RPA 99 (2003 version) and with spectral modal analysis, for an irregular building plan without kinks. A comparison of the results obtained by these two methods is used to highlight the difference in the distributions of shear forces in such structures.
Keywords: structure, irregular, code, seismic, method, force, period
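For context, the equivalent static method of RPA 99 (2003) is commonly quoted as giving a design base shear V = (A·D·Q/R)·W, distributed over the storeys in proportion to Wᵢhᵢ with a top-force correction Ft = 0.07·T·V when the period exceeds 0.7 s. The sketch below illustrates that distribution; the coefficient values are placeholders, and the formulas should be checked against the code text before any real use.

```python
# Sketch: equivalent static base shear and its vertical distribution (RPA 99-style).
def base_shear(A, D, Q, R, W):
    """A: zone acceleration coefficient, D: dynamic amplification factor, Q: quality factor,
    R: behaviour factor, W: total seismic weight."""
    return A * D * Q * W / R

def storey_forces(V, weights, heights, T):
    """Distribute V over storeys in proportion to W_i * h_i, with a top force for T > 0.7 s."""
    Ft = 0.07 * T * V if T > 0.7 else 0.0
    denom = sum(w * h for w, h in zip(weights, heights))
    forces = [(V - Ft) * w * h / denom for w, h in zip(weights, heights)]
    forces[-1] += Ft                      # the correction is applied at the top storey
    return forces

# Example (placeholder values): 4 storeys, A=0.25, D=2.0, Q=1.2, R=5, W=12000 kN.
V = base_shear(0.25, 2.0, 1.2, 5.0, 12000.0)
F = storey_forces(V, weights=[3000] * 4, heights=[3, 6, 9, 12], T=0.5)
```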
Procedia PDF Downloads 585
17000 Solving Fuzzy Multi-Objective Linear Programming Problems with Fuzzy Decision Variables
Authors: Mahnaz Hosseinzadeh, Aliyeh Kazemi
Abstract:
In this paper, a method is proposed for solving Fuzzy Multi-Objective Linear Programming Problems (FMOLPP) with fuzzy right-hand sides and fuzzy decision variables. To illustrate the proposed method, it is applied to the problem of selecting suppliers for an automotive parts producer in Iran, in order to find the optimal number of orders allocated to each supplier while considering the conflicting objectives. Finally, the obtained results are discussed.
Keywords: fuzzy multi-objective linear programming problems, triangular fuzzy numbers, fuzzy ranking, supplier selection problem
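As a small illustration of the machinery typically involved, and not the authors' method itself, the sketch below defuzzifies triangular fuzzy data with a graded-mean score and solves the resulting crisp surrogate LP with SciPy; the ranking function and the example data are assumptions.

```python
# Sketch: defuzzify triangular fuzzy data, then solve the resulting crisp LP.
from scipy.optimize import linprog

def graded_mean(tri):
    """Graded-mean representative of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return (a + 4 * m + b) / 6.0

# max 3*x1 + 5*x2 subject to fuzzy capacities b1 ~ (8, 10, 12) and b2 ~ (14, 15, 18)
c = [-3.0, -5.0]                                   # linprog minimizes, so negate the objective
A_ub = [[1.0, 0.0], [1.0, 2.0]]
b_ub = [graded_mean((8, 10, 12)), graded_mean((14, 15, 18))]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)                             # crisp surrogate solution
```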
Procedia PDF Downloads 383
16999 An Engineering Application of the H-P Version of the Finite Element Method on Vibration Behavior of Rotors
Authors: Hadjoui Abdelhamid, Saimi Ahmed
Abstract:
The hybrid h-p finite element method for the dynamic behavior of nonlinear rotors is described in this paper. The standard h-version discretization of the problem is retained but modified to allow the use of polynomially enriched beam elements. A hierarchical enrichment of an element thus does not affect the nodal displacement and rotation, but influences the values of the nodal bending moment and shear force. The deterministic rotational and translational movements of the support, which are coupled to the excitations due to unbalance, are also taken into account. We also study the geometric dissymmetry of the shaft and the disc; the equations of motion of the rotor therefore contain parametric coefficients that vary over time and can lead to lateral dynamic instability. The effects of the combined support movements on the bearings are analyzed and discussed through Campbell diagrams and spectral analyses. A program was written in Matlab. After validation of the program, several examples were studied, and the influence of physical and geometric parameters on the natural frequencies of the shaft was determined from these examples. Among these parameters, we include the variation of the diameter and thickness of the rotor and the position of the disc.
Keywords: Campbell diagram, critical speeds, nonlinear rotor, h-p version of FEM
Procedia PDF Downloads 233
16998 A Nonlinear Parabolic Partial Differential Equation Model for Image Enhancement
Authors: Tudor Barbu
Abstract:
We present a robust nonlinear parabolic partial differential equation (PDE)-based denoising scheme in this article. Our approach is based on a second-order anisotropic diffusion model, which is described first. Then, a consistent and explicit numerical approximation algorithm is constructed for this continuous model using the finite-difference method. Finally, our restoration experiments and method comparison, which prove the effectiveness of the proposed technique, are discussed.
Keywords: anisotropic diffusion, finite differences, image denoising and restoration, nonlinear PDE model, numerical approximation schemes
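A minimal explicit finite-difference scheme of the Perona-Malik type illustrates the class of second-order anisotropic diffusion models discussed above; the article's exact diffusivity and parameters are not reproduced here, so the conductance function, step size, and iteration count below are placeholders.

```python
# Sketch: explicit finite-difference anisotropic diffusion denoising (Perona-Malik-type).
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.2):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences to the four neighbours (np.roll gives periodic boundaries in this sketch)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping conductance g(|grad u|) = exp(-(|grad u| / kappa)^2)
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)   # explicit Euler update
    return u
```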
Procedia PDF Downloads 313
16997 SMAA-GAIA: A Complementary Tool of the SMAA-PROMETHEE Method
Authors: Y. de Smet, J. Hubinont
Abstract:
PROMETHEE and GAIA are well-known multiple criteria decision aid methods. Given an evaluation table and preference parameters, they make it possible to rank the alternatives, to visualize the problem, to perform sensitivity and robustness analyses, etc. Unfortunately, it is often hard for the Decision Maker (DM) to estimate the precise values of these parameters. An alternative option is therefore to give ranges of potential values in order to apply Stochastic Multicriteria Acceptability Analysis (SMAA). This has recently been studied in the context of the SMAA-PROMETHEE method. The aim of this contribution is to propose an SMAA extension of GAIA. We show how this tool can be useful and provide information complementary to SMAA-PROMETHEE. This is illustrated on a pedagogical example.
Keywords: multiple criteria decision making, PROMETHEE, GAIA, SMAA
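The core of an SMAA-style extension is Monte Carlo sampling over the imprecise preference parameters. The sketch below is an illustration rather than the authors' SMAA-GAIA procedure: it samples weights from a Dirichlet distribution, computes PROMETHEE II net flows with a simple linear preference function, and accumulates rank acceptability indices; an SMAA-GAIA variant would additionally project the sampled uni-criterion flows onto the GAIA plane.

```python
# Sketch: rank acceptability indices for PROMETHEE II under weight uncertainty.
import numpy as np

def promethee_net_flows(scores, weights, p=1.0):
    """scores: (n_alternatives, n_criteria), all criteria to maximize; linear preference, threshold p."""
    n = scores.shape[0]
    diff = scores[:, None, :] - scores[None, :, :]          # pairwise differences d(a, b)
    pref = np.clip(diff / p, 0.0, 1.0)                      # linear preference function
    pi = (pref * weights).sum(axis=2)                       # aggregated preference indices
    return (pi.sum(axis=1) - pi.sum(axis=0)) / (n - 1)      # net outranking flows

def rank_acceptability(scores, n_samples=5000, seed=0):
    rng = np.random.default_rng(seed)
    n = scores.shape[0]
    acc = np.zeros((n, n))                                  # acc[a, r] = P(alternative a gets rank r)
    for _ in range(n_samples):
        w = rng.dirichlet(np.ones(scores.shape[1]))         # uniform random weights on the simplex
        order = np.argsort(-promethee_net_flows(scores, w))
        for rank, alt in enumerate(order):
            acc[alt, rank] += 1
    return acc / n_samples
```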
Procedia PDF Downloads 429
16996 A Concept in Addressing the Singularity of the Emerging Universe
Authors: Mahmoud Reza Hosseini
Abstract:
The universe is in a continuous expansion process, resulting in the reduction of its density and temperature. Also, by extrapolating back from its current state, the universe at its early times has been studied in what is known as the big bang theory. According to this theory, moments after creation the universe was an extremely hot and dense environment. However, its rapid expansion led to a reduction in its temperature and density, as evidenced by the cosmic microwave background and the large-scale structure of the universe. Extrapolating back further from this early state, however, reaches a singularity which cannot be explained by modern physics, and the big bang theory is no longer valid there. In addition, one would expect a non-uniform energy distribution across the universe from such a sudden expansion. However, highly accurate measurements reveal an equal temperature mapping across the universe, which contradicts the big bang principles. To resolve this issue, it is believed that cosmic inflation occurred at the very early stages of the birth of the universe. According to the cosmic inflation theory, the elements which formed the universe underwent a phase of exponential growth due to the existence of a large cosmological constant. The inflation phase allows a uniform distribution of energy, so that an equal maximum temperature could be achieved across the early universe. Also, the evidence of quantum fluctuations during this stage provides a means of studying the types of imperfections the universe would begin with. Although well-established theories such as cosmic inflation and the big bang together provide a comprehensive picture of the early universe and how it evolved into its current state, they are unable to address the singularity paradox at the time of the universe's creation. Therefore, a practical model capable of describing how the universe was initiated is needed. This research series aims at addressing the singularity issue by introducing an energy conversion mechanism. This is accomplished by establishing a state of energy called the "neutral state", with an energy level referred to as the "base energy", capable of converting into other states. Although it follows the same principles, the unique quantum state of the base energy allows it to be distinguished from other states and to have a uniform distribution at the ground level. Although the concept of base energy can be utilized to address the singularity issue, to establish a complete picture the origin of the base energy should also be identified. This matter is the subject of the first study in the series, "A Conceptual Study for Investigating the Creation of Energy and Understanding the Properties of Nothing", which is discussed in detail. Therefore, the proposed concept in this research series provides a road map for enhancing our understanding of the universe's creation from nothing and its evolution, and discusses the possibility of base energy being one of the main building blocks of this universe.
Keywords: big bang, cosmic inflation, birth of universe, energy creation
Procedia PDF Downloads 89
16995 Comparison of the Allelopathic Activity of Some Edible and Wild Mushrooms in Japan
Authors: Asma Osivand, Hossein Mardani, Hiroshi Araya, Yoshiharu Fujii
Abstract:
Wild mushrooms have always been considered a valuable source of bioactive compounds, while edible mushrooms have been known for their importance as a food source. However, their interaction with plants through chemicals, which could lead to the discovery of new biochemicals, has not been well studied. A special bioassay method (the sandwich method) was applied to compare eight common edible mushrooms (Pleurotus eryngii, Pleurotus citrinopileatus, Pleurotus ostreatus, Lentinula edodes, Grifola frondosa, Flammulina velutipes, Hypsizygus tessellatus and Pholiota namako) with some wild species (Ganoderma applanatum, Amanita pantherina, Artomyces pyxidatus, Morchella conica, Tricholosporum porphyrophyllum, Trametes hirsuta) for their phytotoxicity against lettuce. Among all tested edible mushrooms, application of 5 mg of P. ostreatus showed the strongest allelopathic activity, inhibiting the growth of the radicle and hypocotyl of lettuce by 84% and 63%, respectively. Moreover, the same amount of T. porphyrophyllum exerted 77% and 67% growth inhibition on the radicle and hypocotyl of lettuce. In general, the biochemicals contained in the tested mushrooms could be the main cause of their inhibitory activity and could lead to the discovery of new allelochemicals.
Keywords: allelopathy, interaction, mushroom, phytotoxicity, Pleurotus sp., sandwich method
Procedia PDF Downloads 292
16994 Structural Health Monitoring and Structural Damage Identification Using Dynamic Response
Authors: Reza Behboodian
Abstract:
Monitoring the health of structures and diagnosing their damage in the early stages has always been a topic of concern. Nowadays, research on structural damage detection methods based on vibration analysis is very extensive. Moreover, these methods can be used for permanent and timely inspection of structures and can prevent further damage. Non-destructive methods are low-cost and economical means of determining structural damage. In this research, a non-destructive method is proposed for detecting and locating failure in structures based on the dynamic responses obtained from time history analysis. When a structure is damaged, its stiffness is reduced and, under the applied loads, the displacements in different parts of the structure increase. In the proposed method, the damage position is determined by calculating, at each instant, the difference in strain energy in each member between the damaged structure and the healthy structure. Defective members are indicated by their strain energy relative to the healthy state. The results indicate the good accuracy and performance of the proposed method for identifying failure in structures.
Keywords: failure, time history analysis, dynamic response, strain energy
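The strain-energy comparison described above can be illustrated on a simple discretized structure. The sketch below is a toy example under stated assumptions, not the paper's implementation: it evaluates the elemental strain energy U_e = ½ u_eᵀ K_e u_e from displacement snapshots of a healthy and a damaged model and flags the elements with the largest relative increase.

```python
# Sketch: locate damage from the elemental strain-energy difference between two models.
import numpy as np

def element_strain_energy(u_elem, k_elem):
    """U_e = 0.5 * u_e^T K_e u_e for one element's DOF vector and stiffness matrix."""
    return 0.5 * u_elem @ k_elem @ u_elem

def damage_indices(disp_healthy, disp_damaged, k_elems, connectivity):
    """disp_*: global displacement vectors (numpy arrays) at one instant;
    connectivity[e] lists the global DOFs of element e; k_elems[e] is its stiffness matrix.
    Returns the relative strain-energy change per element; peaks indicate likely damaged members."""
    idx = []
    for e, dofs in enumerate(connectivity):
        U_h = element_strain_energy(disp_healthy[dofs], k_elems[e])
        U_d = element_strain_energy(disp_damaged[dofs], k_elems[e])
        idx.append((U_d - U_h) / U_h if U_h > 0 else 0.0)
    return np.array(idx)
```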
Procedia PDF Downloads 133
16993 Packet Fragmentation Caused by Encryption and Using It as a Security Method
Authors: Said Rabah Azzam, Andrew Graham
Abstract:
This paper examines the fragmentation of packets caused by encryption applied at the network layer of the OSI model in Internet Protocol version 4 (IPv4) networks, as well as the possibility of using fragmentation and Access Control Lists (ACLs) as a method of restricting network access to certain hosts or areas of a network. Using default settings, fragmentation is expected to occur and each fragment to be reassembled at the other end. If this does not occur, a high number of ICMP messages should be generated back towards the source host, indicating that the packet is too large and needs to be made smaller. The same result is expected when the MTU is changed on certain links between devices. When using ACLs and packet fragments to restrict access to hosts or network segments, it is possible that ACLs cannot be set up in this way; if ACLs cannot be set up to allow only fragments, this is a limitation of the hardware's firmware that holds back this particular method. If the ACL on the restricted switch can be set up to allow only fragments, then a connection that forces packets to fragment should be allowed to pass through the ACL. This should then establish a network connection to the destination machine, allowing data to be sent to and from it. ICMP messages from the restricted-access switch and host should also be blocked from being sent back across the link, which is shown in an SSH session into the switch.
Keywords: fragmentation, encryption, security, switch
Procedia PDF Downloads 336
16992 Use of Fabric Phase Sorptive Extraction with Gas Chromatography-Mass Spectrometry for the Determination of Organochlorine Pesticides in Various Aqueous and Juice Samples
Authors: Ramandeep Kaur, Ashok Kumar Malik
Abstract:
Fabric Phase Sorptive Extraction (FPSE) combined with Gas Chromatography-Mass Spectrometry (GC-MS) has been developed for the determination of nineteen organochlorine pesticides (OCPs) in various aqueous samples. The method combines the features of sol-gel-derived microextraction sorbents with the rich surface chemistry of a cellulose fabric substrate, which can extract analytes directly from complex sample matrices and greatly simplifies the procedure by reducing the pretreatment time. Vital parameters such as the kind and volume of the extraction solvent and the extraction time were examined and optimized. Calibration curves were obtained in the concentration range 0.5-500 ng/mL. Under the optimum conditions, the limits of detection (LODs) were in the range 0.033 ng/mL to 0.136 ng/mL. The relative standard deviations (RSDs) for extraction of 10 ng/mL of OCPs were less than 10%. The developed method has been applied to the quantification of these compounds in aqueous and fruit juice samples. The results show the present method to be rapid and feasible for the determination of organochlorine pesticides in aqueous samples.
Keywords: fabric phase sorptive extraction, gas chromatography-mass spectrometry, organochlorine pesticides, sample pretreatment
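The figures of merit quoted above follow from an ordinary least-squares calibration. As a generic illustration, with invented placeholder data rather than the paper's measurements, the limits of detection and quantification can be estimated from the calibration slope and the residual standard deviation in the usual 3.3σ/slope and 10σ/slope manner:

```python
# Sketch: calibration curve, LOD and LOQ from standard solutions (placeholder data).
import numpy as np

conc = np.array([0.5, 5, 50, 100, 250, 500])                 # ng/mL, spanning the stated working range
area = np.array([1.1, 10.4, 101.0, 203.5, 509.0, 1015.0])    # hypothetical peak areas

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                                 # residual standard deviation of the fit

lod = 3.3 * sigma / slope                                     # limit of detection
loq = 10.0 * sigma / slope                                    # limit of quantification
r = np.corrcoef(conc, area)[0, 1]
print(f"slope={slope:.3f}, r={r:.4f}, LOD={lod:.3f} ng/mL, LOQ={loq:.3f} ng/mL")
```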
Procedia PDF Downloads 484
16991 Chemical Reaction Method for Growing Uniform Photomechanical Organic Crystals
Authors: Rabih O. Al-Kaysi, Lingyan Zhu, Muhannah K. Al-Muhannah, Christopher J. Bardeen
Abstract:
(E)-3-(Anthracen-9-yl)acrylic acid (9-AYAA) 1 exhibits a strong photomechanical response in bulk crystals but is challenging to grow in microcrystalline form. High quality microcrystals of this molecule could not be grown using techniques like sublimation, reprecipitation, and the floating drop method. If the tertbutyl ester of 9-AYAA is used as a starting material, however, high quality, size-uniform microwires could be grown via acid catalyzed hydrolysis. 9-AYAA microwires with uniform length and thickness were produced after a suspension of (E)-tert-butyl 3-(anthracen-9-yl)acrylate ester 2 microparticles was tumble-mixed in a mixture of phosphoric acid and sodium dodecyl sulfate at 35 °C. The dependence of the results on temperature, surfactant and precursor concentration, and mixing mode was investigated. This chemical reaction-growth method was extended to grow microplates of 9-anthraldehyde 3 using the corresponding acylal 4 as the starting material. Under 475 nm irradiation, the 9-AYAA microwires undergo a photoinduced coiling–uncoiling transition, while the 9-anthraldehyde microplates undergo a folding–unfolding transition.Keywords: photomechanical, surfactant, organic crystals, uniform
Procedia PDF Downloads 402
16990 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater
Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj
Abstract:
In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 x 4.6 mm). A meticulous multi-step gradient mobile phase, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, was employed to ensure optimal separation. Diode-array detection was performed within the UV-VIS spectrum, spanning a range of 200-800 nm, which provided precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability, linearity, the limits of detection and quantification, and measurement uncertainty. The standard calibration curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.
Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation
Procedia PDF Downloads 70
16989 Times2D: A Time-Frequency Method for Time Series Forecasting
Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan
Abstract:
Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduce significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive integrated moving average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long short-term memory (LSTMs) and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often suffer from the locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle with capturing relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that parallelly integrates 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021) under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation
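The central idea of turning a 1-D series into complementary 2-D frequency-domain and time-domain images can be sketched with off-the-shelf tools. The code below only illustrates that transformation; the window length, overlap, and derivative orders are assumed placeholders rather than the Times2D settings, and a downstream vision-style model would consume the resulting images.

```python
# Sketch: 2-D views of a 1-D series: spectrogram (frequency view) + derivative heatmap (time view).
import numpy as np
from scipy.signal import spectrogram

def to_2d(series, fs=1.0, nperseg=64, noverlap=32):
    # Frequency-domain image: short-time power spectrogram, capturing periodicity.
    freqs, times, sxx = spectrogram(series, fs=fs, nperseg=nperseg, noverlap=noverlap)
    spec_img = np.log1p(sxx)                          # compress dynamic range

    # Time-domain image: the series stacked with its first and second derivatives,
    # highlighting sharp fluctuations and turning points.
    d1 = np.gradient(series)
    d2 = np.gradient(d1)
    deriv_img = np.stack([series, d1, d2])            # shape (3, T)

    return spec_img, deriv_img

# Example: a load-like signal with daily and weekly periodicities plus noise.
t = np.arange(2048)
x = np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168) + 0.1 * np.random.randn(t.size)
spec, deriv = to_2d(x)
```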
Procedia PDF Downloads 42
16988 Forecasting of the Mobility of Rainfall-Induced Slow-Moving Landslides Using a Two-Block Model
Authors: Antonello Troncone, Luigi Pugliese, Andrea Parise, Enrico Conte
Abstract:
The present study deals with the landslides periodically reactivated by groundwater level fluctuations owing to rainfall. The main type of movement which generally characterizes these landslides consists in sliding with quite small-displacement rates. Another peculiar characteristic of these landslides is that soil deformations are essentially concentrated within a thin shear band located below the body of the landslide, which, consequently, undergoes an approximately rigid sliding. In this context, a simple method is proposed in the present study to forecast the movements of this type of landslides owing to rainfall. To this purpose, the landslide body is schematized by means of a two-block model. Some analytical solutions are derived to relate rainfall measurements with groundwater level oscillations and these latter, in turn, to landslide mobility. The proposed method is attractive for engineering applications since it requires few parameters as input data, many of which can be obtained from conventional geotechnical tests. To demonstrate the predictive capability of the proposed method, the application to a well-documented landslide periodically reactivated by rainfall is shown.Keywords: rainfall, water level fluctuations, landslide mobility, two-block model
Procedia PDF Downloads 121
16987 Static Modeling of the Delamination of a Composite Material Laminate in Mode II
Authors: Y. Madani, H. Achache, B. Boutabout
Abstract:
The purpose of this paper is to analyze numerically, by the three-dimensional finite element method using the ABAQUS code, the mechanical behavior of unidirectional and multidirectional delaminated stratified composites under mechanical loading in mode II. The study consists of determining the energy release rate G in mode II, as well as the distribution of equivalent von Mises stresses along the damaged zone, while varying several parameters such as the applied load and the delamination length. The results allow us to deduce that a high energy release rate favors delamination at the free edges of a stratified plate subjected to bending.
Keywords: delamination, energy release rate, finite element method, stratified composite
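For readers unfamiliar with how a mode II energy release rate is typically extracted from such finite element models, a virtual crack closure technique (VCCT) estimate is shown below as a generic illustration; the abstract does not specify the exact post-processing, so the nodal quantities here are assumed inputs taken from the results at the delamination front.

```python
# Sketch: mode II energy release rate by the virtual crack closure technique (VCCT).
def g_mode2(f_shear, du_sliding, delta_a, width):
    """G_II = F_x * delta_u_x / (2 * delta_a * b)
    f_shear    : shear force at the crack-tip node pair (N)
    du_sliding : relative sliding displacement of the node pair just behind the tip (m)
    delta_a    : length of the crack-tip element, i.e. the virtually closed increment (m)
    width      : specimen width b (m)"""
    return f_shear * du_sliding / (2.0 * delta_a * width)

# Example with placeholder values extracted from a finite element model.
G_II = g_mode2(f_shear=120.0, du_sliding=2.5e-6, delta_a=0.5e-3, width=25e-3)  # J/m^2
```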
Procedia PDF Downloads 176
16986 Grey Prediction of Atmospheric Pollutants in Shanghai Based on GM(1,1) Model Group
Authors: Diqin Qi, Jiaming Li, Siman Li
Abstract:
Based on the use of the three-point smoothing method for selectively processing original data columns, this paper establishes a group of grey GM(1,1) models to predict the concentration ranges of four major air pollutants in Shanghai from 2023 to 2024. The results indicate that PM₁₀, SO₂, and NO₂ maintain the national Grade I standards, while the concentration of PM₂.₅ has decreased but still remains within the national Grade II standards. Combining the forecast results, recommendations are provided for the Shanghai municipal government's efforts in air pollution prevention and control.Keywords: atmospheric pollutant prediction, Grey GM(1, 1), model group, three-point smoothing method
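For readers unfamiliar with the grey model, a bare-bones GM(1,1) fit and forecast is sketched below. It is a textbook implementation rather than the paper's model group, which additionally smooths selected raw data columns with the three-point method before fitting.

```python
# Sketch: classical GM(1,1) grey forecasting model.
import numpy as np

def gm11(x0, n_forecast=2):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                   # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack((-z1, np.ones(len(z1))))
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]          # development coefficient and grey input

    def x1_hat(k):                                       # fitted AGO sequence, k = 0, 1, 2, ...
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    k = np.arange(len(x0) + n_forecast)
    fitted_ago = x1_hat(k)
    restored = np.diff(fitted_ago, prepend=0.0)          # inverse AGO back to the original scale
    return restored[len(x0):]                            # the n_forecast future values

# Example: forecast the next two annual concentrations from five observations (placeholder data).
print(gm11([36.0, 34.5, 33.2, 32.0, 30.8], n_forecast=2))
```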
Procedia PDF Downloads 35
16985 The Location of Park and Ride Facilities Using the Fuzzy Inference Model
Authors: Anna Lower, Michal Lower, Robert Masztalski, Agnieszka Szumilas
Abstract:
Contemporary cities are facing serious congestion and parking problems. In urban transport policy, the introduction of a park and ride (P&R) system is an increasingly popular way of limiting vehicular traffic. Determining the location of P&R facilities is a key aspect of the system. Criteria for assessing the quality of a selected location are usually formulated in general, descriptive terms. Studies outsourced to specialists are expensive and time-consuming, and they tend to focus on the examination of only a few selected places. Practice has shown that choosing the location of these sites intuitively, without a detailed analysis of all the circumstances, often gives negative results: the facilities built are then not used as expected. Location methods are also widely treated as a research topic in the scientific literature, but the mathematical models built often do not address the problem comprehensively, e.g. by assuming that the city is linear and developed along one important transport corridor. This paper presents a new method in which expert knowledge is applied in a fuzzy inference model. With such a system, even a less experienced person, e.g. an urban planner or an official, could benefit from it. The analysis result is obtained in a very short time, so a large number of proposed locations can be verified quickly. The proposed method is intended for testing car park locations in a city. The paper shows selected examples of P&R facility locations in cities planning to introduce the P&R system. The analysis of existing facilities is also presented and confronted with the opinions of system users, with particular emphasis on unpopular locations. The research is carried out using the fuzzy inference model, which was built and described in more detail in an earlier paper by the authors. The results of the analyses are compared with the P&R facility location studies commissioned by the city and with the opinions of existing facility users expressed on social networking sites. The analyses of existing facilities conducted with the fuzzy model are consistent with actual user feedback. The proposed method proves to be good, yet it does not require the involvement of a large team of experts or large financial contributions for complicated research. The method also provides an opportunity to indicate alternative locations for P&R facilities. The studies performed show that the method has been confirmed. It can be applied in the urban planning of P&R facility locations in relation to the accompanying functions. Although the results of the method are approximate, they are not worse than the results of analyses by employed experts. The advantage of this method is its ease of use, which simplifies professional expert analysis. The ability to analyze a large number of alternative locations gives a broader view of the problem. It is valuable that the arduous analysis of a team of people can be replaced by the model's calculation. According to the authors, the proposed method is also suitable for implementation on a GIS platform.
Keywords: fuzzy logic inference, park and ride system, P&R facilities, P&R location
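To give a concrete flavour of the rule-based scoring such a model performs (the actual membership functions and rule base are in the authors' earlier paper and are not reproduced here), the sketch below hand-codes a two-input, one-output Mamdani-style inference with triangular memberships and centroid defuzzification; the inputs, rules, and ranges are invented placeholders.

```python
# Sketch: minimal Mamdani fuzzy inference for scoring a candidate P&R location (toy rule base).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership value of x for the triangle (a, b, c)."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def score_location(transit_access, parking_demand):
    """Inputs normalized to 0-10; returns a suitability score in 0-100 (centroid defuzzification)."""
    y = np.linspace(0, 100, 501)                           # output universe: suitability
    low_out, high_out = tri(y, -50, 0, 50), tri(y, 50, 100, 150)

    good_access = tri(transit_access, 5, 10, 15)
    poor_access = tri(transit_access, -5, 0, 5)
    high_demand = tri(parking_demand, 5, 10, 15)

    # Rule 1: good access AND high demand -> high suitability (min as AND, clipping as implication)
    r1 = np.minimum(min(good_access, high_demand), high_out)
    # Rule 2: poor access -> low suitability
    r2 = np.minimum(poor_access, low_out)

    agg = np.maximum(r1, r2)                               # max aggregation of the clipped outputs
    return float((y * agg).sum() / (agg.sum() + 1e-9))     # centroid of the aggregated fuzzy set

print(score_location(transit_access=8.0, parking_demand=9.0))
```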
Procedia PDF Downloads 325
16984 Risk Management and Security Practice in Customs Supply Chain: Application of Cross ABC Method to the Moroccan Customs
Authors: Lamia Hammadi, Abdellah Ait Ouhman, Aomar Ibourk
Abstract:
The customs supply chain is widely regarded as a complex system, due not only to the variety and large number of actors but also to their complex structural links and the interactions between them; this is why the system is subject to various types of risk. The economic, political and social impacts of those risks are highly detrimental to countries, businesses and the public. For this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security and safety. The main questions of a customs risk management approach are: which goods and means of transport should be examined? To what extent? And where should future compliance resources be directed? The purposes of this article are, firstly, to deal with the concept of the customs supply chain and, secondly, to present our risk management approach based on the cross Activity Based Costing (ABC) method as an interactive tool to support decision making in customs risk management. Finally, a case study of the Moroccan customs is analyzed to put the theory into practice and thus draw together the various elements of a structured and efficient risk management approach.
Keywords: cross ABC method, customs supply chain, risk, risk management
Procedia PDF Downloads 379
16983 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System
Authors: Nareshkumar Harale, B. B. Meshram
Abstract:
The continued exponential growth of successful cyber intrusions against today's businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate or effective. The network trust architecture has evolved from trust/untrust to zero trust. With zero trust, essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communication traffic between them, regardless of their location. Information exchange over the Internet, in spite of the inclusion of advanced security controls, remains prone to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for communication over networks, suffers from inherent design vulnerabilities: its communication and session management protocols, routing protocols and security protocols are the cause of major attacks. With the explosion of cyber security threats such as viruses, worms, rootkits, malware and denial-of-service attacks, accomplishing efficient and effective intrusion detection and prevention has become crucial and challenging. In this paper, we propose a design and analysis model for a next generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on the standard TCP/IP protocols, routing protocols and security protocols. It thereby forms the basis for the detection of attack classes, applying signature-based matching for known cyberattacks and data mining-based machine learning approaches for unknown cyberattacks. Our implemented software can effectively detect attacks even when malicious connections are hidden within normal events. The unsupervised learning algorithm applied to network audit data trails results in the detection of unknown intrusions. Association rule mining algorithms generate new rules from the collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show how the approach can be validated and how the analysis results can be used for detecting and protecting against new network anomalies.
Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design
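The association-rule component described above can be prototyped with a generic frequent-itemset miner. The sketch below is an illustration only: it assumes the mlxtend library and an invented set of discretized audit-trail records, not the paper's data or rule format, and it turns frequent event combinations into candidate detection rules by confidence.

```python
# Sketch: mining candidate intrusion rules from discretized audit records with Apriori.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Each record is a set of symbolic features derived from one connection (placeholder data).
records = [
    ["proto=tcp", "flag=SYN", "dst_port=80", "label=normal"],
    ["proto=tcp", "flag=SYN", "dst_port=22", "many_conn", "label=attack"],
    ["proto=udp", "dst_port=53", "label=normal"],
    ["proto=tcp", "flag=SYN", "many_conn", "dst_port=22", "label=attack"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(records).transform(records), columns=te.columns_)

frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.9)

# Keep rules whose consequent is the attack label: these become candidate detection signatures.
candidates = rules[rules["consequents"] == frozenset({"label=attack"})]
print(candidates[["antecedents", "consequents", "support", "confidence"]])
```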
Procedia PDF Downloads 227
16982 Marine Propeller Cavitation Analysis Using BEM
Authors: Ehsan Yari
Abstract:
In this paper, a numerical study of sheet cavitation is performed on the DTMB4119 and E779A marine propellers with the boundary element method. Propeller design involves various geometric and fluid parameters, so a program is needed to solve the flow while taking changes in all of these parameters into account. The characteristic features of the present method are its ability to analyze wetted and cavitating flow around propellers in steady, unsteady, uniform and non-uniform conditions, while decreasing computational time compared to finite volume methods with acceptable precision. Moreover, modification of the position of the detachment point and its corresponding potential value has been considered. The numerical results have been validated against experimental data, showing good agreement.
Keywords: cavitation, BEM, DTMB4119, E779A
Procedia PDF Downloads 69
16981 Mapping of Arenga Pinnata Tree Using Remote Sensing
Authors: Zulkiflee Abd Latif, Sitinor Atikah Nordin, Alawi Sulaiman
Abstract:
Different tree species provide different benefits. The Arenga pinnata tree species has several potential uses that are valuable for the economy and the country. Mapping vegetation using remote sensing involves various processes, techniques and considerations. Using satellite imagery, this approach enables access to inaccessible areas, and with the availability of the near-infrared band it is useful in vegetation analysis, especially in identifying tree species. Pixel-based and object-based classification techniques are used in this study. The pixel-based classification used here is divided into unsupervised and supervised classification. Object-based classification has become a popular alternative classification method; using spectral, texture, color and other information to classify the target makes object-based classification a promising technique. The classification of Arenga pinnata trees is overlaid with elevation, slope and aspect, soil, river and several other data layers to give information regarding the character and living environment of the trees. This paper presents the utilization of remote sensing techniques to map the Arenga pinnata tree species.
Keywords: Arenga Pinnata, pixel-based classification, object-based classification, remote sensing
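As a generic illustration of the pixel-based supervised classification step, and not the authors' workflow, the sketch below assumes a band stack, a training mask, and class codes, reshapes the multispectral image to a pixel-by-band table, and classifies it with a standard learner; an NDVI helper is included since the near-infrared band is mentioned above.

```python
# Sketch: pixel-based supervised classification of a multispectral image.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def classify_pixels(image, train_mask):
    """image: (bands, rows, cols) reflectance stack; train_mask: (rows, cols) with 0 = unlabeled,
    otherwise a class code (e.g. 1 = Arenga pinnata, 2 = other vegetation, 3 = non-vegetation)."""
    bands, rows, cols = image.shape
    X_all = image.reshape(bands, -1).T                 # one row of band values per pixel
    labeled = train_mask.ravel() > 0

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_all[labeled], train_mask.ravel()[labeled])

    return clf.predict(X_all).reshape(rows, cols)      # per-pixel class map

# NDVI from the red and near-infrared bands is a common extra feature for vegetation mapping.
def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-9)
```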
Procedia PDF Downloads 380
16980 Redefining Success Beyond Borders: A Deep Dive into Effective Methods to Boost Morale Among Virtual Workers for Exponential Project Performance
Authors: Florence Ibeh, David Oyewmi Oyekunle, David Boohene
Abstract:
The continuous advancement of information technology has completely transformed how businesses and organizations operate on a global scale. The widespread availability of virtual communication tools enables individuals to opt for remote work. While remote employment offers various benefits, such as facilitating corporate growth and enhancing customer support, it also presents distinct challenges. Therefore, investigating the intricacies of virtual team morale is crucial for ensuring the achievement of project objectives. For this study, content analysis of pre-existing secondary data was employed to examine the phenomenon. Essential elements vital for improving the success of projects within virtual teams were identified. These factors include technology adoption, creating a distraction-free work environment, effective leadership, trust-building, clear communication channels, well-defined task allocation, active team participation, and motivation. Furthermore, the study established a substantial correlation between morale levels and the participation and productivity of virtual team members. Higher levels of morale were associated with optimal performance among virtual teams. The study determined that the key factors for enhancing project performance in virtual teams are the adoption of technology, a focused environment, effective leadership, trust, communication, well-defined tasks, collaborative teamwork, and motivation. Additionally, the study discovered that modifying the optimal strategies employed by in-office teams can enhance the diminished morale prevalent in remote teams to sustain a high level of team morale for virtual teams. The findings of this study are highly significant in the dynamic field of project management. Currently, there is limited information regarding strategies that address challenges arising from external factors in virtual teams, such as ambient noise and disruptions caused by family members. The findings underscore the significance of selecting appropriate communication technologies, delineating distinct roles and responsibilities for virtual team members, and nurturing a culture of accountability and trust. Promoting seamless collaboration and instilling motivation among virtual team members are deemed highly effective in augmenting employee engagement and performance within virtual team setting.Keywords: virtual teams, morale, project performance, distract-free environment, technology adaptation
Procedia PDF Downloads 95
16979 Hsa-miR-326 Functions as a Tumor Suppressor in Non-Small Cell Lung Cancer through Targeting CCND1
Authors: Cheng-Cao Sun, Shu-Jun Li, Cuili Yang, Yongyong Xi, Liang Wang, Feng Zhang, De-Jia Li
Abstract:
Hsa-miRNA-326 (miR-326) has recently been discovered having anticancer efficacy in different organs. However, the role of miR-326 on non-small cell lung cancer (NSCLC) is still ambiguous. In this study, we investigated the role of miR-326 on the development of NSCLC. The results indicated that miR-326 was significantly down-regulated in primary tumor tissues and very low levels were found in NSCLC cell lines. Ectopic expression of miR-326 in NSCLC cell lines significantly suppressed cell growth as evidenced by cell viability assay, colony formation assay and BrdU staining, through inhibition of cyclin D1, cyclin D2, CDK4, and up-regulation of p57(Kip2) and p21(Waf1/Cip1). In addition, miR-326 induced apoptosis, as indicated by concomitantly with up-regulation of key apoptosis protein cleaved caspase-3, and down-regulation of anti-apoptosis protein Bcl2. Moreover, miR-326 inhibited cellular migration and invasiveness through inhibition of matrix metalloproteinases (MMP)-7 and MMP-9. Further, oncogene CCND1 was revealed to be a putative target of miR-326, which was inversely correlated with miR-326 expression in NSCLC. Taken together, our results demonstrated that miR-326 played a pivotal role on NSCLC through inhibiting cell proliferation, migration, invasion, and promoting apoptosis by targeting oncogenic CCND1.Keywords: hsa-miRNA-326 (miR-326), cyclin D1, non-small cell lung cancer (NSCLC), proliferation, apoptosis
Procedia PDF Downloads 306