Search results for: syntactic complexity
905 Modeling the Performance of Natural Sand-Bentonite Barriers after Infiltration with Polar and Non-Polar Hydrocarbon Leachates
Authors: Altayeb Qasem, Mousa Bani Baker, Amani Nawafleh
Abstract:
The complexity of the sand-bentonite liner barrier system calls for an adequate model that reflects the conditions depending on the barrier materials and the characteristics of the permeates, which lead to hydraulic conductivity changes when liners are infiltrated with polar, non-polar, miscible, and immiscible liquids. This paper is dedicated to developing a model for evaluating the hydraulic conductivity in the form of a simple indicator of the compatibility of the liner versus the leachate. The model is based on two liner compositions (95% sand: 5% bentonite; and 90% sand: 10% bentonite), two pressures (40 kPa and 100 kPa), and three leachates: water, ethanol, and biofuel. Two characteristics of the leachates were used: the viscosity of the permeate and its octanol-water partitioning coefficient (Kow). Three characteristics of the liner mixtures that affect the hydraulic conductivity of the liner system were evaluated: the initial bentonite content (%), the free swelling index, and the shrinkage limit of the initial liner mixture. Engineers can use this modest tool to predict a potential liner failure in sand-bentonite barriers.
Keywords: liner performance, sand-bentonite barriers, viscosity, free swelling index, shrinkage limit, octanol-water partitioning coefficient, hydraulic conductivity, theoretical modeling
Procedia PDF Downloads 411
904 Using T-Splines to Model Point Clouds from Terrestrial Laser Scanner
Authors: G. Kermarrec, J. Hartmann
Abstract:
Spline surfaces are a major representation of freeform surfaces in the computer-aided graphics industry and were recently introduced in the field of geodesy for processing point clouds from terrestrial laser scanners (TLS). Surface fitting consists of approximating a large 3D point cloud with a trustworthy mathematical surface. Standard B-spline surfaces lack local refinement due to their tensor-product construction. The consequence is oscillating geometry, particularly in the transition from low- to high-curvature parts for scattered point clouds with missing data. The recently introduced T-splines are a more economical alternative, in terms of the number of parameters, for handling point clouds with a huge number of observations. As long as the partition of unity is guaranteed, their computational complexity is low, and they are flexible. T-splines are implemented in a commercial package called Rhino, a 3D modeler widely used in computer-aided design to create and animate NURBS objects. We have applied T-spline surface fitting to terrestrial laser scanner point clouds from a bridge under load and a sheet pile wall with noisy observations. We will highlight their potential for modelling details with high trustworthiness, paving the way for further applications in terms of deformation analysis.
Keywords: deformation analysis, surface modelling, terrestrial laser scanner, T-splines
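Since T-splines are only available in commercial packages such as Rhino, the sketch below (not from the paper) fits the tensor-product B-spline surface that the abstract uses as its point of comparison, with a synthetic scattered point cloud standing in for TLS data; the smoothing factor and test surface are assumptions.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Synthetic "scan": scattered (x, y, z) samples with small noise, standing in
# for a TLS point cloud. The surface and noise level are illustrative only.
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 1, (2, 2000))
z = np.sin(2 * np.pi * x) * np.cos(np.pi * y) + 0.01 * rng.normal(size=x.size)

# Tensor-product bicubic B-spline fit (the construction T-splines refine locally).
surf = SmoothBivariateSpline(x, y, z, kx=3, ky=3, s=x.size * 1e-4)  # smoothing factor is a guess
print("fitted z at (0.5, 0.5):", float(surf.ev(0.5, 0.5)))
```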
Procedia PDF Downloads 138
903 Offender Rehabilitation: The Middle Way of Maimonides to Mental and Social Health
Authors: Liron Hoch
Abstract:
Traditional religious and spiritual texts offer a surprising wealth of relevant theoretical and practical knowledge about human behavior. This wellspring may contribute significantly to expanding our current body of knowledge in the social sciences and criminology in particular. In Jewish religious texts, specifically those by Maimonides, we can find profound analyses of human traits and guidelines for a normative way of life. Among other things, modern criminological literature attempts to link certain character traits and deviant behaviors. Using the hermeneutic phenomenological approach, we analyzed the writings of Maimonides, mainly the Laws of Human Dispositions, in order to understand Moses ben Maimon's (1138–1204) view of character traits. The analysis yielded four themes: (1) Human personality between nature and nurture; (2) The complexity of human personality, imbalance and criminality; (3) Extremism as a way to achieve balance; and (4) The Middle Way, flexibility and common sense. These themes can serve therapeutic purposes, as well as inform a rehabilitation model. Grounded in a theoretical rationale about the nature of humans, this model is designed to direct individuals to balance their traits through self-reflection and constant practice of the Middle Way. Our proposal is that implementing this model may promote normative behavior and thus contribute to rehabilitating offenders.
Keywords: rehabilitation, traits, offenders, Maimonides, Middle Way
Procedia PDF Downloads 68
902 Artificial Bee Colony Optimization for SNR Maximization through Relay Selection in Underlay Cognitive Radio Networks
Authors: Babar Sultan, Kiran Sultan, Waseem Khan, Ijaz Mansoor Qureshi
Abstract:
In this paper, a novel idea for the performance enhancement of the secondary network is proposed for underlay cognitive radio networks (CRNs). In underlay CRNs, primary users (PUs) impose strict interference constraints on the secondary users (SUs). The proposed scheme is based on Artificial Bee Colony (ABC) optimization for relay selection and power allocation to handle this primary challenge of underlay CRNs. ABC is a simple, population-based optimization algorithm that attains a global optimum solution by combining local search methods (employed and onlooker bees) and global search methods (scout bees). The proposed two-phase relay selection and power allocation algorithm aims to maximize the signal-to-noise ratio (SNR) at the destination while operating in underlay mode. The proposed algorithm has low computational complexity, and its performance is verified through simulation results for different numbers of potential relays, different interference threshold levels, and different transmit power thresholds for the selected relays.
Keywords: artificial bee colony, underlay spectrum sharing, cognitive radio networks, amplify-and-forward
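As a rough illustration of the search described here, the following minimal Artificial Bee Colony sketch optimizes relay transmit powers under a primary-user interference penalty; the channel gains, the simplified SNR/interference model, and all parameter values are illustrative assumptions rather than the paper's system model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical channel gains for 4 candidate relays: source-relay, relay-destination,
# and relay-to-primary-user links. These values are placeholders.
g_sr, g_rd, g_rp = rng.uniform(0.2, 1.0, (3, 4))
P_MAX, I_TH, NOISE = 1.0, 0.3, 0.05

def fitness(p):
    """Destination SNR of the best relay, with a large penalty when the
    interference caused at the primary user exceeds the threshold."""
    snr = np.max(g_sr * g_rd * p / NOISE)
    penalty = 1e3 * np.maximum(g_rp * p - I_TH, 0).sum()
    return snr - penalty

def abc_maximize(f, dim, n_food=10, limit=20, iters=200):
    """Minimal ABC sketch: employed bees improve their own food source, onlookers
    pick sources proportionally to fitness, scouts replace exhausted sources."""
    foods = rng.uniform(0, P_MAX, (n_food, dim))
    fits = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)

    def try_neighbour(i):
        k = rng.choice([j for j in range(n_food) if j != i])
        d = rng.integers(dim)
        cand = foods[i].copy()
        cand[d] = np.clip(cand[d] + rng.uniform(-1, 1) * (cand[d] - foods[k, d]), 0, P_MAX)
        fc = f(cand)
        if fc > fits[i]:
            foods[i], fits[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                      # employed bee phase
            try_neighbour(i)
        shifted = fits - fits.min() + 1e-9           # onlooker bee phase
        probs = shifted / shifted.sum()
        for _ in range(n_food):
            try_neighbour(rng.choice(n_food, p=probs))
        for i in np.where(trials > limit)[0]:        # scout bee phase
            foods[i] = rng.uniform(0, P_MAX, dim)
            fits[i], trials[i] = f(foods[i]), 0

    best = int(np.argmax(fits))
    return foods[best], fits[best]

powers, best_fit = abc_maximize(fitness, dim=4)
print("selected relay:", int(np.argmax(g_sr * g_rd * powers)), "fitness:", round(best_fit, 3))
```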
Procedia PDF Downloads 581
901 Enhancing Patch Time Series Transformer with Wavelet Transform for Improved Stock Prediction
Authors: Cheng-yu Hsieh, Bo Zhang, Ahmed Hambaba
Abstract:
Stock market prediction has long been an area of interest for both expert analysts and investors, driven by its complexity and the noisy, volatile conditions it operates under. This research examines the efficacy of combining the Patch Time Series Transformer (PatchTST) with wavelet transforms, specifically focusing on Haar and Daubechies wavelets, in forecasting the adjusted closing price of the S&P 500 index for the following day. By comparing the performance of the augmented PatchTST models with traditional predictive models such as Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformers, this study highlights significant enhancements in prediction accuracy. The integration of the Daubechies wavelet with PatchTST notably excels, surpassing other configurations and conventional models in terms of Mean Absolute Error (MAE) and Mean Squared Error (MSE). The success of the PatchTST model paired with Daubechies wavelet is attributed to its superior capability in extracting detailed signal information and eliminating irrelevant noise, thus proving to be an effective approach for financial time series forecasting.
Keywords: deep learning, financial forecasting, stock market prediction, patch time series transformer, wavelet transform
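A minimal sketch of the wavelet preprocessing idea, assuming a soft universal threshold and a synthetic price series; the actual denoising rule used with PatchTST is not specified in the abstract.

```python
import numpy as np
import pywt

def wavelet_denoise(prices, wavelet="db4", level=3, k=1.0):
    """Decompose the series with a Haar or Daubechies wavelet, soft-threshold
    the detail coefficients, and reconstruct a smoother series that could then
    be fed to a forecaster. Threshold rule and parameters are assumptions."""
    coeffs = pywt.wavedec(prices, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate from the finest scale
    thr = k * sigma * np.sqrt(2 * np.log(len(prices)))    # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(prices)]

# Example: denoise a synthetic price-like series before model training.
prices = np.cumsum(np.random.default_rng(1).normal(0, 1, 512)) + 100
smooth_haar = wavelet_denoise(prices, wavelet="haar")
smooth_db = wavelet_denoise(prices, wavelet="db4")
```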
Procedia PDF Downloads 48
900 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines
Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.
Abstract:
Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, number of strokes, and variations in stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, especially in Kerala and the Lakshadweep islands. In this paper, we consider significant feature extraction for the similar stroke styles of Malayalam. The extracted feature set is also suitable for the recognition of other handwritten South Indian languages such as Tamil, Telugu, and Kannada. A classification scheme based on support vector machines (SVM) is proposed to improve the accuracy in classification and recognition of online Malayalam handwritten characters. SVM classifiers are well suited to real-world applications. The contribution of various features towards the accuracy of recognition is analysed. Performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles are taken for each of the 44 letters. Various features are extracted and used for classification after preprocessing of the input data samples. The highest recognition accuracy of 97% is obtained experimentally with the best feature combination and a polynomial kernel in SVM.
Keywords: SVM, MATLAB, Malayalam, South Indian scripts, online handwritten character recognition
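A minimal sketch of the classification stage with a polynomial-kernel SVM; the paper's experiments were run in MATLAB, and the placeholder features below only stand in for the stroke features it extracts.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data standing in for extracted stroke features (stroke count,
# direction, curvature, ...) of 44 character classes; values are synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(880, 16))            # 20 samples per class, 16 features
y = np.repeat(np.arange(44), 20)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```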
Procedia PDF Downloads 574
899 Banking and Accounting Analysis Researches Effect on Environment and Income
Authors: Gerges Samaan Henin Abdalla
Abstract:
Highly secure methods of banking services, such as online banking, have been introduced to customers. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions by using the Internet as a distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is this time consuming, but it is also a repeatable activity with a certain frequency. To solve this problem, the concept of account aggregation was added as a solution. Account consolidation in e-banking, as a form of electronic banking, appears to build a stronger relationship with customers. An account linking service is generally referred to as a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. The article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.
Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development
Procedia PDF Downloads 44
898 Improving Decision-Making in Multi-Project Environments within Organizational Information Systems Using Blockchain Technology
Authors: Seyed Hossein Iranmanesh, Hassan Nouri, Seyed Reza Iranmanesh
Abstract:
In the dynamic and complex landscape of today’s business, organizations often face challenges in impactful decision-making across multi-project settings. To efficiently allocate resources, coordinate tasks, and optimize project outcomes, establishing robust decision-making processes is essential. Furthermore, the increasing importance of information systems and their integration within organizational workflows introduces an additional layer of complexity. This research proposes the use of blockchain technology as a suitable solution to enhance decision-making in multi-project environments, particularly within the realm of information systems. The conceptual framework in this study comprises four independent variables and one dependent variable. The identified independent variables are: blockchain layer in integrated systems, quality of generated information, user satisfaction with integrated systems, and utilization of integrated systems. The project’s performance, considered as the dependent variable and moderated by organizational policies and procedures, reflects the impact of blockchain technology adoption on organizational effectiveness. The results highlight the significant influence of blockchain implementation on organizational performance.
Keywords: multi-project environments, decision support systems, information systems, blockchain technology, decentralized systems
Procedia PDF Downloads 55
897 Explicit Numerical Approximations for a Pricing Weather Derivatives Model
Authors: Clarinda V. Nhangumbe, Ercília Sousa
Abstract:
Weather derivatives are financial instruments used to cover non-catastrophic weather events and can be expressed in the form of standard (plain vanilla) products or structured (exotic) products. The underlying asset, in this case, is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of the weather derivatives structure exposes the weakness of the Black-Scholes framework. Therefore, under the risk-neutral probability measure, the option price of a weather contract can be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the price of the option, one can use numerical methods such as Monte Carlo simulation and implicit finite difference schemes combined with semi-Lagrangian methods. This paper proposes two explicit methods, namely, first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One of the advantages of these methods is that they take into consideration the boundary conditions obtained from the financial interpretation and deal efficiently with the different choices of the convection coefficients.
Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives
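The snippet below is a generic explicit step for a 1D convection-diffusion equation, using first-order upwinding for the convection term and central differences for the diffusion term; it is only a schematic stand-in for the paper's two schemes, and the coefficients, grid, time step, and initial condition are assumptions.

```python
import numpy as np

def explicit_step(V, a, b, dx, dt):
    """One explicit time step for V_t + a(x) V_x = b(x) V_xx:
    upwind difference for the convection term, central difference for diffusion.
    Boundary values V[0] and V[-1] are left to the financially motivated BCs."""
    Vn = V.copy()
    back = (V - np.roll(V, 1)) / dx            # backward difference (used where a > 0)
    fwd = (np.roll(V, -1) - V) / dx            # forward difference (used where a < 0)
    conv = np.where(a > 0, a * back, a * fwd)
    diff = b * (np.roll(V, -1) - 2 * V + np.roll(V, 1)) / dx**2
    Vn[1:-1] = V[1:-1] + dt * (diff[1:-1] - conv[1:-1])
    return Vn

# Illustrative grid, coefficients, and payoff-like initial condition.
x = np.linspace(0.0, 1.0, 101)
V = np.maximum(x - 0.5, 0.0)
V = explicit_step(V, a=0.4 * np.ones_like(x), b=0.01 * np.ones_like(x),
                  dx=x[1] - x[0], dt=1e-4)
```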
Procedia PDF Downloads 82
896 Aqueous Two Phase Extraction of Jonesia denitrificans Xylanase 6 in PEG 1000/Phosphate System
Authors: Nawel Boucherba, Azzedine Bettache, Abdelaziz Messis, Francis Duchiron, Said Benallaoua
Abstract:
The impetus for research in the field of bioseparation has been sparked by the difficulty and complexity of the downstream processing of biological products. Indeed, 50% to 90% of the production cost of a typical biological product resides in the purification strategy. There is a need for efficient and economical large-scale bioseparation techniques that achieve high purity and high recovery while maintaining the biological activity of the molecule. One such purification technique that meets these criteria involves the partitioning of biomolecules between two immiscible phases in an aqueous two-phase system (ATPS). The production of xylanases is carried out in 500 ml of a liquid medium based on birchwood xylan. In each ATPS, PEG 1000 is added to a mixture consisting of dipotassium phosphate, sodium chloride, and the culture medium inoculated with the strain Jonesia denitrificans, and the mixture is adjusted to different pH values. The concentration of PEG 1000 was varied from 8 to 16% and the NaCl percentage from 2 to 4%, while the other parameters were kept constant. The results showed that the best ATPS for the purification of xylanases is composed of 8.33% PEG 1000, 13.14% K2HPO4, and 1.62% NaCl at pH 7. We obtained a yield of 96.62%, a partition coefficient of 86.66, and a purification factor of 2.9. The zymogram showed that the activity is mainly detected in the top phase.
Keywords: Jonesia denitrificans BN13, xylanase, aqueous two-phase system, zymogram
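For reference, the standard ATPS performance metrics reported here can be computed as below; the activity and volume values are illustrative only and are not the paper's raw data.

```python
# Textbook definitions of ATPS performance metrics, applied to hypothetical values.
def partition_coefficient(c_top, c_bottom):
    return c_top / c_bottom                      # K = activity concentration, top / bottom

def yield_top(c_top, v_top, total_activity_loaded):
    return 100.0 * c_top * v_top / total_activity_loaded   # % of loaded activity in top phase

def purification_factor(spec_act_top, spec_act_crude):
    return spec_act_top / spec_act_crude         # specific activity vs. crude extract

# Illustrative numbers only (chosen to be of the same order as the reported results):
print(partition_coefficient(c_top=433.3, c_bottom=5.0))                   # ~86.7
print(yield_top(c_top=433.3, v_top=6.7, total_activity_loaded=3004.0))    # ~96.6 %
print(purification_factor(spec_act_top=14.5, spec_act_crude=5.0))         # 2.9
```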
Procedia PDF Downloads 397
895 Woman, House, Identity: The Study of the Role of House in Constructing the Contemporary Dong Minority Woman’s Identity
Authors: Sze Wai Veera Fung, Peter W. Ferretto
Abstract:
Similar to most ethnic groups in China, men of the Dong minority hold the primary position in policymaking, moral authority, social values, and the control of property. As the spatial embodiment of patriarchal ideals, the house plays a significant role in producing and reproducing the distinctive gender status within Dong society. Nevertheless, Dong women do not see their home as a cage of confinement, nor do they see themselves as victims of oppression. For these women, with reference to their productive identity, a house is a dwelling place with manifold meanings, including a proof of identity, an economic instrument, and a public resource operating at the community level. This paper examines the role of the house as a central site for identity construction and maintenance for southern-dialect Dong minority women in Hunan, China. Drawing on recent interviews with Dong women, this study argues that women, as productive individuals, have a strong influence on the form of their house and the immediate environment, regardless of the male-dominated social construct of Dong society. The aim of this study is not to produce a definitive relationship between women, house, and identity. Rather, it seeks to offer an alternative lens into the complexity and diversity of gender dynamics operating in and beyond the boundary of the house in the context of contemporary rural China.
Keywords: conception of home, Dong minority, house, rural China, woman’s identity
Procedia PDF Downloads 137
894 A Simple Computational Method for the Gravitational and Seismic Soil-Structure-Interaction between New and Existent Buildings Sites
Authors: Nicolae Daniel Stoica, Ion Mierlus Mazilu
Abstract:
This is a numerical study that addresses the design of new buildings in the immediate 3D vicinity of existing buildings. With today's continuous development and congestion of urban centers, there is a big question about the influence of new buildings on an already existing neighboring site. Thus, in this study, we tried to focus on how existing buildings may be affected by newly constructed buildings and to what extent this influence really decreases. The problem of modeling the interaction between buildings is not simple anywhere in the world, and Romania is no exception. Unfortunately, designers most often do not perform calculations that can determine how close to reality these 3D influences are, either with the simplified method or with more advanced methods. In much of the literature, building a "shield" (piles or diaphragm walls) is considered absolutely sufficient to stop the influence between the buildings, and so the soil under the structure is often ignored in the calculation models. The main reasons the soil is neglected in the analysis are related to the complexity of modeling the interaction between soil and structure. In this paper, based on a new simple but efficient methodology, we tried to determine, for a number of case studies, the influence of a new building on an existing one in terms of soil-structure interaction and its effect on structural behavior. The study covers the additional subsidence that may occur during the execution of the new works and after their completion. It also presents the internal force diagrams and deflections in the soil for both the original case and the final stage. This is necessary to assess the expected impact of the new building on existing areas.
Keywords: soil, structure, interaction, piles, earthquakes
Procedia PDF Downloads 290
893 Proposal of Optimality Evaluation for Quantum Secure Communication Protocols by Taking the Average of the Main Protocol Parameters: Efficiency, Security and Practicality
Authors: Georgi Bebrov, Rozalina Dimova
Abstract:
In the field of quantum secure communication, there is no evaluation that characterizes quantum secure communication (QSC) protocols in a complete, general manner. The current paper addresses the lack of such an evaluation for QSC protocols by introducing an optimality evaluation, which is expressed as the average over the three main parameters of QSC protocols: efficiency, security, and practicality. For the efficiency evaluation, the common expression of this parameter is used, which incorporates all the classical and quantum resources (bits and qubits) utilized for transferring a certain amount of information (bits) in a secure manner. By using a criteria-based approach (whether or not certain criteria are met), an expression for the practicality evaluation is presented, which accounts for the complexity of the practical realization of a QSC protocol. Based on the error rates that the common quantum attacks (measure-and-resend, intercept-and-resend, probe attack, and entanglement swapping attack) induce, the security evaluation for a QSC protocol is proposed as the minimum function taken over the error rates of the mentioned quantum attacks. For the sake of clarity, an example is presented in order to show how the optimality is calculated.
Keywords: quantum cryptography, quantum secure communication, quantum secure direct communication security, quantum secure direct communication efficiency, quantum secure direct communication practicality
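A minimal sketch of how such an aggregate score could be computed, assuming each parameter is already normalized to [0, 1]; the example values are illustrative, not taken from the paper.

```python
# Optimality as the average of efficiency, security, and practicality, with the
# security score taken as the minimum over the error rates induced by the
# common attacks. Normalisation and example numbers are assumptions.
def security_score(attack_error_rates):
    # a higher induced error rate makes the attack easier to detect,
    # so the worst case (minimum) determines the protocol's security score
    return min(attack_error_rates.values())

def optimality(efficiency, practicality, attack_error_rates):
    return (efficiency + security_score(attack_error_rates) + practicality) / 3.0

attacks = {"measure-and-resend": 0.25, "intercept-and-resend": 0.25,
           "probe": 0.17, "entanglement-swapping": 0.25}
print(optimality(efficiency=0.5, practicality=0.75, attack_error_rates=attacks))
```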
Procedia PDF Downloads 182
892 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs
Authors: Malo Pocheau-Lesteven, Olivier Le Maître
Abstract:
Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis of the system, where each subsystem is modeled and interacts with other subsystems to model the complete system. The coherence of the outputs of the different subsystems is achieved through the use of compatibility constraints, which enforce the coupling between the different subsystems. Due to the complexity of some subsystems and the computational cost of evaluating their respective models, it is often necessary to build surrogate models of these subsystems to allow their repeated evaluation at a relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature is leveraged to evaluate the likelihood of satisfying the compatibility constraints. This paper presents infilling strategies to build accurate surrogate models of the subsystems in areas where they are likely to meet the compatibility constraint. It is shown that these infilling strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application of these methods to velocity prediction programs used in offshore racing naval architecture further demonstrates their applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented.
Keywords: infilling strategy, gaussian process, multi disciplinary analysis, velocity prediction program
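A minimal sketch of the infilling idea under stated assumptions: a Gaussian-process surrogate of a coupling residual g(x) is fitted (the compatibility constraint is g(x) = 0), and candidate points are ranked by the probability that |g(x)| ≤ tol, so new expensive evaluations are spent where the constraint is likely to be met. The residual function, tolerance, and candidate grid are placeholders.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def coupling_residual(x):               # stand-in for an expensive subsystem evaluation
    return np.sin(3 * x) + 0.3 * x

# Fit a GP surrogate on a handful of expensive samples.
X = np.array([[0.1], [0.4], [0.8], [1.5], [2.0]])
y = coupling_residual(X).ravel()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True).fit(X, y)

# Rank candidate points by the probability of satisfying |g(x)| <= tol.
candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
mu, std = gp.predict(candidates, return_std=True)
std = np.maximum(std, 1e-9)
tol = 0.05
p_feasible = norm.cdf((tol - mu) / std) - norm.cdf((-tol - mu) / std)
x_next = candidates[np.argmax(p_feasible)]   # next infill point for the surrogate
print("suggested infill point:", float(x_next))
```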
Procedia PDF Downloads 156
891 DesignChain: Automated Design of Products Featuring a Large Number of Variants
Authors: Lars Rödel, Jonas Krebs, Gregor Müller
Abstract:
Growing price pressure due to the increasing number of global suppliers, the growing individualization of products, and ever-shorter delivery times are major upcoming challenges for industry. In this context, mass personalization stands for the individualized production of customer products in batch size 1 at the price of standardized products. The possibilities of digitalization and automation of technical order processing open up the opportunity for companies to significantly reduce their cost of complexity and their lead times and thus enhance their competitiveness. Many companies already use a range of CAx tools and configuration solutions today. Often, the expert knowledge of employees is hidden in "knowledge silos" and is rarely networked across processes. DesignChain describes the automated digital process from the recording of individual customer requirements, through design and technical preparation, to production. Configurators offer the possibility of mapping variant-rich products within the DesignChain. This transformation of customer requirements into product features makes it possible to generate even complex CAD models, such as those for large-scale plants, on a rule basis. With the aid of an automated CAx chain, production-relevant documents are thus transferred digitally to production. This process, which can be fully automated, allows variants to always be generated on the basis of current version statuses.
Keywords: automation, design, CAD, CAx
Procedia PDF Downloads 74
890 An Approach of Node Model TCnNet: Trellis Coded Nanonetworks on Graphene Composite Substrate
Authors: Diogo Ferreira Lima Filho, José Roberto Amazonas
Abstract:
Nanotechnology opens the door to new paradigms and introduces a variety of novel tools enabling a plethora of potential applications in the biomedical, industrial, environmental, and military fields. This work proposes an integrated node model that applies the concepts of TCNet to networks of nanodevices, where the nodes are cooperatively interconnected in a low-complexity Mealy machine (MM) topology. The node integrates, in a single electronic system, the modules necessary for independent operation in wireless sensor networks (WSNs): rectennas (RF-to-DC power converters), code generators based on a finite state machine (FSM), a trellis decoder, and an on-chip transmitter/receiver, with autonomy in terms of energy sources achieved through energy harvesting. This approach considers the use of a graphene composite substrate (GCS) for the integrated electronic circuits, which meets the following requirements: mechanical flexibility, miniaturization, and optical transparency, besides being ecological. In addition, graphene consists of a layer of carbon atoms arranged in a honeycomb crystal lattice, which has attracted the attention of the scientific community due to its unique electrical characteristics.
Keywords: composite substrate, energy harvesting, finite state machine, graphene, nanotechnology, rectennas, wireless sensor networks
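A minimal sketch of an FSM-based code generator of the kind such a node would integrate: a rate-1/2 convolutional encoder written as a Mealy machine, whose output bits depend on the current state and the current input bit. The generator polynomials (7, 5) in octal and constraint length 3 are illustrative assumptions, not parameters taken from the paper.

```python
def mealy_encode(bits):
    """Rate-1/2 convolutional encoder as a Mealy machine: two output bits per
    input bit, computed from the current state (the last two inputs) and the input."""
    state = 0                                   # state = last two input bits
    out = []
    for b in bits:
        s1, s0 = (state >> 1) & 1, state & 1
        out += [b ^ s1 ^ s0, b ^ s0]            # G1 = 111 (octal 7), G2 = 101 (octal 5)
        state = ((b << 1) | s1) & 0b11          # Mealy state transition
    return out

print(mealy_encode([1, 0, 1, 1]))               # -> [1, 1, 1, 0, 0, 0, 0, 1]
```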
Procedia PDF Downloads 104
889 Transitioning Teacher Identity during COVID-19: An Australian Early Childhood Education Perspective
Authors: J. Jebunnesa, Y. Budd, T. Mason
Abstract:
COVID-19 changed the pedagogical expectations of early childhood education as many teachers across Australia had to quickly adapt to new teaching practices such as remote teaching. An important factor in the successful implementation of any new teaching and learning approach is teacher preparation; however, due to the pandemic, the transformation to remote teaching was immediate. A timely question to be asked is how early childhood teachers managed the transition from face-to-face teaching to remote teaching and what was learned through this time. This study explores the experiences of early childhood educators in Australia during COVID-19 lockdowns. Data were collected from an online survey conducted through the official Facebook forum of “Early Childhood Education and Care Australia,” and a constructivist grounded theory methodology was used to analyse the data. Initial research results suggest changing expectations of teachers’ roles and responsibilities during the lockdown, with a significant category related to transitioning teacher identities emerging. The concept of transitioning represents the shift from the role of early childhood educator to educational innovator, essential worker, social worker, and health officer. The findings illustrate the complexity of early childhood educators’ roles during the pandemic.
Keywords: changing role of teachers, constructivist grounded theory, lessons learned, teaching during COVID-19
Procedia PDF Downloads 97
888 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but the existing algorithms have poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparsifying transforms and measurement matrices is carried out. An FECG signal compression and transmission scheme based on the rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction in this scheme is increased by nearly 10%, and the runtime is reduced by about 30%.
Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal
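For orientation, the sketch below implements classic orthogonal matching pursuit, the greedy baseline that block and multi-selection variants such as the proposed JBMOLS extend, using a Bernoulli measurement matrix as mentioned in the abstract; the synthetic sparse signal is a placeholder for wavelet-domain FECG coefficients.

```python
import numpy as np

def omp(Phi, y, sparsity):
    """Orthogonal matching pursuit: greedily select the atom most correlated
    with the residual, then re-fit the coefficients on the current support."""
    m, n = Phi.shape
    residual, support = y.copy(), []
    x_hat = np.zeros(n)
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat[support] = coef
    return x_hat

# Toy demonstration with a +/-1 Bernoulli measurement matrix.
rng = np.random.default_rng(0)
n, m, k = 128, 48, 5
Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
x_rec = omp(Phi, Phi @ x, sparsity=k)
print("max reconstruction error:", np.max(np.abs(x_rec - x)))
```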
Procedia PDF Downloads 127
887 Development of Tourism Infrastructure and Cultural Heritage: Case of Gobustan Preserve
Authors: Rufat Nuriyev
Abstract:
Located in the eastern part of the Republic of Azerbaijan, on the western shore of the Caspian Sea, the Gobustan National Reserve was inscribed on the World Heritage List as the Gobustan Rock Art Cultural Landscape in 2007. Gobustan is an outstanding rock art landscape, where over 6000 rock engravings have been found and registered, dating from the end of the Upper Paleolithic to the Middle Ages. As a rock art center, Gobustan seeks to stimulate public awareness and disseminate knowledge of prehistoric art to enrich educational, cultural, and artistic communities regionally, nationally, and internationally. Following the Decree of the President of the Republic of Azerbaijan and the accompanying “Action Plan,” the planned actions began to be implemented, some of them ahead of the stipulated date. To attract visitors and improve service quality in the museum-reserve, various activities are organized. The building of a new museum center at the foot of Beyukdash Mountain was completed in 2011. The main aim of the new museum building and exhibition was to provide a better understanding of the importance of this monument for the local community, Azerbaijani culture, and the world. In the Petroglyph Museum at Gobustan, digital and traditional media are closely integrated to reveal the complexity of the historical, cultural, and artistic meaning of the prehistoric rock carvings of Gobustan. Alongside electronic devices, visitors get the opportunity for direct contact with artifacts and the ancient rock carvings.
Keywords: Azerbaijan, Gobustan, rock art, museum
Procedia PDF Downloads 299
886 Remotely Sensed Data Fusion to Extract Vegetation Cover in the Cultural Park of Tassili, South of Algeria
Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur
Abstract:
The cultural park of the Tassili, occupying a large area of Algeria, is characterized by a rich vegetative biodiversity that must be preserved and managed both in time and space. Managing such a large area (as in the case of the Tassili) is complex and requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), making the use of conventional and traditional methods quite difficult. Remote sensing, owing to its efficiency in environmental applications, has become an indispensable solution for this kind of study. Multispectral imaging sensors have been very useful in the last decade in many interesting remote sensing applications. They can aid in several domains, such as the detection and identification of diverse surface targets, topographical details, and geological features. In this work, we try to extract vegetative areas using fusion techniques between data acquired from the Hyperion sensor on board the Earth Observing-1 (EO-1) satellite and the Landsat ETM+ and TM sensors. We used images acquired over the Oasis of Djanet in the National Park of Tassili in the south of Algeria. Fusion techniques were applied to the obtained image to extract the vegetative fraction of the different land-use classes. We compare the obtained results in vegetation endmember extraction with vegetation indices calculated from both Hyperion and the other multispectral sensors.
Keywords: Landsat ETM+, EO1, data fusion, vegetation, Tassili, Algeria
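As an example of the vegetation indices mentioned, the sketch below computes NDVI from red and near-infrared reflectance bands; the band values and vegetation threshold are placeholders.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalised Difference Vegetation Index from NIR and red reflectance
    (e.g. Landsat ETM+ bands 4 and 3); a common first step for mapping the
    vegetative fraction before or alongside fusion and unmixing."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy example: pixels with NDVI above ~0.3 are flagged as vegetated.
nir = np.array([[0.45, 0.10], [0.60, 0.20]])
red = np.array([[0.10, 0.09], [0.12, 0.18]])
veg_mask = ndvi(nir, red) > 0.3
print(veg_mask)
```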
Procedia PDF Downloads 432
885 A Mechanical Diagnosis Method Based on Vibration Fault Signal down-Sampling and the Improved One-Dimensional Convolutional Neural Network
Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui
Abstract:
Convolutional neural networks (CNNs) have received extensive attention in the field of fault diagnosis. Many fault diagnosis methods use CNNs for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network needs to perform a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and an improved one-dimensional convolutional neural network is proposed. Through robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, and down-sampling is then applied to reduce the subsequent computational load. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and the computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multiple connected layers generalize classification results better without cumbersome parameter adjustments. The effectiveness of the method is verified by monitoring the signal of a centrifugal pump test bench, and the average test accuracy is above 98%. When compared with the traditional deep belief network (DBN) and support vector machine (SVM) methods, this method shows better performance.
Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN
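A minimal sketch of the improved-1D-CNN idea (small convolution kernels to limit parameters, regularization before the fully connected layer); the channel counts, kernel size, input length, and number of classes are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class Small1DCNN(nn.Module):
    """Minimal 1D CNN: 3-tap kernels keep the parameter count low, and dropout
    before the classifier acts as the regularization mentioned in the abstract."""
    def __init__(self, num_classes: int, in_length: int = 1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(0.5),
            nn.Linear(32 * (in_length // 4), num_classes),
        )

    def forward(self, x):                 # x: (batch, 1, in_length)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# Example: 4 fault classes, a batch of 8 down-sampled vibration segments of length 1024.
model = Small1DCNN(num_classes=4)
logits = model(torch.randn(8, 1, 1024))
```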
Procedia PDF Downloads 130
884 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses
Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau
Abstract:
Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.
Keywords: exam length, psychometric criteria, synthetic experimental designs, test length
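A minimal sketch of a paired two-one-sided-tests (TOST) equivalence check of the kind the abstract refers to; the equivalence margin and the synthetic scores are assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

def paired_tost(full, short, delta, alpha=0.05):
    """TOST on paired score differences: declare equivalence if the mean
    difference between full-exam and shortened-exam performance lies within
    +/- delta at level alpha (both one-sided tests must reject)."""
    d = np.asarray(full, float) - np.asarray(short, float)
    n, mean, se = len(d), d.mean(), d.std(ddof=1) / np.sqrt(len(d))
    t_low = (mean + delta) / se             # H0: mean <= -delta
    t_high = (mean - delta) / se            # H0: mean >= +delta
    p_low = stats.t.sf(t_low, df=n - 1)
    p_high = stats.t.cdf(t_high, df=n - 1)
    return max(p_low, p_high) < alpha

# Synthetic percentage scores for 224 students (illustrative only).
rng = np.random.default_rng(2)
full_scores = rng.normal(70, 10, 224)
short_scores = full_scores + rng.normal(0, 3, 224)
print("equivalent within 5 points:", paired_tost(full_scores, short_scores, delta=5.0))
```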
Procedia PDF Downloads 271
883 The Ontological Memory in Bergson as a Conceptual Tool for the Analysis of the Digital Conjuncture
Authors: Douglas Rossi Ramos
Abstract:
The current digital conjuncture, called by some authors the "Internet of Things" (IoT), "Web 2.0", or even "Web 3.0", consists of a network that encompasses any communication of objects and entities, such as data, information, technologies, and people. At this juncture, especially characterized by an "object socialization," communication can no longer be represented as a simple informational flow of messages from a sender, crossing a channel or medium, and reaching a receiver. The idea of communication must, therefore, be thought of more broadly, so that it becomes possible to analyze the communicative process in terms of interactions between humans and nonhumans. To think about this complexity, a communicative process that encompasses both humans and other communicating beings or entities (objects and things), it is necessary to constitute a new epistemology of communication and to rethink concepts and notions commonly attributed to humans, such as 'memory.' This research aims to contribute to this epistemological constitution through a discussion of the notion of memory according to the complex ontology of Henri Bergson. Among the results (the notion of memory in Bergson presents itself as a conceptual tool for the analysis of posthumanism and the anthropomorphic conjuncture of the new digital advent), there emerged the need to think of an ontological memory, analyzed as a being in itself (the being-in-itself of memory), as a strategy for understanding the forms of interaction and communication that constitute the new digital conjuncture, in which communicating beings or entities tend to interact with each other. Rethinking the idea of communication beyond the dimension of transmission of informative sequences paves the way for an ecological perspective on the condition of digital dwelling.
Keywords: communication, digital, Henri Bergson, memory
Procedia PDF Downloads 161
882 Iterative Dynamic Programming for 4D Flight Trajectory Optimization
Authors: Kawser Ahmed, K. Bousson, Milca F. Coelho
Abstract:
4D flight trajectory optimization is one of the key ingredients for improving flight efficiency and enhancing air traffic capacity in current air traffic management (ATM). The present paper explores iterative dynamic programming (IDP) as a potential numerical optimization method for 4D flight trajectory optimization. IDP is an iterative version of the dynamic programming (DP) method. Due to its numerical framework, DP is very well suited to dealing with nonlinear discrete dynamic systems. The 4D waypoint representation of the flight trajectory is similar to a discretization by a grid system; thus, DP is a natural method for 4D flight trajectory optimization. However, the computational time and space complexity demanded by DP are enormous due to the immense number of grid points required to find the optimum, which prevents the use of DP in many practical high-dimensional problems. On the other hand, IDP has shown the potential to deal successfully with high-dimensional optimal control problems even with a small number of grid points at each stage, which reduces the computational effort compared with the traditional DP approach. Although IDP has been applied successfully to chemical engineering problems, it is yet to be validated in 4D flight trajectory optimization problems. In this paper, IDP has been successfully used to generate minimum-length 4D optimal trajectories that avoid obstacles in their path, such as no-fly zones or residential areas when flying at low altitude to reduce noise pollution.
Keywords: 4D waypoint navigation, iterative dynamic programming, obstacle avoidance, trajectory optimization
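For contrast with IDP, the sketch below runs plain dynamic programming (Bellman value iteration) on a small 2D grid with obstacle cells, which is the kind of grid-based cost-to-go computation that IDP refines iteratively with fewer grid points; the grid size, unit step costs, and obstacle layout are illustrative assumptions.

```python
import numpy as np

def grid_dp_shortest_path(obstacles, goal, shape):
    """Backward DP on a 2D grid: cost-to-go to `goal` with unit step cost,
    skipping obstacle cells (e.g. a no-fly zone). Follow the steepest descent
    of the returned cost field to recover a shortest obstacle-free path."""
    cost = np.full(shape, np.inf)
    cost[goal] = 0.0
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    changed = True
    while changed:                                # Bellman updates until convergence
        changed = False
        for i in range(shape[0]):
            for j in range(shape[1]):
                if (i, j) in obstacles:
                    continue
                for di, dj in moves:
                    ni, nj = i + di, j + dj
                    if 0 <= ni < shape[0] and 0 <= nj < shape[1]:
                        c = 1.0 + cost[ni, nj]
                        if c < cost[i, j]:
                            cost[i, j] = c
                            changed = True
    return cost

shape = (20, 20)
obstacles = {(i, 10) for i in range(3, 17)}       # a hypothetical no-fly "wall"
cost_to_go = grid_dp_shortest_path(obstacles, goal=(19, 19), shape=shape)
print("cost from start (0, 0):", cost_to_go[0, 0])
```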
Procedia PDF Downloads 160
881 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today’s telecommunication systems. The signal interference caused by acoustic echo is distracting to both users and reduces the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and examine the behavior of the least mean square (LMS), normalized least mean square (NLMS), variable step-size least mean square (VSLMS), variable step-size normalized least mean square (VSNLMS), new varying step-size LMS (NVSSLMS), and recursive least squares (RLS) algorithms in reducing the echo and increasing communication quality.
Keywords: adaptive acoustic echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
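A minimal NLMS echo-cancellation sketch, assuming a hypothetical room echo path and illustrative filter order and step size.

```python
import numpy as np

def nlms_echo_canceller(far_end, mic, order=64, mu=0.5, eps=1e-8):
    """NLMS: adapt an FIR estimate of the echo path so the echo of the far-end
    (loudspeaker) signal is subtracted from the microphone signal."""
    w = np.zeros(order)                              # adaptive filter weights
    e = np.zeros(len(mic))                           # residual (echo-cancelled output)
    for n in range(order - 1, len(mic)):
        x = far_end[n - order + 1 : n + 1][::-1]     # far(n), far(n-1), ..., far(n-order+1)
        y = w @ x                                    # estimated echo
        e[n] = mic[n] - y                            # error drives the adaptation
        w += (mu / (eps + x @ x)) * e[n] * x         # normalized LMS update
    return e, w

# Synthetic demonstration with an assumed 4-tap echo path.
rng = np.random.default_rng(0)
far = rng.normal(size=4000)
echo_path = np.array([0.6, 0.3, -0.1, 0.05])
mic = np.convolve(far, echo_path)[: len(far)] + 0.01 * rng.normal(size=4000)
residual, w_est = nlms_echo_canceller(far, mic)
```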
Procedia PDF Downloads 78
880 Resilient Regions for Purpose of Crisis Management
Authors: Jana Gebhartova, Tomas Duda, Ivan Benes
Abstract:
The world is characterized by constantly emerging new links, increasing complexity, and the growing speed of processes in society. Besides political and financial mechanisms and institutions, the globalized world needs functional supply chains. Transport and supply chains can be interrupted in the case of natural disasters, conflicts and civil disorder, sudden demand shocks, export/import restrictions, or terrorism. Long-term interruption of services crucial for human existence can result in the breakdown of the whole society. If global supply chains can be interrupted, the ability to survive a crisis situation depends on local self-sufficiency, that is, on ensuring water, food, and energy. In the world of the 21st century, a new way of thinking, based on the concept of resilience, is needed. Planning for self-sufficiency and resilience must be part of the agenda of local governments. The paper presents the first results of research project VF20112015518, “Security of population – crisis management,” which deals with the issue of critical infrastructure, ensuring regional self-sufficiency in crisis situations, and issues related to population protection and water, energy, and food security. The project is carried out within the Security Research programme of the Ministry of the Interior of the Czech Republic in 2011–2015.
Keywords: crisis management, resilience, indicators of self-sufficiency, continuity of supplies
Procedia PDF Downloads 377
879 Promoting Diversity in Leadership: Exploring Women's Roles in Corporate Governance, with a Focus on Saudi Arabia
Authors: Norah Salem Al Mosa
Abstract:
This paper critically examines the ethical position of academic scholarship concerning "women in leadership" in Saudi Arabia, focusing on the context of the Saudi Vision 2030 initiative. While this vision places a strong emphasis on empowering women and increasing their presence in the workforce, women still face significant cultural, organisational, and personal barriers to leadership roles. The existing literature highlights the challenges Saudi women encounter, including the male guardianship system, and international perspectives add complexity to the issue. The debate among scholars about considering cultural context versus highlighting ongoing challenges is explored. The paper underscores that, despite efforts to enhance women's representation in leadership positions, progress has been slow due to cultural norms, the absence of legal quotas, and limited access to education and professional development. It raises questions about the seriousness of research efforts and the government's commitment to gender equality in leadership roles, emphasising the need for increased academic scrutiny in this area. Ultimately, the paper aims to enhance understanding of the challenges and opportunities for women in leadership roles, their contributions to corporate governance in Saudi Arabia, and potential implications beyond its borders.
Keywords: female directors, gender diversity, women in executive positions, Saudi Vision 2030
Procedia PDF Downloads 59
878 Leveraging Large Language Models to Build a Cutting-Edge French Word Sense Disambiguation Corpus
Authors: Mouheb Mehdoui, Amel Fraisse, Mounir Zrigui
Abstract:
With the increasing amount of data circulating over the Web, there is a growing need to develop and deploy tools aimed at unraveling semantic nuances within texts or sentences. The challenges in extracting precise meanings arise from the complexity of natural language, as words usually have multiple interpretations depending on the context. Precisely interpreting a word within a given context is the task that Word Sense Disambiguation (WSD) addresses. It is a long-standing domain within Natural Language Processing aimed at determining the meaning a word carries in a particular context, thereby increasing the correctness of applications that process language. Numerous linguistic resources are accessible online, including WordNet, thesauri, and dictionaries, enabling the exploration of diverse contextual meanings. However, several limitations persist. These include the scarcity of resources for certain languages, the limited number of examples within corpora, and the challenge of accurately detecting the topic or context covered by a text, which significantly impacts word sense disambiguation. This paper discusses the different approaches to WSD and reviews the corpora available for this task. We contrast these approaches and highlight their limitations, which allows us to build a French corpus targeted at WSD.
Keywords: semantic enrichment, disambiguation, context fusion, natural language processing, multilingual applications
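To make the task concrete, the snippet below runs a knowledge-based WSD baseline (simplified Lesk via NLTK's English WordNet); it only illustrates the disambiguation task itself, not the French, LLM-assisted corpus-building pipeline the paper proposes.

```python
# Requires: nltk.download("wordnet") and nltk.download("omw-1.4").
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

sentence = "I went to the bank to deposit my pay check".split()
sense = lesk(sentence, "bank", pos=wn.NOUN)   # picks the synset whose gloss overlaps most with the context
print(sense, "-", sense.definition())
```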
Procedia PDF Downloads 3
877 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm
Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta
Abstract:
Robots play an important role in operations like pick-and-place, assembly, spot welding, and many more in manufacturing industries. Among these, assembly is a very important process in manufacturing, where the assembly process accounts for about 20% of the total manufacturing cost. To perform the assembly task effectively, Assembly Sequence Planning (ASP) is required. ASP is a multi-objective, non-deterministic optimization problem; achieving the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to solve the ASP problem, but these have several limitations, such as convergence to local optima, a huge search space, long execution times, and complexity in applying the algorithm. With the above limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the Artificial Bee Colony (ABC) algorithm. In this approach, assembly predicates are extracted automatically through a Computer-Aided Design (CAD) interface instead of manually, which reduces the time needed to obtain feasible assembly sequences. The fitness evaluation of the obtained feasible sequences is carried out using the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.
Keywords: assembly sequence planning, CAD, artificial bee colony algorithm, assembly predicates
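A minimal sketch of the kind of hard constraint an ASP fitness function must encode: checking a candidate sequence against precedence predicates. The predicate matrix below is a hypothetical example, not one extracted from a CAD model.

```python
import numpy as np

# Hypothetical precedence predicates for a 5-part product: precedes[i, j] = 1
# means part i must be assembled before part j. Values are illustrative only.
precedes = np.zeros((5, 5), dtype=int)
precedes[0, 2] = precedes[1, 2] = precedes[2, 3] = precedes[3, 4] = 1

def is_feasible(sequence, precedes):
    """A candidate sequence is feasible if no part appears before one of its
    prerequisites -- a typical hard constraint inside an ASP fitness function."""
    position = {part: k for k, part in enumerate(sequence)}
    n = precedes.shape[0]
    return all(position[i] < position[j]
               for i in range(n) for j in range(n) if precedes[i, j])

print(is_feasible([0, 1, 2, 3, 4], precedes))  # True
print(is_feasible([2, 0, 1, 3, 4], precedes))  # False: part 2 placed before parts 0 and 1
```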
Procedia PDF Downloads 235
876 Investigation of Complexity Dynamics in a DC Glow Discharge Magnetized Plasma Using Recurrence Quantification Analysis
Authors: Vramori Mitra, Bornali Sarma, Arun K. Sarma
Abstract:
Recurrence is a ubiquitous feature of any real dynamical system. The states in the phase-space trajectory of a system have an inherent tendency to return to the same state, or a close one, after a certain time lapse. The recurrence quantification analysis (RQA) technique, based on this fundamental feature of a dynamical system, detects the evolution of states under variation of a control parameter of the system. The paper presents an investigation of the nonlinear dynamical behavior of plasma floating potential fluctuations obtained using a Langmuir probe in different magnetic fields under variation of the discharge voltage. The main RQA measures considered are determinism (DET), maximum diagonal line length (Lmax), and entropy. An increase in the DET and Lmax variables indicates that the predictability and periodicity of the system are increasing. The Lmax variable indicates that chaoticity diminishes as the magnetic field drops, while an increase in the magnetic field enhances the chaotic behavior. The fractal properties of the plasma time series, estimated by the detrended fluctuation analysis (DFA) technique, show that the long-range correlation of the plasma fluctuations decreases while the fractal dimension increases with the enhancement of the magnetic field, which corroborates the RQA analysis.
Keywords: detrended fluctuation analysis, chaos, phase space, recurrence
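A minimal sketch of the recurrence-based measures named here (recurrence rate and determinism) for a scalar series; the threshold, minimum line length, and test signal are assumptions, and phase-space embedding is omitted.

```python
import numpy as np

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate (RR) and determinism (DET) from a thresholded
    recurrence matrix of a scalar time series."""
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    rr = R.sum() / n**2
    # Collect diagonal line lengths (excluding the main diagonal / line of identity).
    lengths = []
    for k in range(1, n):
        diag = np.diagonal(R, offset=k)
        run = 0
        for v in list(diag) + [0]:            # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    lengths.append(run)
                run = 0
    det = 2 * sum(lengths) / max(R.sum() - n, 1)   # both off-diagonal triangles by symmetry
    return rr, det

# Stand-in for a floating-potential fluctuation record: a noisy periodic signal.
rng = np.random.default_rng(3)
signal = np.sin(0.2 * np.arange(400)) + 0.1 * rng.normal(size=400)
rr, det = rqa_measures(signal, eps=0.1)
print(f"RR = {rr:.3f}, DET = {det:.3f}")
```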
Procedia PDF Downloads 326