Search results for: Cloud computing
961 Thermodynamics of Aqueous Solutions of Organic Molecule and Electrolyte: Use Cloud Point to Obtain Better Estimates of Thermodynamic Parameters
Authors: Jyoti Sahu, Vinay A. Juvekar
Abstract:
Electrolytes are often used to bring about salting-in and salting-out of organic molecules and polymers (e.g. polyethylene glycols/proteins) from aqueous solutions. For quantification of these phenomena, a thermodynamic model which can accurately predict the activity coefficient of the electrolyte as a function of temperature is needed. The thermodynamic models available in the literature contain a large number of empirical parameters. These parameters are estimated using the lower/upper critical solution temperature of the solution in the electrolyte/organic molecule at different temperatures. Since the number of parameters is large, inaccuracies can creep in during their estimation, which can affect the reliability of prediction beyond the range in which these parameters are estimated. The cloud point of a solution is related to its free energy through its temperature and composition derivatives. Hence, cloud point measurements can be used for accurate estimation of the temperature and composition dependence of the parameters in the free energy model. Therefore, if we use a two-pronged procedure in which we first use the cloud point of the solution to estimate some of the parameters of the thermodynamic model and determine the rest using osmotic coefficient data, we gain on two counts. First, since the parameters estimated in each of the two steps are fewer, we achieve higher accuracy of estimation. The second and more important gain is that the resulting model parameters are more sensitive to temperature. This is crucial when we wish to use the model outside the temperature window within which the parameter estimation is sought. The focus of the present work is to prove this proposition. We have used electrolyte (NaCl/Na2CO3)-water-organic molecule (iso-propanol/ethanol) as the model system. The model of Robinson-Stokes-Glueckauf is modified by incorporating temperature-dependent Flory-Huggins interaction parameters. The Helmholtz free energy expression contains, in addition to electrostatic and translational entropic contributions, three Flory-Huggins pairwise interaction contributions between water (w), polymer (p), and salt (s). These parameters depend both on temperature and concentration. The concentration dependence is expressed in the form of a quadratic expression involving the volume fractions of the interacting species, and the temperature dependence is expressed through a prescribed functional form. To obtain the temperature-dependent interaction parameters for the organic molecule-water and electrolyte-water systems, the critical solution temperature of the electrolyte-water-organic molecule system is measured using a cloud point measuring apparatus, and the temperature- and composition-dependent interaction parameters for the electrolyte-water-organic molecule system are estimated from these cloud point measurements. The model is used to estimate the critical solution temperature (CST) of the electrolyte-water-organic molecule solution. We have experimentally determined the critical solution temperature of different compositions of the electrolyte-water-organic molecule solution and compared the results with the estimates based on our model. The two sets of values show good agreement. On the other hand, when only osmotic coefficients are used for estimation of the free energy model, the CST values predicted by the resulting model show poor agreement with the experiments.
Thus, the importance of the CST data in the estimation of parameters of the thermodynamic model is confirmed through this work. Keywords: concentrated electrolytes, Debye-Hückel theory, interaction parameters, Robinson-Stokes-Glueckauf model, Flory-Huggins model, critical solution temperature
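The functional forms referred to in the abstract did not survive extraction. A hedged reconstruction, assuming the usual Flory-Huggins conventions for the three pairwise parameters between water (w), polymer (p), and salt (s), a quadratic concentration dependence, and a two-term temperature dependence, might read:

```latex
% Hypothetical reconstruction -- the exact forms used by the authors are not given in the abstract.
\chi_{ij}(\phi) = a_{ij} + b_{ij}\,\phi_i + c_{ij}\,\phi_i^{2}, \qquad
\chi_{ij}(T) = \alpha_{ij} + \frac{\beta_{ij}}{T}, \qquad ij \in \{wp,\; ws,\; ps\}
```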
Procedia PDF Downloads 391
960 Cloud Points to Create an Innovative and Custom Ankle Foot Orthosis in CAD Environment
Authors: Y. Benabid, K. Benfriha, V. Rieuf, J. F. Omhover
Abstract:
This paper describes an approach to creating custom concepts for innovative products; the approach describes the relations between innovation tools and the Computer Aided Design environment (using creativity sessions and design tools). A model for the design process is proposed and explored in order to describe the power tools used to create and improve an innovative product, all based upon a range of data (cloud points) in this study. A comparison between the traditional method and the innovative method helps us generate and put forward a new model of the design process in order to create a custom Ankle Foot Orthosis (AFO) in a CAD environment that improves and controls motion. The custom concept needs substantial development in different environments; the relation between these environments is described. The results can help surgeons in the upstream treatment phases. CAD models can be applied and accepted by professionals in design and manufacturing systems. This development is based on the anatomy of the population of North Africa. Keywords: ankle foot orthosis, CAD, reverse engineering, sketch
Procedia PDF Downloads 455
959 A Variant of a Double Structure-Preserving QR Algorithm for Symmetric and Hamiltonian Matrices
Authors: Ahmed Salam, Haithem Benkahla
Abstract:
Recently, an efficient backward-stable algorithm for computing eigenvalues and vectors of a symmetric and Hamiltonian matrix has been proposed. The method preserves the symmetric and Hamiltonian structures of the original matrix, during the whole process. In this paper, we revisit the method. We derive a way for implementing the reduction of the matrix to the appropriate condensed form. Then, we construct a novel version of the implicit QR-algorithm for computing the eigenvalues and vectors.Keywords: block implicit QR algorithm, preservation of a double structure, QR algorithm, symmetric and Hamiltonian structures
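As background (standard definitions, not restated in the abstract), the double structure being preserved can be stated as follows: the matrix is symmetric and Hamiltonian, where Hamiltonian means that JH is symmetric.

```latex
% Standard definitions (background, not reproduced from the paper).
J = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix}, \qquad
H \in \mathbb{R}^{2n \times 2n} \text{ is Hamiltonian} \iff (JH)^{T} = JH
\iff H = \begin{pmatrix} A & G \\ Q & -A^{T} \end{pmatrix}, \; G = G^{T}, \; Q = Q^{T}.
```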
Procedia PDF Downloads 409
958 End-to-End Control and Management of Multi-AS Virtual Service Networks Using SDN and Autonomic Computing Architecture
Authors: Yong Xue, Daniel A. Menascé
Abstract:
Automated and end-to-end network resource management and provisioning for virtual service networks in a multiple autonomous systems (a.k.a multi-AS) environment is a challenging and open problem. This paper proposes a novel, scalable and interoperable high-level architecture that incorporates a number of emerging enabling technologies including Software Defined Network (SDN), Network Function Virtualization (NFV), Service Oriented Architecture (SOA), and Autonomic Computing. The proposed architecture can be used to not only automate network resource management and provisioning for virtual service networks across multiple autonomous substrate networks, but also provide an adaptive capability for achieving optimal network resource management and maintaining network-level end-to-end network performance as well. The paper argues that this SDN and autonomic computing based architecture lays a solid foundation that can facilitate the development of the future Internet based on the pluralistic paradigm.Keywords: virtual network, software defined network, virtual service network, adaptive resource management, SOA, multi-AS, inter-domain
Procedia PDF Downloads 531
957 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm
Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra
Abstract:
With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry. Vision systems have a huge number of applications. They can be used in quality control, object detection, data reading (e.g., QR codes), etc. A large part of them is used for measurement purposes. Some of them make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data that can be acquired by active or passive methods. Due to the specific application in airfield technology, only passive methods are applicable because of other existing systems working on the site, which could be blinded on most spectral levels. Furthermore, the reconstruction is required to work at long distances ranging from hundreds of meters to tens of kilometers with a low loss of accuracy, even in harsh conditions such as fog, rain, or snow. In response to those requirements, HRESS (Heliport REmote Safeguard System) was developed, whose main part is a rotational head with a two-camera stereovision rig gathering images around the head in 360 degrees, along with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain increased distance measurement resolution and an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm introduces operations enabling the filtering of erroneously collected data in the point cloud. All activities on the programming, mechanical, and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area. Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction
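As an illustration of the triangulation step on which any stereovision 3D reconstruction rests (this is a generic sketch, not the HRESS pipeline, and the matcher parameters are placeholders):

```python
# Illustrative sketch only: depth Z = f * B / d for focal length f (pixels),
# baseline B (metres) and disparity d (pixels).
import numpy as np
import cv2

def depth_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    # Semi-global block matching; parameter values here are placeholders.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # mark invalid matches
    return focal_px * baseline_m / disparity    # depth map in metres
```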
Procedia PDF Downloads 124
956 A Unified Approach for Digital Forensics Analysis
Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles
Abstract:
Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the subsequent digital footprints that exist, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT), databases, drones, cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and a good deal of data sources have little to no specialist forensic tools. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely manner without having to trawl through data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to investigate cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies what forensic analyses could be used. For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially a large number), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to providing digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis. Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool
Procedia PDF Downloads 196
955 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing such massive amounts of data. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can often be time-consuming and prone to errors. The automated conversion of a sequential program to a DISC program will consequently significantly improve productivity. However, synthesizing a user’s intended program from an input specification is complex, with several important applications, such as distributed program synthesizing and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique to identify sequential components and translate them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
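A minimal illustration of the kind of translation the paper targets (this example is not taken from ROOP itself): a sequential summation loop and its equivalent distributed formulation as a Spark map-reduce.

```python
# Not from ROOP itself -- a generic sequential fragment and its DISC equivalent.
from pyspark import SparkContext

def sequential_sum_of_squares(values):
    total = 0
    for v in values:          # sequential fragment a synthesizer would detect
        total += v * v
    return total

def distributed_sum_of_squares(values):
    sc = SparkContext.getOrCreate()
    # equivalent distributed formulation: map then reduce over a distributed dataset
    return sc.parallelize(values).map(lambda v: v * v).reduce(lambda a, b: a + b)
```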
Procedia PDF Downloads 107
954 Risk Measure from Investment in Finance by Value at Risk
Authors: Mohammed El-Arbi Khalfallah, Mohamed Lakhdar Hadji
Abstract:
Managing and controlling risk is a research topic in the world of finance. When facing a risky situation, stakeholders need to compare positions and actions, and financial institutions must take measures of particular market and credit risks. In this work, we study a model of risk measure in finance: Value at Risk (VaR), which is a new tool for measuring an entity's risk exposure. We explain the concept of value at risk and its average and tail variants, and describe the three methods for computing it: the parametric method, the historical method, and the Monte Carlo numerical method. Finally, we briefly describe the advantages and disadvantages of the three methods for computing value at risk. Keywords: average value at risk, conditional value at risk, tail value at risk, value at risk
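A minimal sketch of the three estimators listed above, assuming a vector of daily returns as input (the notation and confidence level are illustrative):

```python
# Sketch of parametric (normal), historical and Monte Carlo VaR at level alpha.
import numpy as np
from scipy.stats import norm

def parametric_var(returns, alpha=0.95):
    mu, sigma = np.mean(returns), np.std(returns, ddof=1)
    return -(mu + sigma * norm.ppf(1 - alpha))   # loss quantile under normality

def historical_var(returns, alpha=0.95):
    return -np.quantile(returns, 1 - alpha)      # empirical loss quantile

def monte_carlo_var(returns, alpha=0.95, n_sims=100_000, seed=0):
    rng = np.random.default_rng(seed)
    mu, sigma = np.mean(returns), np.std(returns, ddof=1)
    simulated = rng.normal(mu, sigma, n_sims)    # resimulated returns
    return -np.quantile(simulated, 1 - alpha)
```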
Procedia PDF Downloads 441
953 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment
Authors: Ella Sèdé Maforikan
Abstract:
Accurate land cover mapping is essential for effective environmental monitoring and natural resources management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen due to its high-performance computing capabilities, mitigating computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training on remote sensing data. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment
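A hedged sketch of the GEE workflow described above; the collection ID is a real Sentinel-2 surface-reflectance archive, but the training asset, band list, and label property are placeholders rather than the study's actual inputs:

```python
# Illustrative GEE Random Forest classification; asset names are hypothetical.
import ee
ee.Initialize()

composite = (ee.ImageCollection('COPERNICUS/S2_SR')
             .filterDate('2020-06-01', '2021-03-31')
             .median())                       # Sentinel-2 composite (illustrative)
bands = ['B2', 'B3', 'B4', 'B8']              # plus indices/terrain in the study
samples = composite.select(bands).sampleRegions(
    collection=ee.FeatureCollection('users/example/training_points'),  # placeholder asset
    properties=['landcover'], scale=10)
classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=samples, classProperty='landcover', inputProperties=bands)
classified = composite.select(bands).classify(classifier)
```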
Procedia PDF Downloads 63
952 Learning to Teach on the Cloud: Preservice EFL Teachers’ Online Project-Based Practicum Experience
Authors: Mei-Hui Liu
Abstract:
This paper reports 20 preservice EFL teachers’ learning-to-teach experience when they were engaged in an online project-based practicum implemented on a Cloud Platform. This 10-month study filled a gap in the literature by documenting the impact of online project-based instruction on preservice EFL teachers’ professional development. Data analysis showed that the online practicum was regarded as a flexible mechanism offering chances for teaching practice without geographical barriers. Additionally, this project-based practice helped the participants integrate the theories they had learned and further fostered their ability to create a self-directed online learning environment. Furthermore, these preservice teachers with experience of a technology-enabled practicum showed their motivation to apply technology and online platforms in future instructional practices. Yet, this study uncovered several concerns encountered by these participants during this online field experience. The findings of this study offer meaningful lessons for teacher educators intending to integrate an online practicum into preservice training courses. Keywords: online teaching practicum, project-based learning, teacher preparation, English language education
Procedia PDF Downloads 371
951 A Cloud-Based Federated Identity Management in Europe
Authors: Jesus Carretero, Mario Vasile, Guillermo Izquierdo, Javier Garcia-Blas
Abstract:
Currently, there is a so-called ‘identity crisis’ in cybersecurity caused by the substantial security, privacy, and usability shortcomings encountered in existing systems for identity management. Federated Identity Management (FIM) could be a solution to this crisis, as it is a method that facilitates the management of identity processes and policies among collaborating entities without enforcing a global consistency, which is difficult to achieve when there are legacy ID systems. To cope with this problem, the Connecting Europe Facility (CEF) initiative proposed in 2014 a federated solution in anticipation of the adoption of Regulation (EU) N°910/2014, the so-called eIDAS Regulation. At present, a network of eIDAS Nodes is being deployed at the European level so that every citizen recognized by a member state is recognized within the trust network at the European level, enabling the consumption of services in other member states that, until now, were not allowed or whose concession was tedious. This is a very ambitious approach, since it tends to enable cross-border authentication of member state citizens without the need to unify the authentication method (eID Scheme) of the member state in question. However, this federation is currently managed by member states, and it is initially applied only to citizens and public organizations. The goal of this paper is to present the results of a European project, named eID@Cloud, that focuses on the integration of eID in 5 cloud platforms belonging to authentication service providers of different EU Member States to act as Service Providers (SP) for private entities. We propose an initiative based on a private eID Scheme both for natural and legal persons. The methodology followed in the eID@Cloud project is that each Identity Provider (IdP) is subscribed to an eIDAS Node Connector, which requests authentication and is in turn subscribed to an eIDAS Node Proxy Service, which issues authentication assertions. To cope with high loads, load balancing is supported in the eIDAS Node. The eID@Cloud project is still ongoing, but we already have some important outcomes. First, we have deployed the federated identity nodes and tested them from the security and performance points of view. The pilot prototype has shown the feasibility of deploying this kind of system, ensuring good performance due to the replication of the eIDAS nodes and the load balancing mechanism. Second, our solution avoids the propagation of identity data out of the native domain of the user or entity being identified, which avoids problems well known in cybersecurity due to network interception, man-in-the-middle attacks, etc. Last, but not least, this system allows any country or collectivity to be connected easily, providing incremental development of the network and avoiding difficult political negotiations to agree on a single authentication format (which would be a major stopper). Keywords: cybersecurity, identity federation, trust, user authentication
Procedia PDF Downloads 166
950 Exploration into Bio Inspired Computing Based on Spintronic Energy Efficiency Principles and Neuromorphic Speed Pathways
Authors: Anirudh Lahiri
Abstract:
Neuromorphic computing, inspired by the intricate operations of biological neural networks, offers a revolutionary approach to overcoming the limitations of traditional computing architectures. This research proposes the integration of spintronics with neuromorphic systems, aiming to enhance computational performance, scalability, and energy efficiency. Traditional computing systems, based on the Von Neumann architecture, struggle with scalability and efficiency due to the segregation of memory and processing functions. In contrast, the human brain exemplifies high efficiency and adaptability, processing vast amounts of information with minimal energy consumption. This project explores the use of spintronics, which utilizes the electron's spin rather than its charge, to create more energy-efficient computing systems. Spintronic devices, such as magnetic tunnel junctions (MTJs) manipulated through spin-transfer torque (STT) and spin-orbit torque (SOT), offer a promising pathway to reducing power consumption and enhancing the speed of data processing. The integration of these devices within a neuromorphic framework aims to replicate the efficiency and adaptability of biological systems. The research is structured into three phases: an exhaustive literature review to build a theoretical foundation, laboratory experiments to test and optimize the theoretical models, and iterative refinements based on experimental results to finalize the system. The initial phase focuses on understanding the current state of neuromorphic and spintronic technologies. The second phase involves practical experimentation with spintronic devices and the development of neuromorphic systems that mimic synaptic plasticity and other biological processes. The final phase focuses on refining the systems based on feedback from the testing phase and preparing the findings for publication. The expected contributions of this research are twofold. Firstly, it aims to significantly reduce the energy consumption of computational systems while maintaining or increasing processing speed, addressing a critical need in the field of computing. Secondly, it seeks to enhance the learning capabilities of neuromorphic systems, allowing them to adapt more dynamically to changing environmental inputs, thus better mimicking the human brain's functionality. The integration of spintronics with neuromorphic computing could revolutionize how computational systems are designed, making them more efficient, faster, and more adaptable. This research aligns with the ongoing pursuit of energy-efficient and scalable computing solutions, marking a significant step forward in the field of computational technology.Keywords: material science, biological engineering, mechanical engineering, neuromorphic computing, spintronics, energy efficiency, computational scalability, synaptic plasticity.
Procedia PDF Downloads 43
949 Terrestrial Laser Scans to Assess Aerial LiDAR Data
Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani
Abstract:
The quality of DEMs may depend on several factors, such as the data source, the capture method, the type of processing used to derive them, or the cell size of the DEM. The two most important capture methods for producing regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by the national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation by means of a method that takes into account the superficial nature of the DEM and, therefore, samples surfaces rather than points. This work is part of the research project "Functional Quality of Digital Elevation Models in Engineering", where it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the post-processing tasks carried out to obtain the point cloud used as the reference (PCref) to evaluate the PCpro quality. Each PCref consists of a 50x50 m patch obtained from the registration of 4 different scan stations. The area studied was the Spanish region of Navarra, which covers an area of 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole that reached heights of up to 7 meters; the scanner was mounted in an inverted position so that the characteristic shadow circle present when the scanner is in the direct position does not appear. To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, and its positioning accuracy was better than 4 cm; this accuracy is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The kind of DEM of interest is the one corresponding to the bare earth, so it was necessary to apply a filter to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the post-processing tasks, the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process. Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy
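One way to perform the cloud-to-cloud comparison mentioned at the end of the abstract is sketched below (file names are placeholders, not the project's data):

```python
# Illustrative cloud-to-cloud comparison with Open3D; file names are hypothetical.
import numpy as np
import open3d as o3d

pc_ref = o3d.io.read_point_cloud("patch_ref_tls.ply")    # terrestrial reference patch
pc_pro = o3d.io.read_point_cloud("patch_pro_lidar.ply")  # airborne LiDAR product
# nearest-neighbour distance from every product point to the reference cloud
dists = np.asarray(pc_pro.compute_point_cloud_distance(pc_ref))
print(f"mean={dists.mean():.3f} m, RMSE={np.sqrt((dists**2).mean()):.3f} m")
```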
Procedia PDF Downloads 100
948 Regression Approach for Optimal Purchase of Hosts Cluster in Fixed Fund for Hadoop Big Data Platform
Authors: Haitao Yang, Jianming Lv, Fei Xu, Xintong Wang, Yilin Huang, Lanting Xia, Xuewu Zhu
Abstract:
Given a fixed fund, purchasing fewer hosts of higher capability or, inversely, more hosts of lower capability is a trade-off that must be made in practice when building a Hadoop big data platform. An exploratory study is presented for a Housing Big Data Platform project (HBDP), where typical big data computing consists of SQL queries with aggregates, joins, and space-time condition selections executed upon massive data from more than 10 million housing units. In HBDP, an empirical formula was introduced to predict the performance of candidate host clusters for the intended typical big data computing, and it was shaped via a regression approach. With this empirical formula, it is easy to suggest an optimal cluster configuration. The investigation was based on a typical Hadoop computing ecosystem: HDFS+Hive+Spark. A suitable metric was introduced to measure the performance of Hadoop clusters in HBDP, which was tested and compared with its predicted counterpart on the execution of three kinds of typical SQL query tasks. Tests were conducted with respect to the factors of CPU benchmark, memory size, virtual host division, and the number of physical hosts in the cluster. The research has been applied to practical cluster procurement for housing big data computing. Keywords: Hadoop platform planning, optimal cluster scheme at fixed-fund, performance predicting formula, typical SQL query tasks
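A minimal sketch of the regression idea: fit a model that predicts query runtime from cluster-configuration factors, then evaluate candidate configurations a fixed fund allows. The feature columns and numbers are illustrative, not the paper's empirical formula:

```python
# Illustrative regression over cluster factors; all values are made-up examples.
import numpy as np
from sklearn.linear_model import LinearRegression

# columns: CPU benchmark score, memory (GB) per host, virtual hosts per physical
# host, number of physical hosts; target: measured runtime of a benchmark query.
X = np.array([[800, 64, 2, 4], [800, 128, 2, 6], [1200, 64, 4, 6], [1200, 128, 4, 8]])
y = np.array([410.0, 300.0, 260.0, 190.0])      # seconds (stand-in measurements)
model = LinearRegression().fit(X, y)

candidate = np.array([[1000, 96, 3, 6]])         # a configuration the fund allows
print(model.predict(candidate))                  # predicted runtime -> compare options
```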
Procedia PDF Downloads 232
947 mm-Wave Wearable Edge Computing Module Hosted by Printed Ridge Gap Waveguide Structures: A Physical Layer Study
Authors: Matthew Kostawich, Mohammed Elmorsy, Mohamed Sayed Sifat, Shoukry Shams, Mahmoud Elsaadany
Abstract:
6G communication systems represent the nominal future extension of current wireless technology, where its impact is extended to touch upon all human activities, including medical, security, and entertainment applications. As a result, human needs are allocated among the highest priority aspects of the system design and requirements. 6G communications is expected to replace all the current video conferencing with interactive virtual reality meetings involving high data-rate transmission merged with massive distributed computing resources. In addition, the current expansion of IoT applications must be mitigated with significant network changes to provide a reasonable Quality of Service (QoS). This directly implies a high demand for Human-Computer Interaction (HCI) through mobile computing modules in future wireless communication systems. This article proposes the utilization of a Printed Ridge Gap Waveguide (PRGW) to host the wearable nodes. To the best of our knowledge, we propose for the first time a physical layer analysis within the context of a complete architecture. A thorough study is provided on the impact of the distortion of the guiding structure on the overall system performance. The proposed structure shows small latency and small losses, highlighting its compatibility with future applications.Keywords: ridge gap waveguide, edge computing module, 6G, multimedia IoT applications
Procedia PDF Downloads 71
946 An Exploratory Study Applied to Search Relationship between Humans and Universe
Authors: Mohamed Hashelaf, Ahmed Al-Osdody
Abstract:
In this paper, we focused our efforts on one of the vaguest subjects in astrophysics, which is the formation and evolution of the universe up to the arrival of humans. Through an in-depth exploration of the origins of the universe, understanding what has happened since the Big Bang until now, and checking the history of creation, we can answer questions about the future of life, the possibility of its existence elsewhere in the universe, and be able to understand how we came to be, what our role in the circle of life is, and what the future of our development will be. Here is where we used systematic steps that allowed us, first and foremost, to identify the reason behind the Big Bang itself, which formed a large cloud of cosmic dust. Then, after a period of time during which the universe expanded and cooled, the initial gas molecules from the cosmic cloud began to condense, forming a very dense field of gravity that after millions of years led to the formation of stars, galaxies, and even Earth and the other planets. Finally, it became clear to us that after the Earth had formed, the existence of liquid water made it possible for life to form, starting from bacteria all the way to the appearance of the humans that we know today. But it does not stop here. If we look and contemplate within ourselves as humans, we will understand that the universe is inside us, and that is what makes us exceptional. All of this means that just as life was created on Earth, it could have been created on other planets as well. It also means that we are the universe’s key to understanding itself. Keywords: Big Bang, cosmic dust, primary elements, universe
Procedia PDF Downloads 134
945 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life
Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar
Abstract:
In recent years, owing to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. Among them, artificial intelligence, through its own continuous progress and the continuous development of computing practitioners, has given rise to another branch, namely distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by the ability to take advantage of large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large data sets. Nowadays, distributed AI is widely used in military, medical, and everyday applications and brings great convenience and efficient operation to life. In this paper, we will discuss three areas of distributed AI computing systems (vision processing, blockchain, and smart home) to introduce the performance of distributed systems and the role of AI in distributed systems. Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home
Procedia PDF Downloads 113
944 Image Encryption Using Eureqa to Generate an Automated Mathematical Key
Authors: Halima Adel Halim Shnishah, David Mulvaney
Abstract:
Applying traditional symmetric cryptography algorithms for encryption and decryption provides secret keys with immunity against different attacks. One popular technique for generating automated secret keys is evolutionary computing using the Eureqa API tool, which gained attention in 2013. In this paper, we generate automated secret keys for image encryption and decryption using the Eureqa API (a tool used in evolutionary computing). The Eureqa API models pseudo-random input data obtained from a suitable source to generate secret keys. The validity of the generated secret keys is investigated by performing various statistical tests (histogram, chi-square, correlation of two adjacent pixels, correlation between the original and encrypted images, entropy, and key sensitivity). Experimental results obtained from methods including histogram analysis, correlation coefficient, entropy, and key sensitivity show that the proposed image encryption algorithms are secure and reliable, with the potential to be adapted for secure image communication applications. Keywords: image encryption algorithms, Eureqa, statistical measurements, automated key generation
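Two of the validation metrics listed above can be sketched as follows, assuming an 8-bit grayscale image held as a NumPy array (this is a generic sketch, not the authors' code):

```python
# Entropy and adjacent-pixel correlation for an 8-bit grayscale image.
import numpy as np

def shannon_entropy(img):
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))               # ideal for ciphertext: close to 8 bits

def horizontal_adjacent_correlation(img):
    x = img[:, :-1].ravel().astype(float)
    y = img[:, 1:].ravel().astype(float)
    return np.corrcoef(x, y)[0, 1]                # ideal for ciphertext: close to 0
```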
Procedia PDF Downloads 483
943 Intelligent Computing with Bayesian Regularization Artificial Neural Networks for a Nonlinear System of COVID-19 Epidemic Model for Future Generation Disease Control
Authors: Tahir Nawaz Cheema, Dumitru Baleanu, Ali Raza
Abstract:
In this research work, we design intelligent computing through Bayesian Regularization artificial neural networks (BRANNs) introduced to solve the mathematical modeling of infectious diseases (Covid-19). The dynamical transmission is due to the interaction of people, and its mathematical representation is based on the system's nonlinear differential equations. The dataset for the Covid-19 model is generated by exploiting the power of the explicit Runge-Kutta method for different countries of the world, such as India, Pakistan, Italy, and many more. The generated dataset is split for the training, testing, and validation processes at every frequent update of the Bayesian Regularization backpropagation, capturing the numerical behavior of the dynamics of the Covid-19 model. The performance and effectiveness of the designed BRANN methodology are checked through the mean squared error, error histograms, numerical solutions, absolute error, and regression analysis. Keywords: mathematical models, Bayesian regularization, Bayesian-regularization backpropagation networks, regression analysis, numerical computing
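A hedged sketch of the data-generation step: an explicit Runge-Kutta integration of a basic SIR system, producing the kind of dataset a BRANN would then be trained, tested, and validated on. The SIR form and parameter values are illustrative, not the paper's exact model:

```python
# Illustrative SIR data generation; the paper's model and parameters differ.
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta=0.3, gamma=0.1):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

sol = solve_ivp(sir, (0, 160), [0.999, 0.001, 0.0], method='RK45',
                t_eval=np.linspace(0, 160, 400))
dataset = np.column_stack([sol.t, sol.y.T])   # training/testing/validation splits follow
```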
Procedia PDF Downloads 147
942 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data -a collection of data points in space, typically produced by 3D scanners- into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
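A generic stand-in for the feature-extraction stage (not the authors' own algorithms): RANSAC plane segmentation pulls out wall or floor candidates that would later be mapped to parametric BIM elements:

```python
# Illustrative plane extraction with Open3D; the input file name is hypothetical.
import open3d as o3d

cloud = o3d.io.read_point_cloud("scan.ply")
plane, inliers = cloud.segment_plane(distance_threshold=0.02,
                                     ransac_n=3, num_iterations=1000)
a, b, c, d = plane                                    # plane ax + by + cz + d = 0
wall_or_floor = cloud.select_by_index(inliers)        # candidate architectural element
remainder = cloud.select_by_index(inliers, invert=True)
```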
Procedia PDF Downloads 63
941 Large Eddy Simulation of Particle Clouds Using Open-Source CFD
Authors: Ruo-Qian Wang
Abstract:
Open-source CFD has become increasingly popular and promising. Recent progress in multiphase flow modeling enables new CFD applications, providing an economic and flexible research tool for complex flow problems. Our numerical study uses four-way coupled Euler-Lagrangian Large-Eddy Simulations to resolve particle cloud dynamics with OpenFOAM and CFDEM: the fractioned Navier-Stokes equations are numerically solved for the fluid phase motion, the solid phase motion is addressed by Lagrangian tracking of every single particle, and total momentum is conserved by fluid-solid inter-phase coupling. A grid convergence test was performed, which shows that the current resolution of the mesh is appropriate. Then, we validated the code by comparing numerical results with experiments in terms of particle cloud settlement and growth. A good agreement was obtained, showing the reliability of the present numerical schemes. The time and height at phase separation were defined and analyzed for a variety of initial release conditions. Empirical formulas were drawn to fit the results. Keywords: four-way coupling, dredging, land reclamation, multiphase flows, oil spill
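Schematically, and with notation assumed rather than copied from the paper, the four-way-coupled formulation pairs a filtered fluid-momentum equation carrying a particle feedback force with a Lagrangian equation of motion for each particle:

```latex
% Schematic only -- notation and the exact force terms are assumptions.
\frac{\partial \bar{u}_i}{\partial t} + \bar{u}_j \frac{\partial \bar{u}_i}{\partial x_j}
 = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
 + \nu \nabla^{2} \bar{u}_i - \frac{\partial \tau_{ij}}{\partial x_j} + f_{p,i},
\qquad
m_p \frac{d\mathbf{v}_p}{dt} = \mathbf{F}_{\mathrm{drag}} + \mathbf{F}_{\mathrm{collision}} + m_p \mathbf{g}.
```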
Procedia PDF Downloads 429
940 Computing the Similarity and the Diversity in the Species Based on Cronobacter Genome
Authors: E. Al Daoud
Abstract:
The purpose of computing the similarity and the diversity in the species is to trace the process of evolution, to find the relationships between the species, and to discover the unique, the special, the common, and the universal proteins. The proteins of the whole genomes of 40 species are compared with the Cronobacter genome, which is used as the reference genome. More than 3 billion pairwise alignments are performed using blastp. Several findings are introduced in this study; for example, we found 172 proteins in the Cronobacter genome which have insignificant hits in other species, 116 significant proteins in all the tested species with very high score values, and 129 proteins that are common in plants but have insignificant hits in mammals, birds, fishes, and insects. Keywords: genome, species, blastp, conserved genes, Cronobacter
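The pairwise-comparison step can be sketched as follows; file names and thresholds are illustrative, and the outfmt 6 column layout is the standard blastp tabular output:

```python
# Illustrative blastp comparison against a Cronobacter reference database.
import csv
import subprocess

subprocess.run(["makeblastdb", "-in", "cronobacter.faa", "-dbtype", "prot",
                "-out", "cronobacter_db"], check=True)
subprocess.run(["blastp", "-query", "species_X.faa", "-db", "cronobacter_db",
                "-evalue", "1e-5", "-outfmt", "6", "-out", "hits.tsv"], check=True)

best_hits = {}
with open("hits.tsv") as fh:
    for row in csv.reader(fh, delimiter="\t"):
        query, subject, bitscore = row[0], row[1], float(row[11])  # col 12 = bit score
        if query not in best_hits or bitscore > best_hits[query][1]:
            best_hits[query] = (subject, bitscore)
```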
Procedia PDF Downloads 496
939 Particle Observation in Secondary School Using a Student-Built Instrument: Design-Based Research on a STEM Sequence about Particle Physics
Authors: J.Pozuelo-Muñoz, E. Cascarosa-Salillas, C. Rodríguez-Casals, A. de Echave, E. Terrado-Sieso
Abstract:
This study focuses on the development, implementation, and evaluation of an instructional sequence aimed at 16–17-year-old students, involving the design and use of a cloud chamber—a device that allows observation of subatomic particles. The research addresses the limited presence of particle physics in Spanish secondary and high school curricula, a gap that restricts students' learning of advanced physics concepts and diminishes engagement with complex scientific topics. The primary goal of this project is to introduce particle physics in the classroom through a practical, interdisciplinary methodology that promotes autonomous learning and critical thinking. The methodology is framed within Design-Based Research (DBR), an approach that enables iterative and pragmatic development of educational resources. The research proceeded in several phases, beginning with the design of an experimental teaching sequence, followed by its implementation in high school classrooms. This sequence was evaluated, redesigned, and reimplemented with the aim of enhancing students’ understanding and skills related to designing and using particle detection instruments. The instructional sequence was divided into four stages: introduction to the activity, research and design of cloud chamber prototypes, observation of particle tracks, and analysis of collected data. In the initial stage, students were introduced to the fundamentals of the activity and provided with bibliographic resources to conduct autonomous research on cloud chamber functioning principles. During the design stage, students sourced materials and constructed their own prototypes, stimulating creativity and understanding of physics concepts like thermodynamics and material properties. The third stage focused on observing subatomic particles, where students recorded and analyzed the tracks generated in their chambers. Finally, critical reflection was encouraged regarding the instrument's operation and the nature of the particles observed. The results show that designing the cloud chamber motivates students and actively engages them in the learning process. Additionally, the use of this device introduces advanced scientific topics beyond particle physics, promoting a broader understanding of science. The study’s conclusions emphasize the need to provide students with ample time and space to thoroughly understand the role of materials and physical conditions in the functioning of their prototypes and to encourage critical analysis of the obtained data. This project not only highlights the importance of interdisciplinarity in science education but also provides a practical framework for teachers to adapt complex concepts for educational contexts where these topics are often absent.Keywords: cloud chamber, particle physics, secondary education, instructional design, design-based research, STEM
Procedia PDF Downloads 13
938 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection
Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa
Abstract:
Light detection and ranging (LiDAR) is an active remote sensing technology used for several applications. Airborne LiDAR is becoming an important technology for the acquisition of highly accurate, dense point clouds. The classification of airborne laser scanning (ALS) point clouds is a very important task that still remains a real challenge for many scientists. The support vector machine (SVM) is one of the most used statistical learning algorithms based on kernels. SVM is a non-parametric method, and it is recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs a robust non-linear classification of samples. In practice, the data are rarely linearly separable. SVMs are able to map the data into a higher-dimensional space in which they become linearly separable, while the kernel allows all the computations to be performed in the original space. This is one of the main reasons that SVMs are well suited for high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain their recent adoption. In this poster, the SVM classification of ALS LiDAR data is proposed. Firstly, connected component analysis is applied to cluster the point cloud. Secondly, the resulting clusters are fed into the SVM classifier. The radial basis function (RBF) kernel is used due to the small number of parameters (C and γ) that need to be chosen, which decreases the computation time. In order to optimize the classification rates, parameter selection is explored. It consists of finding the parameters (C and γ) leading to the best overall accuracy using grid search and 5-fold cross-validation. The exploited LiDAR point cloud is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data used are characterized by a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three other classes belonging to roof superstructures are considered, i.e., a total of 4 classes. The training and test sets were selected randomly several times. The obtained results demonstrated that parameter selection can orient the search within a restricted interval of (C and γ) that can be further explored, but it does not systematically lead to the optimal rates. The SVM classifier with tuned hyper-parameters is compared with the classifiers most used in the literature for LiDAR data: random forest, AdaBoost, and decision tree. The comparison showed the superiority of the SVM classifier with parameter selection for LiDAR data over the other classifiers. Keywords: classification, airborne LiDAR, parameters selection, support vector machine
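A hedged sketch of the parameter-selection step described above: a grid search over (C, γ) for an RBF-kernel SVM with 5-fold cross-validation. The stand-in data merely make the snippet runnable; the real inputs are the per-cluster LiDAR features:

```python
# Grid search over (C, gamma) for an RBF-kernel SVM; data below are stand-ins.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(200, 6)), rng.integers(0, 4, 200)  # 4 classes

param_grid = {"C": [0.1, 1, 10, 100, 1000],
              "gamma": [1e-3, 1e-2, 1e-1, 1, 10]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5,
                      scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train)
print(search.best_params_, search.best_score_)   # selected (C, gamma) and CV accuracy
```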
Procedia PDF Downloads 147
937 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing
Authors: Tolulope Aremu
Abstract:
This paper is based on the idea of using deep learning methodology to optimize production yield by tuning a few key process parameters in a manufacturing environment. The study explicitly addressed how to maximize production yield and minimize operational costs by utilizing advanced neural network models, specifically Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNN). These models were implemented using Python-based frameworks: TensorFlow and Keras. The targets of the research are precision molding processes in which the temperature ranges between 150°C and 220°C, the pressure ranges between 5 and 15 bar, and the material flow rate ranges between 10 and 50 kg/h, critical parameters that have a great effect on yield. A dataset of 1 million production cycles covering five continuous years was considered, with detailed logs showing the exact parameter settings and yield output. The LSTM model captured time-dependent trends in the production data, while the CNN analyzed the spatial correlations between parameters. The models are designed in a supervised learning manner. For the model loss, an MSE loss function is used, optimized through the Adam optimizer. After running a total of 100 training epochs, 95% accuracy was achieved by the models in recommending optimal parameter configurations. Results indicated a 12% increase in production yield compared with the traditional RSM and DOE methods. Besides, the error margin was reduced by 8%, hence consistent product quality from the deep learning models. The monetary value was around $2.5 million annually, the cost saved from material waste, energy consumption, and equipment wear resulting from the implementation of optimized process parameters. The system was deployed in an industrial production environment with the help of a hybrid cloud setup: Microsoft Azure for data storage, while the training and deployment of the models were performed on Google Cloud AI. The functionality of real-time monitoring of the process and automatic tuning of parameters depends on this cloud infrastructure. To put it into perspective, deep learning models, especially those employing LSTM and CNN, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning with a view to achieving further enhancement of system autonomy and scalability across various manufacturing sectors. Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving
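A minimal Keras sketch consistent with the setup described; the window length, layer sizes, and stand-in data are assumptions, not the study's configuration:

```python
# LSTM mapping a short history of (temperature, pressure, flow-rate) settings to yield.
import numpy as np
import tensorflow as tf

window, n_features = 20, 3          # e.g. temperature, pressure, material flow rate
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_features)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1)        # predicted yield for the candidate settings
])
model.compile(optimizer=tf.keras.optimizers.Adam(), loss="mse")

X = np.random.rand(1000, window, n_features).astype("float32")   # stand-in data
y = np.random.rand(1000, 1).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```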
Procedia PDF Downloads 31
936 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes
Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar
Abstract:
Most of the existing skyline query algorithms have focused on querying static points in static databases; however, with the expanding number of sensors, wireless communications, and mobile applications, the demand for continuous skyline queries has increased. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. However, as skyline query computation is based on checking the domination of skyline points over all dimensions, both the static and dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points which will not be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance, and efficiency of our algorithm over other existing approaches. Keywords: continuous query processing, dynamic database, moving object, skyline queries
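The dominance test at the heart of any skyline computation, applied uniformly to all attributes (static and dynamic alike, as the paper advocates), can be sketched as:

```python
# Generic dominance test and naive skyline; the paper's algorithm adds pruning
# and continuous updates on top of this basic notion.
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly better in
    at least one (assuming smaller values are preferred in every dimension)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# e.g. (distance-to-user, price): the first and third tuples form the skyline
print(skyline([(1.0, 30), (2.0, 40), (3.0, 10)]))
```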
Procedia PDF Downloads 210
935 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison
Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo
Abstract:
This work belongs to a research line of computer science that involves the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them in the control of electronic devices. On the other hand, affective computing research applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware involves the sensing stage and analog-to-digital conversion. The interface software involves algorithms for pre-processing of the signal, time and frequency analysis, and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately using a publicly accessible database, together with a comparison among classifiers in order to determine the best-performing one. Keywords: affective computing, interface, brain, intelligent interaction
Procedia PDF Downloads 388
934 Blended Cloud Based Learning Approach in Information Technology Skills Training and Paperless Assessment: Case Study of University of Cape Coast
Authors: David Ofosu-Hamilton, John K. E. Edumadze
Abstract:
Universities have come to recognize the role that Information and Communication Technology (ICT) skills play in the daily activities of tertiary students. The ability to use ICT – essentially, computers and their diverse applications – is an important resource that influences an individual’s economic and social participation and human capital development. Our society now increasingly relies on the Internet and the Cloud as a means to communicate and disseminate information. The educated individual should, therefore, be able to use ICT to create and share knowledge that will improve society. It is, therefore, important that universities require incoming students to demonstrate a level of computer proficiency or train them to do so at minimal cost by deploying advanced educational technologies. The training and standardized assessment of all incoming first-year students of the University of Cape Coast in Information Technology Skills (ITS) have become a necessity, as students more often than not highly overestimate their digital skills, and digital ignorance is costly to any economy. The one-semester course is targeted at fresh students and aimed at enhancing the productivity and software skills of students. In this respect, emphasis is placed on skills that will enable students to be proficient in using Microsoft Office and Google Apps for Education for their academic work and future professional work while using emerging digital multimedia technologies in a safe, ethical, responsible, and legal manner. The course is delivered in blended mode – online and self-paced (student-centered) – using Alison’s free cloud-based tutorial (Moodle) of Microsoft Office videos. Online support is provided via discussion forums on the University’s Moodle platform, and the course is tutor-directed and assisted at the ICT Centre and the Google E-learning laboratory. All students are required to register for the ITS course during either the first or second semester of the first year and must participate in and complete it within a semester. Assessment focuses on the Alison online assessment on Microsoft Office, the Alison online assessment on ALISON ABC IT, peer assessment of an e-portfolio created using Google Apps/Office 365, and an end-of-semester online assessment at the ICT Centre taken whenever the student was ready in the course of the semester. This paper, therefore, focuses on this digital-culture approach of hybrid teaching, learning, and paperless examinations and its possible adoption by other courses or programs at the University of Cape Coast. Keywords: assessment, blended, cloud, paperless
Procedia PDF Downloads 248
933 Process Mining as an Ecosystem Platform to Mitigate a Deficiency of Processes Modelling
Authors: Yusra Abdulsalam Alqamati, Ahmed Alkilany
Abstract:
The teaching staff is a distinct group whose impact on the educational process is significant and which plays an important role in enhancing the quality of the academic education process. To improve the management effectiveness of the academy, the Teaching Staff Management System (TSMS) proposes that all teacher processes be digitized. Although the BPMN approach can accurately describe the processes, it lacks a clear picture of the process flow map, something that the process mining approach has, namely extracting information from event logs for discovery, monitoring, and model enhancement. Therefore, these two methodologies were combined to create the most accurate representation of system operations: the ability to extract data records and mine processes, recreate them in the form of a Petri net, and then generate a BPMN model for a more in-depth view of the process flow. Additionally, the TSMS processes will be orchestrated to handle all requests within a guaranteed short time thanks to the integration of the Google Cloud Platform (GCP) and the BPM engine, while allowing business owners to take part throughout the entire TSMS project development lifecycle. Keywords: process mining, BPM, business process model and notation, Petri net, teaching staff, Google Cloud Platform
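A hedged sketch of the mining step the paper combines with BPMN, using the pm4py library; the event-log file name is hypothetical, not the TSMS data:

```python
# Illustrative discovery of a Petri net and a BPMN view from an event log.
import pm4py

log = pm4py.read_xes("teaching_staff_requests.xes")         # placeholder event log
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)
bpmn_model = pm4py.discover_bpmn_inductive(log)              # BPMN view for stakeholders
pm4py.view_petri_net(net, initial_marking, final_marking)
```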
Procedia PDF Downloads 141
932 Robot Spatial Reasoning via 3D Models
Authors: John Allard, Alex Rich, Iris Aguilar, Zachary Dodds
Abstract:
With this paper we present several experiences deploying novel, low-cost resources for computing with 3D spatial models. Certainly, computing with 3D models undergirds some of our field’s most important contributions to the human experience. Most often, those are contrived artifacts. This work extends that tradition by focusing on novel resources that deliver uncontrived models of a system’s current surroundings. Atop this new capability, we present several projects investigating the student-accessibility of the computational tools for reasoning about the 3D space around us. We conclude that, with current scaffolding, real-world 3D models are now an accessible and viable foundation for creative computational work.Keywords: 3D vision, matterport model, real-world 3D models, mathematical and computational methods
Procedia PDF Downloads 536