Search results for: granular computing
778 An Elasto-Viscoplastic Constitutive Model for Unsaturated Soils: Numerical Implementation and Validation
Authors: Maria Lazari, Lorenzo Sanavia
Abstract:
Mechanics of unsaturated soils has been an active field of research in recent decades. Efficient constitutive models that take into account the partial saturation of soil are necessary to solve a number of engineering problems, e.g., the instability of slopes and cuts due to heavy rainfall. A large number of constitutive models can now be found in the literature that consider fundamental issues associated with unsaturated soil behaviour, such as the volume change and shear strength behaviour with suction or saturation changes. Partially saturated soils may either expand or collapse upon wetting depending on the stress level, and it is also possible that a soil might experience a reversal in its volumetric behaviour during wetting. The shear strength of soils also changes dramatically with the degree of saturation; a related engineering problem is slope failure caused by rainfall. Several state-of-the-art reviews of the topic have appeared in recent years, usually providing a thorough discussion of the stress state, the advantages and disadvantages of specific constitutive models, and the latest developments in the area of unsaturated soil modelling. However, only a few studies have focused on the coupling between partial saturation states and time effects on the behaviour of geomaterials. Rate dependency is experimentally observed in the mechanical response of granular materials, and a viscoplastic constitutive model is capable of reproducing creep and relaxation processes. Therefore, in this work an elasto-viscoplastic constitutive model for unsaturated soils is proposed and validated on the basis of experimental data. The model is an extension of an existing elastoplastic strain-hardening constitutive model capable of capturing the behaviour of variably saturated soils, based on energy-conjugated stress variables in the framework of superposed continua.
The purpose was to develop a model able to deal with possible mechanical instabilities within a consistent energy framework. The model shares the conceptual structure of the elastoplastic laws proposed for bonded geomaterials subject to weathering or diagenesis, and it is capable of modelling several kinds of instabilities induced by the loss of hydraulic bonding contributions. The novelty of the proposed formulation lies in the incorporation of density-dependent stiffness and hardening coefficients, which allows the pycnotropic behaviour of granular materials to be modelled with a single set of material constants. The model has been implemented in the commercial FE platform PLAXIS, widely used in Europe for advanced geotechnical design. The algorithmic strategies adopted for the stress-point algorithm had to be revised to take into account the different approach adopted by the PLAXIS developers in the solution of the discrete non-linear equilibrium equations. An extensive comparison of the model predictions with experimental data reported by different authors is presented to validate the model and illustrate its capabilities. After the validation, the effectiveness of the viscoplastic model is demonstrated by numerical simulations of a partially saturated slope failure at the laboratory scale, and the effect of viscosity and degree of saturation on the slope's stability is discussed.
Keywords: PLAXIS software, slope, unsaturated soils, viscoplasticity
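The creep and relaxation behaviour that motivates the viscoplastic extension can be illustrated with a toy one-dimensional Perzyna-type overstress integration. This is a generic sketch under assumed parameters (function name, fluidity, modulus, and all numbers are illustrative), not the paper's constitutive model:

```python
def relax(sigma0, sigma_y, E=100.0, fluidity=0.05, dt=0.1, steps=200):
    """Explicit Euler integration of d(sigma)/dt = -E*fluidity*<(sigma-sigma_y)/sigma_y>,
    i.e. stress relaxation at fixed total strain in a Perzyna-type model."""
    sigma = sigma0
    history = [sigma]
    for _ in range(steps):
        over = max(sigma - sigma_y, 0.0) / sigma_y   # Macaulay bracket: overstress ratio
        sigma -= E * fluidity * over * dt            # viscoplastic flow relaxes the stress
        history.append(sigma)
    return history

# Hold total strain fixed with an initial stress above the yield threshold:
h = relax(sigma0=150.0, sigma_y=100.0)
```

The stress decays monotonically toward the yield value, which is the relaxation behaviour a rate-independent elastoplastic model cannot reproduce.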
Procedia PDF Downloads 224
777 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on a SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents following the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for performing big data image segmentation. In this work we focus on applying this algorithm to process a big data MRI (Magnetic Resonance Imaging) image of size (m x n). The image is encapsulated in the mobile agent team leader and split into (m x n) pixels, one per AVPE. Each AVPE performs and exchanges its segmentation results and maintains asynchronous communication with the team leader until the convergence of the algorithm. Interesting experimental results are obtained in terms of accuracy and efficiency of the proposed implementation, thanks to the several useful capabilities that mobile agents bring to this distributed computational model.
Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing
Procedia PDF Downloads 428
776 Integrated Teaching of Hardware Courses for the Undergraduates of Computer Science and Engineering to Attain Focused Outcomes
Authors: Namrata D. Hiremath, Mahalaxmi Bhille, P. G. Sunitha Hiremath
Abstract:
Computer systems play an integral role in all facets of the engineering profession. This calls for an understanding of the processor-level components of computer systems, their design and operation, and their impact on the overall performance of the systems. System users are always in need of faster, more powerful, yet cheaper computer systems. The focus of Computer Science and Engineering graduates is inclined towards software; to be an efficient programmer, one needs to understand the role of the underlying hardware architecture. It is essential for students of Computer Science and Engineering to know the basic building blocks of any computing device and how digital principles can be used to build them. Hence two courses, Digital Electronics (3 credits, with an associated lab of 1.5 credits) and Computer Organization (5 credits), were introduced at the sophomore level. An activity was introduced with the objective of teaching hardware concepts to the students through a structured lab. The students were asked to design and implement a component of a computing device using the MultiSim simulation tool and to build the same using hardware components. The experience helped the students to understand real-time applications of SSI and MSI components. The impact of the activity was evaluated and the performance was measured. The paper explains the achievement of ABET outcomes a, c, and k.
Keywords: digital, computer organization, ABET, structured enquiry, course activity
Procedia PDF Downloads 500
775 Design of an Ensemble Learning Behavior Anomaly Detection Framework
Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia
Abstract:
Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts: they are mainly individuals with legitimate access to company information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to counter the threat of malicious user activities effectively. In this paper, we take a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context, and we present test results for several detection methods on a representative access control dataset. Some of the explored classifiers reach up to 99% accuracy.
Keywords: cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing
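The ensemble idea can be illustrated with a minimal majority-vote combiner over toy detectors. The features, thresholds, and sessions below are invented for illustration and are not the paper's model:

```python
# Combine several weak detectors' verdicts on a user session by majority vote.
def majority_vote(verdicts):
    """Return True (anomalous) if more than half of the detectors agree."""
    return sum(verdicts) * 2 > len(verdicts)

# Hypothetical per-session features: (logins_per_hour, bytes_out_mb, off_hours)
detectors = [
    lambda s: s[0] > 30,        # unusually many logins
    lambda s: s[1] > 500,       # large outbound data volume
    lambda s: s[2],             # activity outside working hours
]

session_ok = (5, 20, False)
session_bad = (40, 800, True)
flag_ok = majority_vote([d(session_ok) for d in detectors])
flag_bad = majority_vote([d(session_bad) for d in detectors])
```

In a real framework the base learners would be trained classifiers rather than fixed rules, but the aggregation step is the same.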
Procedia PDF Downloads 128
774 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with the necessary access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain, with much attention given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for the research subjects residing in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research that leverages existing high performance computing resources and analysis techniques currently available or under development. It builds these into The Ark, an open-source web-based system designed to manage medical data.
SPARK provides a next-generation biomedical data management solution based upon a novel micro-service architecture and Big Data technologies. The system serves to demonstrate the applicability of micro-service architectures to the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from infeasibly high execution times for basic operations such as insert (i.e., importing a GWAS dataset) and for the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating the non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
Procedia PDF Downloads 270
773 Iron Removal from Aqueous Solutions by Fabricated Calcite Ooids
Authors: Al-Sayed A. Bakr, W. A. Makled
Abstract:
The low-magnesium calcite ooids precipitated in an assembled softening unit from natural Mediterranean seawater samples were used as adsorbent media in a comparative study with granular activated carbon (GAC) media, in two separate single-media filtration vessels operating in parallel, for the removal of iron from aqueous solutions. In each vessel, the maximum bed capacity to be filled was 13.2 l, and the beds filled in the ooid and GAC vessels were 8.6 and 6.6 l, respectively. The operating conditions applied to the semi-pilot filtration unit were constant pH (7.5), different temperatures (293, 303 and 313 K), different flow rates (20, 30, 40, 50 and 60 l/min), and different initial Fe(II) concentrations (15–105 mg/l); the calculated adsorbent masses were 34.1 and 123 g/l for GAC and calcite ooids, respectively. At the higher temperature (313 K) and higher flow rate (60 l/min), the maximum adsorption capacities for ferrous ions of the GAC and calcite ooid filters were 3.87 and 1.29 mg/g, and at the lower flow rate (20 l/min), the maximum adsorption capacities were 2.21 and 3.95 mg/g, respectively. From the experimental data, Freundlich and Langmuir adsorption isotherms were used to verify the adsorption performance. The calcite ooids could therefore act as a new, highly effective material for iron removal from aqueous solutions.
Keywords: water treatment, calcite ooids, activated carbon, Fe(II) removal, filtration
Procedia PDF Downloads 152
772 An Integrated Fuzzy Inference System and Technique for Order of Preference by Similarity to Ideal Solution Approach for Evaluation of Lean Healthcare Systems
Authors: Aydin M. Torkabadi, Ehsan Pourjavad
Abstract:
A decade after the introduction of Lean in Saskatchewan’s public healthcare system, its effectiveness remains a controversial subject among health researchers, workers, managers, and politicians. Developing a framework to quantitatively assess the Lean achievements is therefore significant. This study investigates the success of initiatives across Saskatchewan health regions by recognizing the Lean healthcare criteria, measuring the success levels, comparing the regions, and identifying the areas for improvement. This study proposes an integrated intelligent computing approach applying a Fuzzy Inference System (FIS) and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). FIS is used as an efficient approach to assess the Lean healthcare criteria, and TOPSIS is applied to rank the values with regard to the level of leanness. Due to the innate uncertainty in decision-maker judgments on the criteria, principles of fuzzy theory are applied. FIS-TOPSIS was thus established as an efficient technique for determining the Lean merit of healthcare systems.
Keywords: lean healthcare, intelligent computing, fuzzy inference system, healthcare evaluation, technique for order of preference by similarity to ideal solution, multi-criteria decision making, MCDM
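The TOPSIS ranking step can be sketched in a few lines: vector-normalize the decision matrix, weight it, and rank alternatives by their relative closeness to the ideal solution. The decision matrix, the weights, and the assumption that all criteria are benefits are illustrative, not the study's Saskatchewan data:

```python
import math

def topsis(matrix, weights):
    """Rank alternatives (rows) over benefit criteria (columns) by closeness to the ideal."""
    ncols = len(matrix[0])
    # Vector normalization per criterion, then weighting.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    V = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) for col in zip(*V)]        # best value per criterion
    anti = [min(col) for col in zip(*V)]         # worst value per criterion
    scores = []
    for row in V:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))   # relative closeness in [0, 1]
    return scores

# Three hypothetical health regions scored on three lean criteria:
scores = topsis([[7, 9, 8], [8, 7, 6], [5, 6, 9]], [0.5, 0.3, 0.2])
best = max(range(3), key=lambda i: scores[i])
```

In the paper, the criterion scores would come from the FIS stage rather than being given directly.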
Procedia PDF Downloads 162
771 Effect of the Cross-Sectional Geometry on Heat Transfer and Particle Motion of Circulating Fluidized Bed Riser for CO2 Capture
Authors: Seungyeong Choi, Namkyu Lee, Dong Il Shim, Young Mun Lee, Yong-Ki Park, Hyung Hee Cho
Abstract:
The effect of the cross-sectional geometry on the heat transfer and particle motion of a circulating fluidized bed riser for CO2 capture was investigated. Numerical simulation using the Eulerian-Eulerian method with the kinetic theory of granular flow was adopted to analyze the gas-solid flow in the circulating fluidized bed riser. Circular, square, and rectangular cross-sectional geometries of the same area were examined, the rectangular cross sections having aspect ratios of 1:2, 1:4, 1:8, and 1:16. The cross-sectional geometry significantly influenced the particle motion and heat transfer. The downward flow pattern of solid particles near the wall was changed, and the gas-solid mixing degree of the riser with the rectangular cross section of the highest aspect ratio was the lowest. There were also differences in the bed-to-wall heat transfer coefficient among the rectangular geometries with different aspect ratios.
Keywords: bed geometry, computational fluid dynamics, circulating fluidized bed riser, heat transfer
Procedia PDF Downloads 260
770 CFD Study on the Effect of Primary Air on Combustion of Simulated MSW Process in the Fixed Bed
Authors: Rui Sun, Tamer M. Ismail, Xiaohan Ren, M. Abd El-Salam
Abstract:
Incineration of municipal solid waste (MSW) is one of the key scopes in the global clean energy strategy. A computational fluid dynamics (CFD) model was established in order to reveal the features of the combustion process in a fixed porous bed of MSW. Transport equations and process rate equations of the waste bed were set up to describe the incineration process, according to the local thermal conditions and waste property characteristics. Gas-phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The heterogeneous reaction rates were determined using Arrhenius eddy dissipation and the Arrhenius-diffusion reaction rates. The effects of the primary air flow rate and temperature on the burning process of simulated MSW were investigated experimentally and numerically. The simulation results for the bed agree well with the experimental data. The model provides detailed information on the burning processes in the fixed bed, which is otherwise very difficult to obtain by conventional experimental techniques.
Keywords: computational fluid dynamics (CFD) model, waste incineration, municipal solid waste (MSW), fixed bed, primary air
Procedia PDF Downloads 402
769 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors
Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal
Abstract:
Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides: computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times, and their durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance
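The GPS-only approach can be sketched as a hysteresis threshold on the speed series: a ride starts when the speed exceeds an upper threshold and ends when it falls below a lower one. The thresholds, sampling interval, and trace below are illustrative assumptions, not the values calibrated in the paper:

```python
def detect_rides(speeds, dt=1.0, v_on=2.5, v_off=1.5, min_len=3):
    """Detect wave rides in a speed series (m/s); returns (start_time, duration) pairs."""
    rides, start = [], None
    for i, v in enumerate(speeds):
        if start is None and v >= v_on:
            start = i                       # ride begins when speed exceeds v_on
        elif start is not None and v < v_off:
            if i - start >= min_len:        # discard spurious short bursts
                rides.append((start * dt, (i - start) * dt))
            start = None
    if start is not None and len(speeds) - start >= min_len:
        rides.append((start * dt, (len(speeds) - start) * dt))
    return rides

# Paddling (~1 m/s) interleaved with two faster wave rides:
trace = [1.0, 1.1, 3.0, 3.5, 3.2, 2.8, 1.0, 0.9, 2.9, 3.1, 3.0, 3.3, 1.2]
rides = detect_rides(trace)
```

The paper's second method would additionally gate these candidates with IMU evidence before accepting them as rides.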
Procedia PDF Downloads 401
768 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 17
767 A Convergent Interacting Particle Method for Computing KPP Front Speeds in Random Flows
Authors: Tan Zhang, Zhongjian Wang, Jack Xin, Zhiwen Zhang
Abstract:
We aim to efficiently compute the spreading speeds of reaction-diffusion-advection (RDA) fronts in divergence-free random flows under the Kolmogorov-Petrovsky-Piskunov (KPP) nonlinearity. We study a stochastic interacting particle method (IPM) for the reduced principal eigenvalue (Lyapunov exponent) problem of an associated linear advection-diffusion operator with spatially random coefficients. The Fourier representation of the random advection field and the Feynman-Kac (FK) formula for the principal eigenvalue (Lyapunov exponent) form the foundation of our method, implemented as a genetic evolution algorithm. The particles undergo advection-diffusion and mutation/selection through a fitness function originating in the FK semigroup. We analyze the convergence of the algorithm based on operator splitting and present numerical results on representative flows such as the 2D cellular flow and the 3D Arnold-Beltrami-Childress (ABC) flow under random perturbations. The 2D examples serve as a consistency check against semi-Lagrangian computation. The 3D results demonstrate that IPM, being mesh-free and self-adaptive, is simple to implement and efficient for computing front spreading speeds in the advection-dominated regime for high-dimensional random flows on unbounded domains where no truncation is needed.
Keywords: KPP front speeds, random flows, Feynman-Kac semigroups, interacting particle method, convergence analysis
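The selection/mutation loop described above (particles weighted by a fitness function from the FK semigroup, resampled, then moved) can be illustrated on a toy discrete problem. The sketch below is a generic interacting-particle estimator for the principal eigenvalue of Q(i, j) = g(i) P(i, j) on a two-state chain; the function name, parameters, and the chain itself are illustrative assumptions, not the paper's RDA computation:

```python
import math
import random

def ipm_eigenvalue(g, P, n_particles=500, steps=300, seed=7):
    """Estimate the principal eigenvalue of Q(i,j) = g(i)*P(i,j) by alternating
    fitness-weighted resampling (selection) with Markov moves (mutation)."""
    rng = random.Random(seed)
    states = [rng.randrange(len(g)) for _ in range(n_particles)]
    log_norms = []
    for _ in range(steps):
        weights = [g[s] for s in states]
        log_norms.append(math.log(sum(weights) / n_particles))  # step normalizer
        # Selection: resample particles proportionally to their fitness g.
        states = rng.choices(states, weights=weights, k=n_particles)
        # Mutation: each particle moves according to transition matrix P.
        states = [rng.choices(range(len(g)), weights=P[s])[0] for s in states]
    # The product of normalizers grows like lambda**steps.
    return math.exp(sum(log_norms) / len(log_norms))

# Consistency check: g = [1, 2] with a uniform 2-state chain gives
# Q = [[0.5, 0.5], [1.0, 1.0]], whose principal eigenvalue is exactly 1.5.
lam = ipm_eigenvalue([1.0, 2.0], [[0.5, 0.5], [0.5, 0.5]])
```

In the paper, the mutation step is an advection-diffusion move in continuous space and the fitness comes from the KPP potential, but the selection/normalization structure is the same.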
Procedia PDF Downloads 46
766 The Adsorption of Perfluorooctanoic Acid on Coconut Shell Activated Carbons
Authors: Premrudee Kanchanapiya, Supachai Songngam, Thanapol Tantisattayakul
Abstract:
Perfluorooctanoic acid (PFOA) is one of the per- and polyfluoroalkyl substances (PFAS) that have increasingly attracted concern due to their global distribution in the environment, persistence, high bioaccumulation, and toxicity. It is important to study effective treatments to remove PFOA from contaminated water. The feasibility of using commercial coconut shell activated carbon produced in Thailand to remove PFOA from water was investigated with regard to the adsorption kinetics and isotherms of powdered activated carbon (PAC-325) and granular activated carbon (GAC-20x50). Adsorption kinetic results show that the adsorbent size significantly affected the adsorption rate of PFOA: GAC-20x50 required at least 100 h to achieve equilibrium, much longer than the 3 h for PAC-325. Two kinetic models were fitted to the experimental data, and the pseudo-second-order model described the adsorption of PFOA on both PAC-325 and GAC-20x50 well. PAC-325 tended to adsorb PFOA faster than GAC-20x50, and testing with the shortest adsorption time (5 min) still yielded substantial PFOA removal (~80% for PAC-325). The adsorption isotherms show that the adsorption capacity of PAC-325 was 0.80 mmol/g, which is 83% higher than that of GAC-20x50 (0.13 mmol/g), according to the Langmuir fitting.
Keywords: perfluorooctanoic acid, PFOA, coconut shell activated carbons, adsorption, water treatment
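The pseudo-second-order fit mentioned above is usually done through its standard linearized form, t/qt = 1/(k2·qe²) + t/qe, so regressing t/qt on t yields slope 1/qe and intercept 1/(k2·qe²). The sketch below recovers qe and k2 from synthetic data generated by the model itself; the numbers are illustrative, not the paper's measurements:

```python
def pseudo_second_order_fit(times, q):
    """Linear least-squares fit of t/q_t versus t; returns (qe, k2)."""
    x = times
    y = [t / qt for t, qt in zip(times, q)]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    qe = 1.0 / slope                 # slope = 1/qe
    k2 = slope ** 2 / intercept      # intercept = 1/(k2*qe**2)
    return qe, k2

# Synthetic uptake curve from the model: q_t = k2*qe^2*t / (1 + k2*qe*t)
qe_true, k2_true = 0.80, 0.05
ts = [5, 10, 20, 40, 80, 160]
qs = [k2_true * qe_true ** 2 * t / (1 + k2_true * qe_true * t) for t in ts]
qe_fit, k2_fit = pseudo_second_order_fit(ts, qs)
```

With real data the fit would not be exact, and the quality of the linearization (R²) is what justifies choosing this model over a pseudo-first-order one.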
Procedia PDF Downloads 143
765 Effective Nutrition Label Use on Smartphones
Authors: Vladimir Kulyukin, Tanwir Zaman, Sarat Kiran Andhavarapu
Abstract:
Research on nutrition label use identifies four factors that impede comprehension and retention of nutrition information by consumers: the label’s location on the package, the presentation of information within the label, the label’s surface size, and surrounding visual clutter. In this paper, a system is presented that makes nutrition label use more effective for nutrition information comprehension and retention. The system’s front end is a smartphone application. The system’s back end is a four-node Linux cluster for image recognition and data storage. Image frames captured on the smartphone are sent to the back end for skewed or aligned barcode recognition. When barcodes are recognized, corresponding nutrition labels are retrieved from a cloud database and presented to the user on the smartphone’s touchscreen. Each displayed nutrition label is positioned centrally on the touchscreen with no surrounding visual clutter. Wikipedia links to important nutrition terms are embedded to improve comprehension and retention of nutrition information. Standard touch gestures (e.g., zoom in/out) available on mainstream smartphones are used to manipulate the label’s surface size. The nutrition label database currently includes 200,000 nutrition labels compiled from public websites by a custom crawler. Stress test experiments with the node cluster are presented. Implications for proactive nutrition management and food policy are discussed.
Keywords: mobile computing, cloud computing, nutrition label use, nutrition management, barcode scanning
Procedia PDF Downloads 373
764 Artificially Intelligent Context Aware Personal Computer Assistant (ACPCA)
Authors: Abdul Mannan Akhtar
Abstract:
In this paper, the novel concept of a self-learning, smart, personalized computer assistant (ACPCA) is established as a context-aware system. Based on user habits, moods, and other routine or situational reactions, the system will manage various services and suggestions at appropriate times, including what schedule to follow, what to watch, what software to use, what should be deleted, etc. The system will utilize a hybrid fuzzy-neural model to predict what the user will do next and support his actions. This will be done by establishing fuzzy sets of user activities, choices, preferences, etc., and utilizing their combinations to predict his moods and immediate preferences. Various applications of context-aware systems exist separately, e.g., on certain websites for music or multimedia suggestions, but a personalized autonomous system that can adapt to a user’s personality does not exist at present. Due to the novelty and breadth of this concept, this paper primarily focuses on the problem statement, product features, and functionality; however, a small mini case is also implemented in MATLAB to demonstrate some aspects of ACPCA. The mini case involves the prediction of user mood, activity, routine, and food preference using a hybrid fuzzy-neural soft computing technique.
Keywords: context aware systems, ACPCA, soft computing techniques, artificial intelligence, fuzzy logic, neural network, mood detection, face detection, activity detection
Procedia PDF Downloads 464
763 Context-Aware Alert Method in Hajj Pilgrim Location-Based Tracking System
Authors: Syarif Hidayat
Abstract:
As millions of people with different backgrounds perform hajj every year in Saudi Arabia, several problems arise. Missing persons are among the crucial problems that need to be addressed. Some people may have insufficient knowledge of how to use the tracking system equipment. Others may become victims of an accident, lose consciousness, or even die, preventing them from performing certain activities. For those reasons, such pilgrims cannot send a proper SOS message. The major contribution of this paper is the application of diverse alert methods in a pilgrim tracking system, offering a simple yet robust solution for pilgrims to send an SOS message during Hajj. Knowledge of context-aware computing is assumed herein. This study presents four methods that can be utilized by pilgrims to send an SOS. The first method is a simple mobile application containing only a button. The second method is based on behavior analysis of GPS location movement anomalies. The third method introduces a pressing pattern on the smartwatch physical button as a panic button. The fourth method identifies certain accelerometer patterns as signs of emergency situations. The methods presented in this paper would be an important part of a pilgrim tracking system. The discussion provided here includes an easy-to-use design while maintaining tracking accuracy, privacy, and the security of its users.
Keywords: context aware computing, emergency alert system, GPS, hajj pilgrim tracking, location-based services
Procedia PDF Downloads 216
762 Educational Knowledge Transfer in Indigenous Mexican Areas Using Cloud Computing
Authors: L. R. Valencia Pérez, J. M. Peña Aguilar, A. Lamadrid Álvarez, A. Pastrana Palma, H. F. Valencia Pérez, M. Vivanco Vargas
Abstract:
This work proposes a cooperative-competitive (coopetitive) approach that allows coordinated work among the Secretary of Public Education (SEP), the Autonomous University of Querétaro (UAQ), and government funds from the National Council for Science and Technology (CONACYT) or other international organizations, in order to work on an overall knowledge transfer strategy with e-learning over the Cloud. Experts in junior high and high school education, working in multidisciplinary teams, perform analysis, evaluation, design, production, validation, and knowledge transfer at large scale using a Cloud Computing platform, allowing teachers and students to have all the information required to ensure nationally homologated knowledge of topics such as mathematics, statistics, chemistry, history, ethics, civics, etc. This work will start with a pilot test in Spanish and initially in two regional languages, Otomí and Náhuatl. Otomí has more than 285,000 indigenous speakers in Querétaro and Mexico’s central region. Náhuatl is the most widely spoken indigenous language in Mexico, with more than 1,550,000 speakers. Phase one of the project takes into account negotiations with indigenous tribes from different regions, and the information and communication technologies needed to deliver the knowledge to the indigenous schools in their native language.
The methodology includes the following main milestones: identification of the indigenous areas where Otomí and Náhuatl are spoken, research with the SEP on the location of actual indigenous schools, analysis and inventory of current school conditions, negotiation with tribe chiefs, analysis of the technological communication requirements to reach the indigenous communities, identification and inventory of local teachers’ technology knowledge, selection of a pilot topic, analysis of actual student competence under the traditional education system, identification of local translators, design of the e-learning platform, design of the multimedia resources and storage strategy for Cloud Computing, translation of the topic into both languages, indigenous teacher training, pilot test, course release, project follow-up, analysis of student requirements for the new technological platform, and definition of a new and improved proposal with greater reach in topics and regions. Phase one of the project is important in multiple ways: it proposes a working technological scheme and focuses on the cultural impact in Mexico, so that indigenous tribes can improve their knowledge about new forms of crop improvement, home storage technologies, proven home remedies for common diseases, and ways of preparing foods containing major nutrients; it discloses the strengths and weaknesses of each region; and it opens communication spaces for inter-indigenous cultural exchange through cloud computing platforms offering regional products.
Keywords: Mexican indigenous tribes, education, knowledge transfer, cloud computing, Otomí, Náhuatl, language
Procedia PDF Downloads 404
761 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs
Authors: Taysir Soliman
Abstract:
One of the currently popular clustering techniques is Spectral Clustering (SC), because of its advantages over conventional approaches such as hierarchical clustering and k-means, as well as other techniques. However, one disadvantage of SC is that it is time-consuming, because it requires computing eigenvectors. To overcome this disadvantage, a number of approaches have been proposed, such as the Power Iteration Clustering (PIC) technique, a variant of SC. Among the advantages of PIC are: 1) scalability and efficiency, 2) finding a single pseudo-eigenvector instead of computing the full set of eigenvectors, and 3) obtaining a linear combination of the eigenvectors in linear time. Its main disadvantage, however, is an inter-class collision problem, because a single pseudo-eigenvector is not always enough to separate the classes. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome the inter-class collision problem of PIC while retaining its efficiency. In this paper, we develop Parallel DPIC (PDPIC) to improve the time and memory complexity; it runs on the Apache Spark framework using sparse matrices. To evaluate the performance of PDPIC, we compared it to the SC, ESCG, and ESCALG algorithms on four small and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and shorter running times than the compared algorithms.
Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache Spark, large graph
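The core of PIC, truncated power iteration on a row-normalized affinity matrix followed by 1-D clustering of the resulting pseudo-eigenvector, can be sketched briefly. The following is an illustrative sketch only: the function names, stopping rule, and the simple 1-D k-means step are our assumptions, not the authors' PDPIC implementation (which runs on Apache Spark with sparse matrices).

```python
import numpy as np

def power_iteration_clustering(A, k, n_iter=50, tol=1e-6):
    """Cluster the n nodes of affinity matrix A into k groups using one
    pseudo-eigenvector obtained by early-stopped power iteration."""
    n = A.shape[0]
    W = A / A.sum(axis=1, keepdims=True)   # row-normalized affinity
    rng = np.random.default_rng(0)
    v = rng.random(n)
    v /= np.abs(v).sum()                   # L1-normalized start vector
    prev_delta = np.inf
    for _ in range(n_iter):
        v_new = W @ v
        v_new /= np.abs(v_new).sum()
        delta = np.abs(v_new - v).max()
        v = v_new
        # stop early once the per-step change stabilizes: the vector is
        # then nearly constant within clusters but not yet globally flat
        if abs(prev_delta - delta) < tol:
            break
        prev_delta = delta
    # simple 1-D k-means on the entries of the pseudo-eigenvector
    centers = np.quantile(v, np.linspace(0.1, 0.9, k))
    for _ in range(20):
        labels = np.argmin(np.abs(v[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = v[labels == j].mean()
    return labels
```

On an affinity matrix with clear block structure, the entries of the truncated iterate separate by block, so the 1-D k-means recovers the clusters.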
Procedia PDF Downloads 189
760 Solar-Powered Adsorption Cooling System: A Case Study on the Climatic Conditions of Al Minya
Authors: El-Sadek H. Nour El-deen, K. Harby
Abstract:
Energy saving and environmentally friendly applications are becoming among the most important topics nowadays. In this work, a simulation analysis using the TRNSYS software has been carried out to study the benefit of employing a solar adsorption cooling system under the climatic conditions of Al-Minya city, Egypt. A theoretical model was developed for a two-bed adsorption cooling system employing granular activated carbon/HFC-404A as the working pair. The temporal and averaged histories of the solar collector, adsorbent beds, evaporator, and condenser are presented. System performance in terms of daily average cooling capacity and average coefficient of performance (COP) throughout the year has been investigated. The results showed that the maximum yearly average COP and cooling capacity are about 0.26 and 8 kW, respectively. The maxima of both the average cooling capacity and the cyclic COP are directly proportional to the maximum solar radiation, and the system performance was found to increase with the average ambient temperature. Finally, the proposed solar-powered adsorption cooling system can be used effectively under Al-Minya climatic conditions.
Keywords: adsorption, cooling, Egypt, environment, solar energy
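For readers unfamiliar with the two performance measures reported here: cooling capacity follows from the chilled-water flow and its temperature drop across the evaporator, and COP is the ratio of useful cooling to driving heat. The helper functions and numbers below are an illustrative sketch, not taken from the paper.

```python
def evaporator_cooling_capacity(m_dot, t_in, t_out, cp=4.186):
    """Cooling capacity in kW from the chilled-water mass flow (kg/s)
    and its temperature drop across the evaporator (deg C);
    cp is the specific heat of water in kJ/(kg K)."""
    return m_dot * cp * (t_in - t_out)

def average_cop(q_cooling, q_heat):
    """Coefficient of performance: useful cooling delivered per unit of
    driving (solar) heat supplied to the adsorbent beds."""
    return q_cooling / q_heat
```

With the abstract's figures, a cooling capacity of about 8 kW at a COP of 0.26 would imply roughly 31 kW of driving solar heat.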
Procedia PDF Downloads 160
759 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management
Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang
Abstract:
Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the data's magnitude and complexity. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.
Keywords: construction supply chain management, BIM, data exchange, artificial intelligence
Procedia PDF Downloads 26
758 Metal-Organic Chemical Vapor Deposition (MOCVD) Process Investigation for Co Thin Film as a TSV Alternative Seed Layer
Authors: Sajjad Esmaeili, Robert Krause, Lukas Gerlich, Alireza Mohammadian Kia, Benjamin Uhlig
Abstract:
This investigation aims to develop feasible, well-qualified process parameters for fabricating thin films inside ultra-large through-silicon vias (TSVs) used as vertical interconnections. The focus of the study is on TSV metallization and its challenges when employing new materials for rapid signal propagation in microsystems technology. The cobalt metal-organic chemical vapor deposition (Co-MOCVD) process enables the manufacture of an adhesive, highly conformal ultra-thin film all the way through the TSVs, in contrast to the non-conformal physical vapor deposition (PVD) process conventionally used for the copper (Cu) seed layer. This process therefore provides a Cu-seed-free layer on which Cu can be electrochemically deposited directly. The main challenge of this metallization module is to achieve a suitable alternative seed layer with low roughness, low sheet resistance, and little granular organic contamination (e.g. carbon), which would otherwise intensify Co corrosion under the influence of the Cu electrolyte.
Keywords: cobalt MOCVD, direct Cu electrochemical deposition (ECD), metallization technology, through-silicon via (TSV)
Procedia PDF Downloads 157
757 Effect of Bentonite on Shear Strength of Bushehr Calcareous Sand
Authors: Arash Poordana, Reza Ziaie Moayed
Abstract:
Calcareous sands are found most commonly in areas adjacent to crude oil and gas deposits, particularly near water. These soils have high compressibility due to high inter-granular porosity, irregularity, fragility, and, above all, particle crushing, and experience has shown that their behavior under loading is not similar to that of silica sand. Since the destructive environmental effects of cement are well known, alternatives such as bentonite are attractive. Bentonite has long been used commercially in civil engineering projects; owing to its low hydraulic conductivity, it is used in landfills, cut-off walls, and nuclear waste containment. In the present study, unconfined compression tests were performed at five ageing periods (1, 3, 7, 14, and 28 days) on Bushehr calcareous sand mixed with different percentages of bentonite (5%, 7.5%, and 10%). The relative density of the specimens was 50%, and the corresponding optimum water content (19%, 18.5%, and 17.5%, respectively) was added to each mixture. Specimens were prepared by wet tamping and compacted in five layers. The results show that as the bentonite content increases, the unconfined compression strength of the soil increases; the 3-day and 7-day ageing periods showed 30% and 50% increases in the shear strength of the soil, respectively.
Keywords: unconfined compression test, bentonite, Bushehr, calcareous sand
Procedia PDF Downloads 129
756 Experimental Investigation on Utility and Suitability of Lateritic Soil as a Pavement Material
Authors: J. Hemanth, B. G. Shivaprakash, S. V. Dinesh
Abstract:
The lateritic soil locally available in the Dakshina Kannada and Udupi districts has traditionally been used as building blocks for construction, but it meets neither the conventional requirements (LL ≤ 25% and PI ≤ 6%) nor the four-day soaked CBR value required of a sub-base course material in pavements. To improve its properties so that it satisfies the Atterberg limits, the soil was blended with sand, cement, and quarry dust at various percentages; to meet the CBR strength requirements, individual and combined gradation of various sized aggregates, lateritic soil, and other filler materials was carried out for coarse-graded granular sub-base materials (Grading II and Grading III). The effect of the additives blended with lateritic soil and aggregates was studied in terms of Atterberg limits, compaction, California Bearing Ratio (CBR), and permeability. The addition of sand, cement, and quarry dust was found to be effective in improving the Atterberg limits, CBR values, and permeability. The CBR and permeability values obtained for the Grading III and Grading II materials are sufficient for use as a sub-base course for low-volume and high-volume roads, respectively.
Keywords: lateritic soil, sand, quarry dust, gradation, sub-base course, permeability
Procedia PDF Downloads 318
755 Effect of Silt Presence on Shear Strength Parameters of Unsaturated Sandy Soils
Authors: R. Ziaie Moayed, E. Khavaninzadeh, M. Ghorbani Tochaee
Abstract:
The direct shear test is widely used in soil mechanics to determine the shear strength parameters of granular soils. For the analysis of soil stability problems such as bearing capacity, slope stability, and lateral pressure on retaining structures, the shear strength parameters must be well known. In the present study, shear strength parameters are determined for silty-sand mixtures. Direct shear tests were performed on Firoozkooh 161 sand with different silt contents at a relative density of 70% and at three vertical stresses of 100, 150, and 200 kPa. Wet tamping was used for sample preparation, and the results include diagrams of shear stress versus shear deformation and of sample height change versus shear deformation. From these, the shear strength parameters of the soil, such as the internal friction angle and the dilation angle, are calculated and compared for the different silt contents. According to the results, when the sample contains up to 10% silt, the peak shear strength and internal friction angle show an upward trend; however, when the silt content increases from 10% to 50%, a downward trend is seen in both.
Keywords: shear strength parameters, direct shear test, silty sand, shear stress, shear deformation
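The internal friction angle reported from such tests is typically obtained by fitting the Mohr-Coulomb envelope tau = c + sigma * tan(phi) to the peak shear stresses measured at the three vertical stresses. A minimal sketch of that fit follows; the function name and the sample numbers in the usage are ours, not the authors'.

```python
import math

def mohr_coulomb_fit(normal_stresses, peak_shear_stresses):
    """Least-squares fit of tau = c + sigma * tan(phi) to direct shear
    results; returns cohesion c (same units as stress) and phi (deg)."""
    n = len(normal_stresses)
    mx = sum(normal_stresses) / n
    my = sum(peak_shear_stresses) / n
    # slope of the best-fit line is tan(phi); intercept is the cohesion
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(normal_stresses, peak_shear_stresses))
    sxx = sum((x - mx) ** 2 for x in normal_stresses)
    slope = sxy / sxx
    c = my - slope * mx
    return c, math.degrees(math.atan(slope))
```

For a cohesionless sand the fitted intercept should be near zero, and the fitted angle recovers the friction angle implied by the three peak stresses.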
Procedia PDF Downloads 163
754 Numerical Analysis of Geosynthetic-Encased Stone Columns under Lateral Loads
Authors: R. Ziaie Moayed, M. Hossein Zade
Abstract:
Of all ground improvement methods, the stone column has become particularly popular owing to its simple construction and low cost. Installation of stone columns, especially in loose fine-graded soil, increases the load-bearing capacity and reduces settlement. Encased granular stone columns (EGCs) are commonly subjected to vertical load; however, they may also be subjected to a significant amount of shear loading. In this study, three-dimensional finite element (FE) analyses were conducted to estimate the shear load capacity of EGCs in sandy soil. Two cases, an ordinary stone column and a geosynthetic-encased stone column, were studied at normal pressures varying from 15 kPa to 75 kPa, and the effect of column diameter was considered in both cases. Close agreement is observed between the experimental and numerical shear stress versus horizontal displacement trend lines. The results show that increasing the normal pressure and the diameter of the stone column mobilizes higher shear strength in the soil; in the case of the encased stone column, increasing the diameter has the more dominant effect on the mobilized shear strength.
Keywords: encased stone column, lateral load, ordinary stone column, validation
Procedia PDF Downloads 369
753 Rice Bran Material Enrichment of Granulated Cane Brown Sugar to Increase Policosanol Contents
Authors: Monthana Weerawatanakorn, Hajime Tamaki, Yonathan Asikin, Koji Wada, Makoto Takahashi, Chi-Tang Ho, Min-Hsiung Pan
Abstract:
Rice bran and sugarcane are significant sources of wax containing policosanol (PC), a cholesterol-lowering nutraceutical available in the market. The processing of rice bran oil loses PC into various waste products; we therefore hypothesize that defatted rice bran (DRB), an agricultural waste product, and rice bran oil (RBO) retain varying but significant amounts of PC wax. Non-centrifugal cane sugar (NCS), or cane brown sugar, is consumed worldwide and possesses various health benefits, but since PC wax resides mainly in the outer rind of the cane, the PC content of the granulated sugar is reduced by the peeling step. This study aimed to increase the PC content of granular brown sugar by adding wax extracted from DRB and RBO, and to investigate the toxicity of the developed products. The results showed that the total PC content, including long-chain aldehydes, of the products increased to maximum levels of 147.97 mg/100 g with extracted wax addition and 40.14 mg/100 g with rice bran oil addition. The PC content of RBO was found to be 96.93 mg/100 g, and DRB is a promising source of policosanol (6,044.7 mg/100 g). A 28-day toxicity evaluation of the developed sugar revealed no adverse effects on the liver, spleen, or kidney.
Keywords: enrichment, sugarcane, policosanol, defatted rice bran, wax
Procedia PDF Downloads 371
752 An Adiabatic Quantum Optimization Approach for the Mixed Integer Nonlinear Programming Problem
Authors: Maxwell Henderson, Tristan Cook, Justin Chan Jin Le, Mark Hodson, YoungJung Chang, John Novak, Daniel Padilha, Nishan Kulatilaka, Ansu Bagchi, Sanjoy Ray, John Kelly
Abstract:
We present a method of using adiabatic quantum optimization (AQO) to solve a mixed integer nonlinear programming (MINLP) problem instance. The MINLP problem is a general form of a set of NP-hard optimization problems that are critical to many business applications. It requires optimizing a set of discrete and continuous variables with nonlinear and potentially nonconvex constraints. Obtaining an exact, optimal solution for MINLP problem instances of non-trivial size using classical computation methods is currently intractable. Current leading algorithms leverage heuristic and divide-and-conquer methods to determine approximate solutions. Creating more accurate and efficient algorithms is an active area of research. Quantum computing (QC) has several theoretical benefits compared to classical computing, through which QC algorithms could obtain MINLP solutions that are superior to current algorithms. AQO is a particular form of QC that could offer more near-term benefits compared to other forms of QC, as hardware development is in a more mature state and devices are currently commercially available from D-Wave Systems Inc. It is also designed for optimization problems: it uses an effect called quantum tunneling to explore all lowest points of an energy landscape where classical approaches could become stuck in local minima. Our work used a novel algorithm formulated for AQO to solve a special type of MINLP problem. The research focused on determining: 1) if the problem is possible to solve using AQO, 2) if it can be solved by current hardware, 3) what the currently achievable performance is, 4) what the performance will be on projected future hardware, and 5) when AQO is likely to provide a benefit over classical computing methods. Two different methods, integer range and 1-hot encoding, were investigated for transforming the MINLP problem instance constraints into a mathematical structure that can be embedded directly onto the current D-Wave architecture. 
For testing and validation, a D-Wave 2X device was used, as well as QxBranch's QxLib software library, which includes a QC simulator based on simulated annealing. Our results indicate that it is mathematically possible to formulate the MINLP problem for AQO, but that currently available hardware is unable to solve problems of useful size. Classical general-purpose simulated annealing is currently able to solve larger problem sizes, but does not scale well, and such methods would likely be outperformed in the future by improved AQO hardware with higher qubit connectivity and lower temperatures. If larger AQO devices show improvements that trend in this direction, commercially viable solutions to the MINLP for particular applications could be implemented on hardware projected to be available in 5-10 years. Continued investigation into optimal AQO hardware architectures and novel methods for embedding MINLP problem constraints onto those architectures is needed to realize those commercial benefits.
Keywords: adiabatic quantum optimization, mixed integer nonlinear programming, quantum computing, NP-hard
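As an aside on the 1-hot encoding mentioned above: a discrete choice among n values can be mapped to a QUBO by introducing n binary variables and penalizing any configuration in which the number of set bits differs from one. The sketch below is a generic illustration of that construction, not the authors' formulation; the penalty weight and helper names are assumptions.

```python
import itertools

def one_hot_qubo(values, penalty=10.0):
    """QUBO for 'pick exactly one of these values': minimize
    sum(values[i] * q_i) + penalty * (sum(q_i) - 1)**2.
    Expanding the penalty (using q_i**2 == q_i) gives diagonal terms
    values[i] - penalty and off-diagonal terms 2 * penalty."""
    n = len(values)
    Q = {}
    for i in range(n):
        Q[(i, i)] = values[i] - penalty
        for j in range(i + 1, n):
            Q[(i, j)] = 2.0 * penalty
    return Q

def qubo_energy(Q, q):
    """Energy of bit vector q under QUBO Q (constant term dropped)."""
    return sum(w * q[i] * q[j] for (i, j), w in Q.items())

# exhaustive check: the minimum-energy state is the valid 1-hot choice
# of the smallest value, provided the penalty exceeds every value
values = [3.0, 1.0, 2.0]
Q = one_hot_qubo(values)
best = min(itertools.product([0, 1], repeat=len(values)),
           key=lambda q: qubo_energy(Q, q))
```

The same pattern (integer-range encoding uses binary weights instead of one bit per value) is how discrete MINLP variables are brought into the quadratic, binary form the annealer's hardware graph accepts.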
Procedia PDF Downloads 525
751 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios using Monte Carlo Methods
Authors: Vinayak Bassi, Rajpreet Singh
Abstract:
Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational methods, and asset valuation has become one of the key components of quantitative finance. In fact, it is one of the first steps in determining the risk of a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation outputs generated by two stochastic dynamic models, the Black-Scholes model and Dupire's bi-dimensional (local volatility) model, both formulated to compute the valuation function for a portfolio of European options using Monte Carlo simulation. Although Monte Carlo algorithms have a slower convergence rate than calculus-based techniques (like FDM), they work effectively for high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. To enhance the performance of the models, the study emphasizes variance reduction methods and customized random number generators to implement parallelization. Dupire's model was further implemented on a GPU to achieve higher computational performance, and ideas are discussed around performance enhancement and bottleneck identification for options-pricing models on GPUs.
Keywords: Monte Carlo, stochastic models, computational finance, parallel programming, scientific computing
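A minimal sketch of the kind of Monte Carlo valuation compared in this study: a single European call under Black-Scholes dynamics, priced with antithetic variates as a simple variance-reduction method. The parameters and function name are illustrative assumptions, not the authors' implementation, which targets option portfolios and GPU parallelization.

```python
import math
import random

def bs_call_price_mc(s0, strike, r, sigma, t, n_pairs=100_000, seed=42):
    """Monte Carlo price of a European call under Black-Scholes
    dynamics, using antithetic variates (each standard-normal draw z
    is paired with -z) as a simple variance-reduction method."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    total = 0.0
    for _ in range(n_pairs):
        z = rng.gauss(0.0, 1.0)
        for zz in (z, -z):                     # antithetic pair
            s_t = s0 * math.exp(drift + vol * zz)
            total += max(s_t - strike, 0.0)
    return math.exp(-r * t) * total / (2 * n_pairs)
```

For s0 = strike = 100, r = 5%, sigma = 20%, and t = 1 year, the closed-form Black-Scholes value is about 10.45, which the estimate should approach as the number of pairs grows; a portfolio version sums such payoffs across instruments sharing the simulated paths.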
Procedia PDF Downloads 161
750 Enabling Cloud Adoption Based Secured Mobile Banking through Backend as a Service
Authors: P. S. Jagadeesh Kumar, S. Meenakshi Sundaram
Abstract:
With increasing non-traditional competition, mobile banking faces an ever-changing commercial backdrop. Customer demands have become more intricate as customers request more convenience and control over their banking services. To drive advancement and modernization in mobile banking applications, it is increasingly necessary to go beyond incremental change through business model transformation, and the dramatic changes taking place in mobile banking call for new ways of ensuring security. By reforming and transforming older back-office systems into integrated mobile banking applications, banks can create a flexible and agile banking environment that can rapidly respond to new business requirements through cloud computing. Cloud computing is transforming ecosystems in numerous industries, and mobile banking is no exception: it provides service innovation, greater flexibility to respond to security needs, and enhanced business intelligence at lower cost. Cloud technology offers secure deployment options that can support banks in developing new customer experiences, enable effective relationships, and increase the speed of efficient banking transactions. Cloud adoption is escalating quickly, since commercial mobile banking transactions can be secured through Backend as a Service by scrutinizing the security strategies of the cloud service provider along with the history of transaction details and the provider's security-related practices.
Keywords: cloud adoption, backend as a service, business intelligence, secured mobile banking
Procedia PDF Downloads 254
749 Discrete Element Method Simulation of Crushable Pumice Sand
Authors: Sayed Hessam Bahmani, Rolando P. Orense
Abstract:
From an engineering point of view, pumice particles are problematic because of their crushability and compressibility, which result from their vesicular nature. Currently, information on the geotechnical characteristics of pumice sands is limited. While extensive empirical and laboratory tests can be used to characterize their behavior, these are generally time-consuming and expensive. These drawbacks have motivated attempts to study the effects of particle breakage in pumice sand through the Discrete Element Method (DEM), which provides insight into the behavior of crushable granular material at both the micro and macro level. In this paper, the results of single-particle crushing tests conducted in the laboratory are simulated with DEM using the open-source code YADE. This is done to better understand the parameters necessary to represent the pumice microstructure that governs its crushing features, and to examine how the resulting microstructure evolution affects a particle's properties. The DEM particle model is then used to simulate the behavior of pumice sand in consolidated drained triaxial tests. The results indicate the importance of incorporating particle porosity and the unique surface texture in the material characterization, and show that interlocking between the crushed particles significantly influences the drained behavior of the pumice specimen.
Keywords: pumice sand, triaxial compression, simulation, particle breakage
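One common way to introduce crushability into a DEM model is to assign each particle a size-dependent crushing strength, for example Weibull-distributed with the usual size effect, and to break the particle when its induced characteristic stress F/d² exceeds that strength. The sketch below illustrates this generic criterion only; the parameter values and function names are our assumptions, not the calibration used in the paper's YADE model.

```python
import math

def weibull_crush_strength(d, u, d0=1.0, sigma0=10.0, m=3.0):
    """Sample a crushing strength (MPa) for a particle of diameter
    d (mm) from a Weibull distribution whose scale carries the usual
    size effect, scale = sigma0 * (d / d0) ** (-3 / m); u in (0, 1)
    is a uniform random number, d0/sigma0/m are material parameters."""
    scale = sigma0 * (d / d0) ** (-3.0 / m)
    return scale * (-math.log(1.0 - u)) ** (1.0 / m)

def particle_breaks(force, d, strength):
    """Break the particle when its induced characteristic stress
    force / d**2 (N/mm**2 == MPa) reaches the sampled strength."""
    return force / d ** 2 >= strength
```

The size effect makes smaller fragments statistically stronger, which is what lets repeated breakage reproduce the gradual evolution of the grading observed in single-particle crushing tests.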
Procedia PDF Downloads 245