Search results for: computing experiment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3605

3335 Condensation of Moist Air in Heat Exchanger Using CFD

Authors: Jan Barak, Karel Frana, Joerg Stiller

Abstract:

This work presents the results of moist air condensation in a heat exchanger. It describes the theoretical background and the definition of moist air. A model with the geometry of a square channel was created for better understanding and post-processing of the condensation phenomena. Different approaches were examined on this model to find a suitable software package and model. The knowledge obtained was applied to the geometry of a real heat exchanger, and experimental results were compared with the numerical results. One of the goals is to solve this problem without creating any user-defined functions in the applied code. The paper also contains a summary of the findings and an outlook for future work.

Keywords: condensation, exchanger, experiment, validation

Procedia PDF Downloads 376
3334 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms designed for traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing based on R-trees and the domain-range entropy is proposed to improve fault tolerance and load balancing by improving connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 459
3333 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption

Authors: Koyangbo Guere Monguia Michel Alex Emmanuel

Abstract:

In e-government, and especially in recruitment, much research has been conducted to build a trustworthy and reliable online or application system capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management, and public key encryption have been used to manage domains, to implement an access control and authorization mechanism, and to secure data exchange between entities for a reliable file-processing procedure.

Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization

Procedia PDF Downloads 330
3332 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core, the characteristics and results of which reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 336
3331 Fuzzy Set Approach to Study Appositives and Its Impact Due to Positional Alterations

Authors: E. Mike Dison, T. Pathinathan

Abstract:

Computing with Words (CWW) and Possibilistic Relational Universal Fuzzy (PRUF) are two concepts widely used to represent and measure vaguely defined natural language phenomena. In this paper, we study how the positional alteration of phrases affects and/or modifies the impact of a natural language proposition. We observe the gradations due to the sensitivity/feeling of a statement towards positional alterations. We derive the classification and modification of the meaning of words due to positional alteration. We present the results with reference to set-theoretic interpretations.

Keywords: appositive, computing with words, possibilistic relational universal fuzzy (PRUF), semantic sentiment analysis, set-theoretic interpretations

Procedia PDF Downloads 130
3330 Survey of Access Controls in Cloud Computing

Authors: Monirah Alkathiry, Hanan Aljarwan

Abstract:

Cloud computing is one of the most significant technologies that the world deals with, in different sectors, with different purposes and capabilities. The cloud faces various challenges in securing data from unauthorized access or modification. Consequently, security risks and levels have greatly increased. Therefore, cloud service providers (CSPs) and users need secure mechanisms that ensure that data are kept secret and safe from any disclosure or exploit. For this reason, CSPs need a number of techniques and technologies to manage and secure access to cloud services in order to achieve security goals such as confidentiality, integrity, and identity and access management (IAM). Therefore, this paper reviews and explores various access controls implemented in a cloud environment that achieve different security purposes. The methodology followed in this survey was to conduct an assessment, evaluation, and comparison of those access control mechanisms and technologies based on different factors, such as the security goals achieved, usability, and cost-effectiveness. This assessment showed that the technology used in an access control mechanism affects the security goals it achieves, and that no single access control method achieves all security goals. Consequently, such a comparison should help decision-makers properly choose the access controls that meet their requirements.

Keywords: access controls, cloud computing, confidentiality, identity and access management

Procedia PDF Downloads 108
3329 Building a Hierarchical, Granular Knowledge Cube

Authors: Alexander Denzler, Marcel Wehrle, Andreas Meier

Abstract:

A knowledge base stores facts and rules about the world that applications can use for the purpose of reasoning. By applying the concept of granular computing to a knowledge base, several advantages emerge. These can be harnessed by applications to improve their capabilities and performance. In this paper, the concept behind such a construct, called a granular knowledge cube, is defined, and its intended use as an instrument that manages to cope with different data types and detect knowledge domains is elaborated. Furthermore, the underlying architecture, consisting of the three layers of the storing, representing, and structuring of knowledge, is described. Finally, benefits as well as challenges of deploying it are listed alongside application types that could profit from having such an enhanced knowledge base.

Keywords: granular computing, granular knowledge, hierarchical structuring, knowledge bases

Procedia PDF Downloads 470
3328 Prediction of Fire Growth of the Office by Real-Scale Fire Experiment

Authors: Kweon Oh-Sang, Kim Heung-Youl

Abstract:

Estimating the engineering properties of fires is important for preparing for the complex and varied fire risks of large-scale structures such as super-tall buildings, large stadiums, and multi-purpose structures. In this study, a mock-up of a compartment measuring 2.4 (L) x 3.6 (W) x 2.4 (H) m was fabricated at the 10 MW LSC (Large Scale Calorimeter), and combustible office supplies were placed in the compartment for a real-scale fire test. The maximum heat release rate was 4.1 MW, and the total energy release obtained through the application of the t² fire growth rate was 6705.9 MJ.
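
The t² fire growth model used here relates heat release rate (HRR) to time as Q(t) = αt² until the peak HRR is reached. A minimal sketch of that curve and its energy integral, assuming the standard "fast" growth coefficient for illustration (the coefficient fitted in this study is not given in the abstract):

```python
def t2_fire_curve(alpha, q_peak, t_end, dt=1.0):
    """Return (times, hrr) lists for a t^2 growth capped at q_peak (MW)."""
    times, hrr = [], []
    t = 0.0
    while t <= t_end:
        times.append(t)
        hrr.append(min(alpha * t * t, q_peak))  # Q(t) = alpha * t^2, capped
        t += dt
    return times, hrr

# Standard "fast" t^2 growth coefficient, 0.0469 kW/s^2, in MW/s^2.
ALPHA_FAST = 4.69e-5

times, hrr = t2_fire_curve(ALPHA_FAST, q_peak=4.1, t_end=1800.0)
# Rectangle-rule integral of HRR (MW) over time steps (s) gives energy in MJ.
total_energy_mj = sum(q * 1.0 for q in hrr)
```

With these illustrative parameters, a 30-minute burn capped at the measured 4.1 MW peak yields a total energy of the same order of magnitude as the 6705.9 MJ reported.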

Keywords: fire growth, fire experiment, t² curve, large scale calorimeter

Procedia PDF Downloads 308
3327 Proposed Anticipating Learning Classifier System for Cloud Intrusion Detection (ALCS-CID)

Authors: Wafa' Slaibi Alsharafat

Abstract:

Cloud computing is a modern approach in network environments. With the increasing number of network users and online systems, there is a need to protect these systems from unauthorized resource access and to detect any attempts at privacy contravention. For that purpose, an Intrusion Detection System is an effective security mechanism to detect any attempted attacks on cloud resources and their information. In this paper, a Cloud Intrusion Detection System has been proposed with the aim of reducing or eliminating such attacks. The model focuses on achieving a high detection rate, verified by conducting a set of experiments using the benchmark dataset KDD'99.

Keywords: IDS, cloud computing, anticipating classifier system, intrusion detection

Procedia PDF Downloads 448
3326 [Keynote Speech]: Risk Management during the Rendition Process: Use of Screen-Voice Recordings in Translator Training

Authors: Maggie Hui

Abstract:

Risk management is not a new concept; however, it is an uncharted area as applied to the translation process and translator training. Serving as one of the self-discovery activities in their practicum course, a two-cycle experiment was carried out with a class of 13 MA translation students to explore their risk management while translating in a simulated setting that involves translator-client relations. To test the effects of the main variable of translators' interaction with the simulated clients, the researcher employed control-group translators and two experiment groups (with Group A being the translator in Cycle 1 and the client in Cycle 2, and Group B being the client in Cycle 1 and the translator in Cycle 2). Experiment Cycle 1 aims to explore whether there would be any behavioral difference in risk management between translators who interacted with the simulated clients, i.e. experiment group A, and their counterparts without such interaction, i.e. the control group. The design of Cycle 2 concerns the order of playing the different roles of translator and client in the experiment, and provides information to compare the behavior of translators in the two experiment groups. Since this is process-oriented research, it is necessary to hypothesize what was happening in the translators' minds. The researcher made use of a user-friendly screen-voice recording freeware to record subjects' screen activities, including every word the translator typed and every change they made to the rendition, the websites they browsed and the reference tools they used, in addition to the verbalization of their thoughts throughout the process. The research observes the translation procedures subjects considered and finally adopted, and looks into the justifications for their procedures, in order to interpret their risk management.
The qualitative and quantitative results of this study have some implications for translator training: (a) the experience of being a client seems to reinforce the translator’s risk aversion; (b) the use of role-playing simulation can empower students’ learning by enhancing their attitudinal or psycho-physiological competence, interpersonal competence and strategic competence; and (c) the screen-voice recordings serve as a helpful tool for learners to reflect on their rendition processes, i.e. what they performed satisfactorily and unsatisfactorily while translating and what they could do for improvement in future translation tasks.

Keywords: risk management, screen-voice recordings, simulated translator-client relations, translation pedagogy, translation process-oriented research

Procedia PDF Downloads 245
3325 Effects of Rumen Protozoa and Nitrate on Fermentation and Methane Production

Authors: S. H. Nguyen, L. Li, R. S. Hegarty

Abstract:

Two experiments were conducted assessing the effects of presence or absence of rumen protozoa and dietary nitrate addition on rumen fermentation characteristics and methane production in Brahman heifers. The first experiment assessed changes in rumen fermentation pattern and in-vitro methane production post-refaunation and the second experiment investigated whether addition of nitrate to the incubation would give rise to methane mitigation additional to that contributed by defaunation. Ten Brahman heifers were progressively adapted to a diet containing coconut oil distillate 4.5% (COD) for 18 d and then all heifers were defaunated using sodium 1-(2-sulfonatooxyethoxy) dodecane (Empicol). After 15 d, the heifers were given a second dose of Empicol. Fifteen days after the second dosing, all heifers were allocated to defaunated or refaunated groups by stratified randomisation. On d 48, an oral dose of rumen fluid collected from unrelated faunated cattle was used to inoculate 5 heifers and form a refaunated group so that the effects of re-establishment of protozoa on fermentation characteristics could be investigated. Samples of rumen fluid collected from each animal using oesophageal intubation before feeding on d 48, 55, 62 and 69 were incubated for 23h in-vitro (experiment 1). On day 82, 2% of NO3 (as NaNO3) was included in in-vitro incubations (experiment 2) to test for additivity of NO3 and absence of protozoa effects on fermentation and methane production. It was concluded that increasing protozoal numbers were associated with increased methane production, with methane production rate significantly higher from refaunated heifers than from defaunated heifers 7, 14 and 21 d after refaunation. Concentration and proportions of major VFA, however, were not affected by protozoal treatments. 
There is scope for further reducing methane output by combining defaunation and dietary nitrate, as the addition of nitrate in the defaunated heifers resulted in an 86% reduction in methane production in-vitro.

Keywords: defaunation, nitrate, fermentation, methane production

Procedia PDF Downloads 528
3324 KBASE Technological Framework - Requirements

Authors: Ivan Stanev, Maria Koleva

Abstract:

Automated software development issues are addressed in this paper. Layers and packages of a Common Platform for Automated Programming (CPAP) are defined based on Service Oriented Architecture, cloud computing, Knowledge Based Automated Software Engineering (KBASE), and the Method of Automated Programming. Tools of seven leading companies (AWS of Amazon, Azure of Microsoft, App Engine of Google, vCloud of VMware, Bluemix of IBM, Helion of HP, OCPaaS of Oracle) are analyzed in the context of CPAP. Based on the results of the analysis, CPAP requirements are formulated.

Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture

Procedia PDF Downloads 270
3323 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

The present paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models on a shared memory computing system. The proposed algorithm contains a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
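
For readers unfamiliar with compressed row storage (CRS), the baseline format that the signs-only variant builds on can be sketched as follows: only the nonzero values, their column indices, and per-row offsets are kept. The signs-only compression itself is not reproduced here, and the matrix is an arbitrary toy example:

```python
def to_crs(dense):
    """Convert a dense row-major matrix to CRS (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))  # offset where the next row starts
    return values, col_idx, row_ptr

def crs_matvec(values, col_idx, row_ptr, x):
    """Sparse matrix-vector product y = A @ x using the CRS arrays."""
    y = []
    for i in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

A = [[4, 0, 0],
     [0, 0, 2],
     [1, 0, 3]]
vals, cols, ptr = to_crs(A)
y = crs_matvec(vals, cols, ptr, [1.0, 1.0, 1.0])  # -> [4.0, 2.0, 4.0]
```

Matrix-vector products of this kind are the inner kernel of iterative eigensolvers such as Lanczos, which is why the storage format dominates both the memory footprint and the performance of an exact diagonalization code.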

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 367
3322 Forward Speed and Draught Requirement of a Semi-Automatic Cassava Planter under Different Wheel Usage

Authors: Ale M. O., Manuwa S. I., Olukunle O. J., Ewetumo T.

Abstract:

Five forward speeds of 1.5, 1.8, 2.1, 2.3, and 2.6 km/h were used at a constant soil depth of 100 mm to determine the effects of forward speed on the draught requirement of a semi-automatic cassava planter under pneumatic wheel and rigid wheel usage on a well-prepared sandy clay loam soil. The soil draught was electronically measured using an on-the-go soil draught measuring instrumentation system developed for the purpose of this research. The results showed an exponential relationship between forward speed and draught in the rigid wheel experiment, in which draught, ranging between 24.91 and 744.44 N, increased with an increase in forward speed. This is contrary to the polynomial relationship observed in the pneumatic wheel experiment, in which the draught varied between 96.09 and 343.53 N. It was observed in the experiments that the optimum speed of 1.5 km/h had the lowest draught values in both the pneumatic wheel and rigid wheel experiments, with higher values in the pneumatic experiment. It was generally noted that the rigid wheel planter, with its lower draught, requires less energy for operation. It is therefore concluded that operating the semi-automatic cassava planter with rigid wheels will be more economical for cassava farmers than operating the planter with pneumatic wheels.
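
The exponential speed-draught relationship reported for the rigid wheel, D = a·exp(b·v), can be recovered by a log-linear least-squares fit. The intermediate draught values below are illustrative placeholders; only the endpoints 24.91 N and 744.44 N appear in the abstract:

```python
import math

speeds = [1.5, 1.8, 2.1, 2.3, 2.6]                 # forward speed, km/h
draught_rigid = [24.9, 60.0, 150.0, 300.0, 744.4]  # N; endpoints from the abstract

# Fit ln(D) = ln(a) + b * v by ordinary least squares to get a and b.
n = len(speeds)
sx = sum(speeds)
sy = sum(math.log(d) for d in draught_rigid)
sxx = sum(v * v for v in speeds)
sxy = sum(v * math.log(d) for v, d in zip(speeds, draught_rigid))
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = math.exp((sy - b * sx) / n)
```

With these placeholder points the fit reproduces the endpoint draughts to within a few percent, which is the kind of check one would apply to the measured data.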

Keywords: cassava planter, planting, forward speed, draught, wheel type

Procedia PDF Downloads 70
3321 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to the availability of resources in a decentralized grid environment, checkpointing can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the OMNeT++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes in rollbacks.

Keywords: data grids, fault tolerance, clustering, Chandy-Lamport

Procedia PDF Downloads 307
3320 Implementing Green IT Practices in Non-IT Industries in Sri Lanka: Contemplating the Feasibility and Methods to Ensure Sustainability

Authors: Manuela Nayantara Jeyaraj

Abstract:

Green IT refers to the collective strategic and tactical practices that reduce the carbon footprint of an establishment's computing procedures. The concept has been tightly knit with IT-related organizations, and hence has rarely been applied within non-IT organizations in Sri Lanka. With the turn of the century, computing technologies have taken over commonplace activities in every nook and corner of Sri Lanka, which is still on its march towards becoming a developed country. Hence, it needs to be demonstrated that non-IT industries are also well placed to adhere to 'Green IT' practices, in order to reduce their carbon footprint and to consider the practicality of implementing Green IT practices in their workflows. Several spheres need to be taken into account in creating awareness of 'Green IT', such as the economic breach, the technologies available, legislative bounds, community mind-set, and more. This paper examines the causes that currently restrain non-IT organizations from considering Green IT concepts. By doing so, it is expected to demonstrate the benefits gained by implementing this concept within an organization. The ultimate goal is to propose feasible 'Green IT' practices that could be implemented within the context of Sri Lankan non-IT sectors in order to ensure the organization's sustainable growth towards long-term existence.

Keywords: computing practices, Green IT, non-IT industries, Sri Lanka, sustainability

Procedia PDF Downloads 227
3319 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”

Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen

Abstract:

Exoplanet atmospheric parameter retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet's atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions, where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods therefore compromises model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on previous models' speed and accuracy. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low-power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. With the edge computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.

Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval

Procedia PDF Downloads 149
3318 MLOps Scaling Machine Learning Lifecycle in an Industrial Setting

Authors: Yizhen Zhao, Adam S. Z. Belloum, Goncalo Maia Da Costa, Zhiming Zhao

Abstract:

Machine learning has evolved from an area of academic research to a real-world applied field. This change comes with challenges: gaps and differences exist between common practices in academic environments and those in production environments. Following the continuous integration, development, and delivery practices of software engineering, similar trends have emerged in machine learning (ML) systems, called MLOps. In this paper we propose a framework that helps to streamline and introduce best practices that facilitate the ML lifecycle in an industrial setting. This framework can be used as a template that can be customized to implement various machine learning experiments. The proposed framework is modular and can be recomposed to be adapted to various use cases (e.g. data versioning, remote training on the cloud). The framework inherits practices from DevOps and introduces other practices that are unique to machine learning systems (e.g. data versioning). Our MLOps practices automate the entire machine learning lifecycle and bridge the gap between development and operations.

Keywords: cloud computing, continuous development, data versioning, DevOps, industrial setting, MLOps

Procedia PDF Downloads 236
3317 A Review on Applications of Experts Systems in Medical Sciences

Authors: D. K. Sreekantha, T. M. Girish, R. H. Fattepur

Abstract:

In this article, we give an overview of medical expert systems, which can support physicians in making appropriate diagnostic, prognostic, and therapeutic decisions, and which help to organize, store, and deliver the medical knowledge needed by physicians and practitioners during medical operations or further treatment. If future studies build on these systems, advanced tools in medicine will be developed. New trends in the methodology of development of medical expert systems are also discussed in this paper. The authors would like to develop an innovative IT-based solution to help doctors in rural areas gain expertise in medical science for treating patients. This paper surveys the soft computing techniques used in treating patients' problems throughout the world.

Keywords: expert system, fuzzy logic, knowledge base, soft computing, epilepsy

Procedia PDF Downloads 238
3316 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with their prominent features, and the significant problems or issue domain that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to correctly predict based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high accuracy prediction, with a focus on the most important characteristics.
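
As a concrete illustration of how the surveyed prediction models are typically framed, the time-series task reduces to supervised learning over sliding windows of past prices. The window length and the toy price series below are illustrative assumptions, not data from any surveyed study:

```python
def make_windows(prices, window=3):
    """Turn a price series into (feature window, next-price target) pairs."""
    X, y = [], []
    for i in range(len(prices) - window):
        X.append(prices[i:i + window])  # the last `window` observed prices
        y.append(prices[i + window])    # the price to predict
    return X, y

X, y = make_windows([10, 11, 12, 11, 13, 14], window=3)
# X[0] == [10, 11, 12] and y[0] == 11: predict the 4th price from the first 3.
```

A neural network or fuzzy model from the survey would then be trained to map each window in X to its target in y, usually after normalization and with additional features such as trading volume.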

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 61
3315 Artificial Neurons Based on Memristors for Spiking Neural Networks

Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi

Abstract:

Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits, used to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method to construct hardware neuromorphic computing systems.
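
The LIF dynamics that the TS memristor emulates can be sketched in software: the membrane potential integrates its input, leaks toward rest, and emits a spike (then resets) when it crosses a threshold. The time constant, threshold, and input current below are illustrative values, not device parameters from this work:

```python
def lif(inputs, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron; return 0/1 spike train."""
    v, spikes = v_reset, []
    for i_in in inputs:
        v += dt * (-(v - v_reset) / tau + i_in)  # leak toward rest + integrate
        if v >= v_th:                            # threshold crossed: fire
            spikes.append(1)
            v = v_reset                          # reset after the spike
        else:
            spikes.append(0)
    return spikes

# A constant supra-threshold input produces a regular spike train.
spikes = lif([0.15] * 50)
```

In the memristive realization, this integrate/leak/fire cycle is performed by the device physics itself (volatile threshold switching), which is what removes the need for auxiliary circuitry.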

Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spike-timing-dependent plasticity

Procedia PDF Downloads 98
3314 Architecture of a Preliminary Course on Computational Thinking

Authors: Mintu Philip, Renumol V. G.

Abstract:

An introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a programming language rather than engaging in problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them become good programmers. The proposed course is designed around the four basic components of computational thinking: abstract thinking, logical thinking, modeling thinking, and constructive thinking. In this course, students engage in hands-on problem solving activities using a new problem solving model proposed in this paper.

Keywords: computational thinking, computing education, abstraction, constructive thinking, modelling thinking

Procedia PDF Downloads 421
3313 Assessment and Control for Oil Aerosol

Authors: Chane-Yu Lai, Xiang-Yu Huang

Abstract:

This study conducted an assessment of sampling result by using the new development rotation filtration device (RFD) filled with porous media filters integrating the method of cyclone centrifugal spins. The testing system established for the experiment used corn oil and potassium sodium tartrate tetrahydrate (PST) as challenge aerosols and were produced by using an Ultrasonic Atomizing Nozzle, a Syringe Pump, and a Collison nebulizer. The collection efficiency of RFD for oil aerosol was assessed by using an Aerodynamic Particle Sizer (APS) and a Fidas® Frog. The results of RFD for the liquid particles condition indicated the cutoff size was 1.65 µm and 1.02 µm for rotation of 0 rpm and 9000 rpm, respectively, under an 80 PPI (pores per inch)foam with a thickness of 80 mm, and sampling velocity of 13.5 cm/s. As the experiment increased the foam thickness of RFD, the cutoff size reduced from 1.62 µm to 1.02 µm. However, when increased the foam porosity of RFD, the cutoff size reduced from 1.26 µm to 0.96 µm. Moreover, as increased the sampling velocity of RFD, the cutoff size reduced from 1.02 µm to 0.76 µm. These discrepancies of above cutoff sizes of RFD all had statistical significance (P < 0.05). The cutoff size of RFD for three experimental conditions of generated liquid oil particles, solid PST particles or both liquid oil and solid PST particles was 1.03 µm, 1.02 µm, or 0.99 µm, respectively, under a 80 PPI foam with thickness of 80 mm, rotation of 9000 rpm, and sampling velocity of 13.5 cm/s. In addition, under the best condition of the experiment, two hours of sampling loading, the RFD had better collection efficiency for particle diameter greater than 0.45 µm, under a 94 PPI nickel mesh with a thickness of 68 mm, rotation of 9000 rpm, and sampling velocity of 108.3 cm/s. 
The experiment concluded that increasing the thickness of the porous media, the face velocity, and the porosity of the porous media increased the collection efficiency of the RFD for sampling oil particles. Increasing the rotation speed of the RFD also increased the collection efficiency. Further investigation of these operating parameters is required in future work.
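The cutoff sizes reported above are the diameters at which the collection efficiency curve crosses 50% (the d50). As a minimal sketch of how such a value can be read off a measured curve, assuming simple linear interpolation between bracketing points (the diameters and efficiencies below are illustrative, not the study's data):

```python
# Hypothetical efficiency curve: (aerodynamic diameter in µm, collection efficiency in %)
# These values are illustrative only, not measurements from the study.
curve = [(0.5, 12.0), (0.8, 31.0), (1.0, 47.0), (1.2, 61.0), (1.5, 78.0)]

def cutoff_d50(points):
    """Linearly interpolate the diameter at which efficiency reaches 50%."""
    for (d1, e1), (d2, e2) in zip(points, points[1:]):
        if e1 <= 50.0 <= e2:
            # Interpolate between the two points bracketing 50%
            return d1 + (50.0 - e1) * (d2 - d1) / (e2 - e1)
    raise ValueError("efficiency curve does not cross 50%")

print(round(cutoff_d50(curve), 3))  # → 1.043
```

In practice the measured curve would come from upstream/downstream APS counts per size bin, but the 50% crossing is read off the same way.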

Keywords: oil aerosol, porous media filter, rotation, filtration

Procedia PDF Downloads 374
3312 Investment Casting Conditions with Tourmaline In-Situ

Authors: Kageeporn Wongpreedee, Bongkot Phichaikamjornwut, Duangkhae Bootkul

Abstract:

The technique of stone-in-place casting has been established in jewelry production for two decades. However, the process has not been widely used, since it is limited to precious stones with high hardness and high stability at high temperature. This experiment tested tourmaline, a semi-precious gemstone with lower hardness and lower stability than precious stones. The experiment was designed in two parts. The first part aimed to understand the behavior of tourmaline under heating conditions: natural tourmaline stones were investigated, and the inclusions inside the stones were compared after testing at temperatures of 500 °C, 600 °C, and 700 °C. The second part cast ion-implanted tourmaline under stone-in-place casting conditions. The results showed that the stones tolerated temperatures as high as 700 °C, though growth of inclusions inside the stones was observed. The second part of the experiment compared ion-implanted and natural tourmaline in the stone-in-place casting process with different stone-setting types. The results showed that cracks and inclusions in both treated and natural tourmaline propagated during stone-in-place casting due to the high stress of metal contraction. The ion-implanted stones were more resistant to crack and inclusion propagation.

Keywords: stone in place casting, tourmaline, ion implantation, metal contraction

Procedia PDF Downloads 194
3311 Impact of Similarity Ratings on Human Judgement

Authors: Ian A. McCulloh, Madelaine Zinser, Jesse Patsolic, Michael Ramos

Abstract:

Recommender systems are a common artificial intelligence (AI) application. For any given input, a search system returns a rank-ordered list of similar items. As users review the returned items, they must decide when to halt the search and either revise their search terms or conclude that their requirement is novel, with no similar items in the database. We present a statistically designed experiment that investigates the impact of similarity ratings on the human judgement to conclude that a search item is novel and halt the search. A total of 450 participants were recruited from Amazon Mechanical Turk to render judgements across 12 decision tasks. We find that the inclusion of ratings increases the human perception that items are novel. Percent similarity improves novelty discernment compared with star-rated similarity or the absence of a rating. Ratings also reduce the time to decide and improve decision confidence. This suggests that the inclusion of similarity ratings can aid human decision-makers in knowledge search tasks.
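A between-conditions comparison of "novel" judgement rates of this kind is typically analysed with a two-proportion z-test. As a hedged sketch using only the standard library (the counts below are hypothetical, not the study's results):

```python
from math import erf, sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: "novel" judgements with vs. without similarity ratings
z, p = two_proportion_z(140, 225, 110, 225)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these illustrative counts the difference in proportions is significant at the 0.01 level; the actual analysis design belongs to the authors' paper.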

Keywords: ratings, rankings, crowdsourcing, empirical studies, user studies, similarity measures, human-centered computing, novelty in information retrieval

Procedia PDF Downloads 92
3310 An Experiment of Three-Dimensional Point Clouds Using GoPro

Authors: Jong-Hwa Kim, Mu-Wook Pyeon, Yang-dam Eo, Ill-Woong Jang

Abstract:

Construction of geo-spatial information has recently been developing toward multi-dimensional geo-spatial information, and the community building spatial information is expanding from a small group of experts to the general public. Studies using a variety of devices are also in progress, with the aim of near-real-time updates. In this paper, stereo images are acquired with a GoPro, a device widely used by the general public as well as experts. After correcting the distortion of the images, point clouds are acquired using SIFT and the direct linear transform (DLT). On the basis of this experiment, we present the possibility of creating a near-real-time digital map with a video device that is readily available in everyday life.
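The DLT step mentioned above triangulates a 3D point from matched image points in two calibrated views by solving a homogeneous linear system. A minimal sketch with NumPy, assuming synthetic camera matrices rather than the paper's GoPro calibration:

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one 3D point from two views via the direct linear transform."""
    # Each image observation contributes two rows of the homogeneous system A X = 0
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]           # null vector of A = homogeneous 3D point
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point with a 3x4 camera matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic stereo rig: identity camera and a second camera shifted along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 5.0])

X_est = triangulate_dlt(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.round(X_est, 6))  # ≈ [0.2, -0.1, 5.0]
```

With real GoPro footage the projection matrices would come from calibration, and the matched points from SIFT correspondences between the stereo frames.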

Keywords: GoPro, SIFT, DLT, point clouds

Procedia PDF Downloads 441
3309 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only on strictly regulated rules, it cannot aid customers beyond those limitations. Loan prediction therefore calls for machine learning (ML) techniques. Four separate models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using Anaconda Navigator and the required ML libraries, the models are created and evaluated with appropriate metrics. The random forest performs best, with an accuracy of 80.17%, and is implemented in the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, one of the four largest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper to combine model development with the Django framework and deployment to Alibaba Cloud.
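A minimal sketch of the four-model comparison step, assuming scikit-learn and a synthetic stand-in for the loan dataset (the features, split, and seeds here are illustrative choices, not the paper's data or results):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a loan dataset: label 1 = repaid, 0 = defaulted
X, y = make_classification(n_samples=1000, n_features=10, n_informative=6,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# The four model families compared in the paper
models = {
    "random_forest": RandomForestClassifier(random_state=42),
    "decision_tree": DecisionTreeClassifier(random_state=42),
    "knn": KNeighborsClassifier(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
scores = {name: accuracy_score(y_test, m.fit(X_train, y_train).predict(X_test))
          for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

The best-scoring model would then be serialized (e.g. with pickle or joblib) and loaded inside a Django view for serving predictions.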

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 103
3308 How Much the Role of Fertilizers Management and Wheat Planting Methods on Its Yield Improvement?

Authors: Ebrahim Izadi-Darbandi, Masoud Azad, Masumeh Dehghan

Abstract:

In order to study the effects of nitrogen and phosphorus management and wheat sowing method on wheat yield, two experiments were performed as factorials based on a completely randomized design with three replications at the Research Farm, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2009. The first experiment studied nitrogen application rates (100, 200, and 300 kg ha-1), phosphorus application rates (100 and 200 kg ha-1), and two application methods (broadcast and band). The treatments of the second experiment comprised wheat sowing methods (single rows 30 cm apart and twin rows on 60 cm wide ridges) as main plots and nitrogen and phosphorus application methods (broadcast and band, at 150 kg ha-1) as subplots. In both experiments, the phosphorus source was superphosphate, applied before sowing and incorporated into the soil, and the nitrogen source was urea, applied in two splits: 50% pre-plant and 50% near wheat shooting. Results from the first experiment showed that the fertilizer application method had a significant effect (p≤0.01) on wheat yield. Band application of phosphorus and nitrogen increased wheat biomass and seed yield by 9% and 15%, respectively, compared with broadcast application. The interaction between application rate and application method showed that band application at 200 kg ha-1 phosphorus and 300 kg ha-1 nitrogen was the best treatment for improving wheat yield. The second experiment also showed that sowing method and fertilizer application method had significant effects (p≤0.01) on wheat seed and biomass yield. Sowing wheat in twin rows on 60 cm wide ridges increased biomass and seed yield by 22% and 30%, respectively, compared with single rows 30 cm apart.
The interaction of sowing method and fertilizer application method indicated that band application of fertilizers combined with twin-row sowing on 60 cm wide ridges was the best treatment for improving winter wheat yield. In conclusion, these results indicate that nitrogen and phosphorus management and an improved sowing method play an important role in increasing fertilizer use efficiency in wheat.

Keywords: band application, broadcast application, rate of fertilizer application, wheat seed yield, wheat biomass yield

Procedia PDF Downloads 437
3307 Map UI Design of IoT Application Based on Passenger Evacuation Behaviors in Underground Station

Authors: Meng-Cong Zheng

Abstract:

When a public space faces an emergency, quickly establishing spatial cognition and reaching emergency shelter in an enclosed underground space is an urgent task. This study takes Taipei Station as its research site and aims to apply an Internet of Things (IoT) application to underground evacuation mobility design. The first experiment identified passengers' evacuation behaviors and spatial cognition in underground spaces through wayfinding tasks and think-aloud protocols, then defined the design conditions of the user interface (UI) and proposed a UI design. The second experiment evaluated the UI design against passengers' evacuation behaviors using the same wayfinding and think-aloud methods. The first experiment found that the design condition subjects cared most about was the "map": they hoped to learn their position relative to other landmarks and to view the overall route. The "position" needs to be labeled accurately so users can determine their location in the underground space. Each step of the escape instructions should be presented clearly in the "navigation bar," and the "message bar" should announce the next or final target exit. In the second experiment with the UI design, we found that a "spatial map" distinguishing walking from non-walking areas with shades of color is useful, and the addition of 2.5D maps increased users' perception of the space. Amending the color of the corner diagram in the "escape route" also reduced confusion between that symbol and other diagrams. In "hardware facilities," larger volumes such as toilets and elevators help users judge their relative location, and the fire extinguisher icon should be highlighted. "Fire point tips" indicating a fire with a graphical fireball convey precise information to the escaping person.
However, the "compass and return to present location" functions are less used in underground space.

Keywords: evacuation behaviors, IoT application, map UI design, underground station

Procedia PDF Downloads 175
3306 1D Convolutional Networks to Compute Mel-Spectrogram, Chromagram, and Cochleogram for Audio Networks

Authors: Elias Nemer, Greg Vines

Abstract:

Time-frequency transformations and spectral representations of audio signals are commonly used in various machine learning applications. Training networks on frequency features such as the Mel-spectrogram or cochleogram has proven more effective and convenient than training on time samples. In practical realizations, these features are created on a different processor and/or pre-computed and stored on disk, requiring additional effort and making it difficult to experiment with different features. In this paper, we provide a PyTorch framework for creating various spectral features, as well as time-frequency transformations and time-domain filter banks, using the built-in trainable conv1d() layer. This allows these features to be computed on the fly as part of a larger network, enabling easier experimentation with various combinations and parameters. Our work extends prior work developed for this purpose, first by adding more of these features, and second by allowing kernels either to start from initialized values or to be trained from random values. The code is written as a template of classes and scripts that users may integrate into their own PyTorch classes, or simply use as is and add more layers for various applications.
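As a hedged illustration of the core idea, a discrete Fourier transform can be expressed as a standard conv1d() layer whose kernels are the cosine and sine basis functions; starting from these initialized kernels (rather than random ones) makes the layer compute an exact DFT on the fly. The frame length and test signal below are arbitrary choices, not the paper's configuration:

```python
import math
import torch

N = 64                 # frame length = kernel size: one output frame per filter
bins = N // 2 + 1      # non-redundant DFT bins for a real-valued signal
n = torch.arange(N, dtype=torch.float32)
k = torch.arange(bins, dtype=torch.float32).unsqueeze(1)

# Real and imaginary DFT basis functions laid out as convolution kernels
cos_kernels = torch.cos(2 * math.pi * k * n / N)
sin_kernels = -torch.sin(2 * math.pi * k * n / N)

conv = torch.nn.Conv1d(1, 2 * bins, kernel_size=N, bias=False)
with torch.no_grad():
    conv.weight.copy_(torch.cat([cos_kernels, sin_kernels], dim=0).unsqueeze(1))

# Pure sine at DFT bin 5: its magnitude spectrum should peak at index 5
x = torch.sin(2 * math.pi * 5 * n / N).view(1, 1, N)
out = conv(x).view(2 * bins)
re, im = out[:bins], out[bins:]
mag = torch.sqrt(re ** 2 + im ** 2)
print(int(torch.argmax(mag)))  # → 5
```

Because the kernels live in an ordinary trainable layer, they can equally be left free to train from random initialization, and hop size is just the conv1d() stride.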

Keywords: neural networks, Mel-spectrogram, chromagram, cochleogram, discrete Fourier transform, PyTorch conv1d()

Procedia PDF Downloads 200