Search results for: intelligent computational techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8880

3330 Evaluation of Erodibility Status of Soils in Some Areas of Imo and Abia States of Nigeria

Authors: Andy Obinna Ibeje

Abstract:

In this study, the erodibility indices and some soil properties of cassava farms in selected areas of Abia and Imo States were investigated. The study involved measuring soil parameters such as permeability, soil texture, and particle size distribution, from which the erodibility indices were computed and compared. Results showed that the soils of these areas are very sandy. Isiukwuato, with an index of 72, has the highest erodibility, while Arondizuogu, with an index of 34, has the least; soil erodibility (k) values thus varied from 34 to 72. Nkporo has the highest sand content, and Inyishie has the least silt content. The results indicate a strong inverse relationship between clay and silt contents and the erodibility index. On the other hand, sand, organic matter, and moisture contents, as well as soil permeability, have a significantly high positive correlation with soil erodibility, and it can be concluded that particle size distribution is a major fingerprint of the erodibility index of soils in the study area. It is recommended that safe cultural practices such as crop rotation, mulching, and the adoption of organic farming techniques be incorporated into the farming communities of Abia and Imo States in order to stem the advance of erosion in the study area.

Keywords: erodibility, indices, soil, sand

Procedia PDF Downloads 335
3329 Checking Energy Efficiency by Simulation Tools: The Case of Algerian Ksourian Models

Authors: Khadidja Rahmani, Nahla Bouaziz

Abstract:

Algeria is known for its rich heritage; it owns an immense historical heritage of universal reputation. Unfortunately, this wealth is withering because of abandonment. This research focuses on the Ksourian model, which constitutes a large portion of this wealth. The Ksourian model is not just a witness to a great part of history and a vernacular culture; it also includes a panoply of assets in terms of energy efficiency. In this context, the purpose of our work is to evaluate the performance of the old techniques derived from the Ksourian model using simulation tools. The proposed method is decomposed into two steps. The first consists of isolating each device and reintroducing it into a basic model, then running a series of simulations on the resulting models in order to test the contribution of each of these dialectal processes. The second step consists of aggregating all these processes in a single indigenous model and rerunning the simulation to see what this combination yields in environmental and energy terms. The model chosen for this study is one of the ksar units of the city of Knadsa, Bechar (Algeria). This study not only shows the ingenuity of our ancestors in their know-how and their ability to adapt to the aridity of the climate, but also proves that their designs align with current concerns of energy efficiency and respond to the requirements of sustainable development.

Keywords: dialectal processes, energy efficiency, evaluation, Ksourian model, simulation tools

Procedia PDF Downloads 185
3328 Discipline-Specific Culture: A Purpose-Based Investigation

Authors: Sihem Benaouda

Abstract:

English is gaining an international identity as it affects every academic and professional field in the world. Without increasing their cultural understanding, it would be difficult to fully prepare learners for communication in a globalised environment. The concept of culture is intricate and needs to be elucidated, especially in an English language teaching (ELT) context. The study investigates the cultural studies integrated into different types of English for specific purposes (ESP) materials, as opposed to English for general purposes (EGP) textbooks. A qualitative methodology based on a triangulation of techniques was adopted, combining materials analysis of five textbooks in advanced EGP and three types of ESP with a semi-structured interview conducted with Algerian ESP practitioners. The data analysis revealed that culture in ESP textbooks is not overtly isolated into chapters and that cultural studies are predominantly present in business and economics materials, namely English for hotel and catering staff, tourism, and flight attendants. However, implicit cultural instruction is signalled in the social sciences and is negligible in science and technology sources. In terms of content, cultural studies in EGP are more related to generic topics, whereas, in some ESP materials, the topics are oriented to the specific field they belong to. Furthermore, the respondents' answers showed an unawareness of the importance of culture in ESP teaching, besides some disregard for culture teaching per se in ESP contexts.

Keywords: ESP, EGP, cultural studies, textbooks, teaching, materials

Procedia PDF Downloads 98
3327 Effective Sexual Assault Treatment as Viewed by Survivors and Expert Therapists

Authors: Avigail Moor

Abstract:

Rape and sexual assault have been widely linked to severe psychological sequelae, the recovery from which often requires professional help. Thanks to the current shift in societal attitudes towards sexual violence, the victim's perspective is increasingly being heard. The present study is yet another step in that direction. Through the investigation of what recovered survivors of sexual assault identify as the therapeutic interventions that most assisted them in overcoming their trauma, guidelines for optimal sexual assault treatment are established. These receive further support from a comparison with expert therapists as to what they view as being most conducive to recovery from rape. In-depth semi-structured interviews were conducted with 15 survivors who have experienced a successful course of therapy and 15 therapists with extensive expertise in the field. The results document considerable agreement between the two perspectives, which share much in common. First, irrespective of the specific techniques involved, both survivors and therapists placed the greatest importance on a respectful and validating therapeutic relationship, that operates to counter the dehumanization and degradation entailed in the assault. In addition, specific interventions were identified, which include the reprocessing of all rape-specific peri-traumatic reactions coupled with the intentional countering of their consequences within the therapeutic relationship. Together, these reports provide a detailed account of post-rape treatment needs and the interventions required for their effective resolution.

Keywords: sexual assault, rape, treatment efficacy, survivors

Procedia PDF Downloads 137
3326 Estimation of Transition and Emission Probabilities

Authors: Aakansha Gupta, Neha Vadnere, Tapasvi Soni, M. Anbarsi

Abstract:

Protein secondary structure prediction is one of the most important goals pursued by bioinformatics and theoretical chemistry; it is highly important in medicine and biotechnology. Some aspects of protein function and genome analysis can be predicted from secondary structure, which is used to help annotate sequences, classify proteins, identify domains, and recognize functional motifs. In this paper, we represent protein secondary structure as a mathematical model. To extract and predict the protein secondary structure from the primary structure, we require a set of parameters. Any constants appearing in the model are specified by these parameters, which also provide a mechanism for efficient and accurate use of data. Of the many algorithms available to estimate these model parameters, the most popular is the EM (Expectation-Maximization) algorithm. The model parameters are estimated with protein datasets such as RS126 by using the Bayesian probabilistic method (the data set being categorical). This work can then be extended to compare the efficiency of the EM algorithm with that of other algorithms for estimating the model parameters, which will in turn lead to an efficient component for protein secondary structure prediction. Further, this paper provides scope for using these parameters to predict the secondary structure of proteins with machine learning techniques such as neural networks and fuzzy logic. The ultimate objective is to obtain accuracy greater than previously achieved.

Keywords: model parameters, expectation maximization algorithm, protein secondary structure prediction, bioinformatics
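A minimal sketch of what the transition and emission probabilities look like when estimated by counting over labelled sequences (the supervised analogue of the M-step; the EM / Baum-Welch procedure replaces these hard counts with expected counts when the structure labels are hidden). The toy sequences, state alphabet (H/E/C), and pseudocount smoothing are illustrative assumptions, not the RS126 setup used in the paper.

```python
# Counting-based estimates of HMM transition and emission probabilities,
# with hidden states = secondary-structure classes (H, E, C) and
# observations = amino-acid residues. Sequences below are toy examples.
from collections import defaultdict

def estimate_hmm_parameters(labelled_sequences, pseudocount=1.0):
    """labelled_sequences: list of (residues, states) string pairs."""
    trans = defaultdict(lambda: defaultdict(float))
    emit = defaultdict(lambda: defaultdict(float))
    for residues, states in labelled_sequences:
        for i, (res, st) in enumerate(zip(residues, states)):
            emit[st][res] += 1.0
            if i + 1 < len(states):
                trans[st][states[i + 1]] += 1.0
    def normalise(table):
        # add a small pseudocount over the observed symbols, then normalise rows
        out = {}
        for src, row in table.items():
            total = sum(row.values()) + pseudocount * len(row)
            out[src] = {k: (v + pseudocount) / total for k, v in row.items()}
        return out
    return normalise(trans), normalise(emit)

if __name__ == "__main__":
    toy = [("MKVLA", "CHHHC"), ("GAVLI", "CEEEC")]
    A, B = estimate_hmm_parameters(toy)
    print("P(H -> H) =", round(A["H"].get("H", 0.0), 3))
    print("P(V | E)  =", round(B["E"].get("V", 0.0), 3))
```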

Procedia PDF Downloads 466
3325 Stack Overflow Detection and Prevention on Operating Systems Using Machine Learning and Control-Flow Enforcement Technology

Authors: Cao Jiayu, Lan Ximing, Huang Jingjia, Burra Venkata Durga Kumar

Abstract:

The first virus to attack personal computers, called C-Brain, appeared in early 1986, written by a pair of Pakistani brothers. In those days, people still used DOS systems, manipulating computers with the most basic command lines. In the 21st century, computer performance has grown geometrically, but computer viruses are also evolving and escalating, and we never stop fighting against security problems. Stack overflow is one of the most common security vulnerabilities in operating systems. It may result in serious security issues if a program running with administrator privileges has such a vulnerability. Certain viruses change the values of specific memory locations through a stack overflow, allowing computers to run harmful programs. This study developed a mechanism to detect and respond in real time whenever a stack overflow occurs. We demonstrate the effectiveness of standard machine learning algorithms and control-flow enforcement techniques in predicting computer OS security by generating suspicious vulnerability functions (SVFs) and associated suspect areas (SAs). The method can minimize the possibility of stack overflow attacks occurring.

Keywords: operating system, security, stack overflow, buffer overflow, machine learning, control-flow enforcement technology
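As a loose illustration of the detection idea, the sketch below trains a standard classifier to flag risky functions from simple static features. The features, synthetic labels, and the choice of a random forest are all assumptions made here for demonstration; they are not the SVF/SA construction or the control-flow enforcement integration described in the abstract.

```python
# Illustrative only: score functions as "suspicious" from hypothetical
# static features (buffer size, unchecked copy calls, bounds checks,
# privilege level) with a standard ML classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(8, 4096, n),   # stack buffer size (bytes)
    rng.integers(0, 5, n),      # unchecked strcpy/memcpy-like calls
    rng.integers(0, 2, n),      # explicit bounds check present (0/1)
    rng.integers(0, 2, n),      # runs with admin privileges (0/1)
])
# Synthetic labels: small buffers + unchecked copies + no bounds check -> risky
y = ((X[:, 0] < 256) & (X[:, 1] > 1) & (X[:, 2] == 0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```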

Procedia PDF Downloads 106
3324 In-Vivo Study of Annacardium occidentale L. Emulgel Extract Using Non-Invasive Probes

Authors: Akhtar Naveed, Kanwal Shahla, Khan HMS, Rasool Fatima, Ijaz Shakeel

Abstract:

The focus of the study was to design, develop, and characterize in vivo a stable emulgel formulation containing Anacardium occidentale L. (cashew) extract as an active ingredient. The formulation was prepared and kept at 8 ºC, 25 ºC, 40 ºC, and 40 ºC ± RH for a period of 28 days. During this period, measurements of stability, pH, conductivity, and organoleptic features (color, liquefaction, phase separation) were conducted at intervals of days 1, 2, 3, 7, 14, and 28. In the in vivo studies, the test formulation (5% Anacardium occidentale L. extract) and a base formulation (without cashew extract) were prepared and applied on the cheek areas of healthy human female volunteers, after a skin sensitivity test for each volunteer and after obtaining their consent, over a study period of 8 weeks. Various skin parameters, namely melanin level, erythema level, and skin elasticity, were measured at regular time intervals. Results of the study were analyzed by statistical techniques, i.e., two-way ANOVA and the paired sample t-test, and were significant at p ≤ 0.05. The paired sample t-test showed that the test formulation had profound effects on the skin parameters when compared with the control formulation.

Keywords: Anacardium occidentale L., anti-oxidant, cashew nut, emulgel
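For reference, a minimal sketch of the paired sample t-test used in the statistical analysis, run on illustrative (not measured) melanin readings taken from the same volunteers at baseline and at week 8.

```python
# Paired-sample t-test on before/after readings for the same volunteers.
# The numbers below are placeholders, not data from the study.
from scipy import stats

melanin_baseline = [142, 150, 138, 160, 155, 147, 152, 149, 158, 144]
melanin_week8    = [130, 141, 131, 149, 147, 138, 144, 140, 150, 137]

t_stat, p_value = stats.ttest_rel(melanin_baseline, melanin_week8)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value <= 0.05:
    print("Change in melanin is statistically significant at p <= 0.05")
```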

Procedia PDF Downloads 316
3323 The Mechanism of Design and Analysis Modeling of Performance of Variable Speed Wind Turbine and Dynamical Control of Wind Turbine Power

Authors: Mohammadreza Heydariazad

Abstract:

Growing the productivity of wind energy as a clean source requires improved strategies for the production, transmission, and management of wind resources in order to increase power quality and reduce costs. New technologies based on power converters, which change the turbine speed to suit the wind speed, improve the efficiency of power extraction from the wind. This article introduces variable-speed wind turbines and power optimization, presents methods for using a superconducting inductor in the power converter, proposes DC measurement for the wind farm, and considers the techniques available for it. In fact, this article reviews the mechanisms and operation of variable-speed wind turbines according to the speed control strategies of their various types, and examines the possible transmission of AC power from the producing location to a suitable location for a strong connection integrating the wind farm generators, without additional cost or equipment. It also covers the main objectives of the dynamic control of wind turbines and the methods and ways of exploiting it, including the processes unique to these components. An effective algorithm is presented for power control in order to extract maximum active power and maintain the power factor at the desired value.

Keywords: wind energy, generator, superconducting inductor, wind turbine power

Procedia PDF Downloads 320
3322 Cellular Energy Metabolism Decreases with Age in the Trophocytes and Oenocytes of Honeybees (Apis Mellifera)

Authors: Chin-Yuan Hsu, Yu-Lung Chuang

Abstract:

The expression, concentration, and activity of mitochondrial energy-utilizing molecules and cellular energy-regulating molecules decrease with age in the trophocytes and oenocytes of honeybees (Apis mellifera), but those of cellular energy-metabolizing molecules are unknown. In this study, the expression, concentration, and activity of cellular energy-metabolizing molecules were assayed in the trophocytes and fat cells of young and old worker bees by using cell and biochemistry techniques. The results showed that (i) the β-hydroxyacyl-coenzyme A dehydrogenase (HOAD) activity/citrate synthase (CS) activity ratio, non-esterified fatty acid concentrations, the expression of eukaryotic initiation factor 4E, and the expression of phosphorylated eIF4E binding protein 1 decreased with age; (ii) fat and glycogen accumulation increased with age; and (iii) the pyruvate dehydrogenase (PDH) activity/citrate synthase (CS) activity ratio was not correlated with age. These findings indicate that β-oxidation (HOAD/CS) and protein synthesis decreased with age, while glycolysis (PDH/CS) was unchanged with age. The most likely reason is that sugars are the vital food of worker bees. Taken together, these data reveal that young workers have higher cellular energy metabolism than old workers and that aging results in a decline in cellular energy metabolism in worker honeybees.

Keywords: aging, energy, honeybee, metabolism

Procedia PDF Downloads 463
3321 Weakly Solving Kalah Game Using Artificial Intelligence and Game Theory

Authors: Hiba El Assibi

Abstract:

This study aims to weakly solve Kalah, a two-player board game, by developing a start-to-finish winning strategy using an optimized minimax algorithm with alpha-beta pruning. In weakly solving Kalah, our focus is on creating an optimal strategy from the game's beginning rather than analyzing every possible position. The project will explore additional enhancements like symmetry checking and code optimizations to speed up the decision-making process. This approach is expected to give insights into efficient strategy formulation in board games and potentially help create games with a fair distribution of outcomes. Furthermore, this research provides a unique perspective on human versus artificial intelligence decision-making in strategic games. By comparing the AI-generated optimal moves with human choices, we can explore how seemingly advantageous moves can, in the long run, be harmful, thereby offering a deeper understanding of strategic thinking and foresight in games. Moreover, this paper discusses the evaluation of our strategy against existing methods, providing insights into performance and computational efficiency. We also discuss the scalability of our approach to the game, considering different board sizes (number of pits and stones) and rules (different variations) and studying how these affect performance and complexity. The findings have potential implications for the development of AI applications in strategic game planning, enhancing our understanding of human cognitive processes in game settings, and offering insights into creating balanced and engaging game experiences.

Keywords: minimax, alpha beta pruning, transposition tables, weakly solving, game theory
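A generic sketch of the minimax-with-alpha-beta-pruning skeleton such a strategy is built on, written against an abstract game interface and exercised on a toy take-away game. The Kalah move generation, capture rules, evaluation heuristic, transposition tables, and symmetry checks mentioned in the abstract are not reproduced here.

```python
def alphabeta(state, depth, alpha, beta, maximizing, game):
    """Generic minimax with alpha-beta pruning over an abstract `game`."""
    if depth == 0 or game.is_terminal(state):
        return game.evaluate(state, maximizing), None
    best_move = None
    if maximizing:
        value = float("-inf")
        for move in game.legal_moves(state):
            child, _ = alphabeta(game.apply(state, move), depth - 1,
                                 alpha, beta, False, game)
            if child > value:
                value, best_move = child, move
            alpha = max(alpha, value)
            if alpha >= beta:          # beta cut-off
                break
        return value, best_move
    value = float("inf")
    for move in game.legal_moves(state):
        child, _ = alphabeta(game.apply(state, move), depth - 1,
                             alpha, beta, True, game)
        if child < value:
            value, best_move = child, move
        beta = min(beta, value)
        if beta <= alpha:              # alpha cut-off
            break
    return value, best_move


class TakeAwayGame:
    """Toy stand-in game (take 1 or 2 stones; whoever takes the last stone
    wins) used only to exercise the search skeleton."""
    def is_terminal(self, state):
        return state == 0
    def evaluate(self, state, maximizing_to_move):
        if state != 0:
            return 0                   # non-terminal cut-off: neutral score
        # the player to move has no stones left, so the previous mover won
        return -1 if maximizing_to_move else 1
    def legal_moves(self, state):
        return [m for m in (1, 2) if m <= state]
    def apply(self, state, move):
        return state - move


if __name__ == "__main__":
    value, move = alphabeta(7, depth=10, alpha=float("-inf"),
                            beta=float("inf"), maximizing=True,
                            game=TakeAwayGame())
    print("best first move:", move, "game value:", value)
```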

Procedia PDF Downloads 43
3320 An Influence of Marketing Mix on Hotel Booking Decision: Japanese Senior Traveler Case

Authors: Kingkan Pongsiri

Abstract:

The study of the marketing mix influencing hotel booking decisions among Japanese senior travelers aims to examine the individual factors involved in hotel reservation decision-making by elderly Japanese travelers, and then to study the other factors that influence their booking decisions. This is quantitative research; a total of 420 completed questionnaires were collected via non-probability sampling techniques. The study found that the majority of respondents were female (224 people, 53.3 percent), that 197 respondents (46.9 percent) were aged between 66 and 70 years, and that 212 (50.5 percent) were married. Most respondents held a bachelor's degree (326 people, 77.6 percent), and 50 percent (210 people) had a monthly income between 1,501 and 2,000 USD. The respondents mostly stayed for a short period of 1-14 days (299 people, representing 71.2 percent of the sample). The senior Japanese tourists were most sensitive to product/service factors, followed by price, marketing promotion, and people, respectively. Two factors were identified as having moderate influence: place (distribution channels) and physical evidence.

Keywords: Japanese senior traveler, marketing mix, senior tourist, hotel booking

Procedia PDF Downloads 287
3319 Improvement of Soft Clay Using Floating Cement Dust-Lime Columns

Authors: Adel Belal, Sameh Aboelsoud, Mohy Elmashad, Mohammed Abdelmonem

Abstract:

The two main criteria that control the design and performance of footings are the bearing capacity and settlement of the soil. In soft soils, the construction of buildings, storage tanks, warehouses, etc. usually involves excessive settlement problems. To solve bearing capacity problems or reduce settlement, soil improvement may be considered using different techniques, including encased cement dust-lime columns. The proposed research studies the effect of adding floating encased cement dust and lime mix columns to soft clay on the clay bearing capacity. Four experimental tests were carried out. Column diameters of 3.0 cm, 4.0 cm, and 5.0 cm and a column length of 60% of the clay layer thickness were used. A numerical model was constructed and verified using a commercial finite element package (PLAXIS 2D, V8.5). The verified model was used to study the effect of distributing columns around the footing at different distances. The study showed that the floating cement dust-lime columns enhanced the clay bearing capacity by 262%. The numerical model showed that the columns around the footing have a limited effect on the clay improvement.

Keywords: bearing capacity, cement dust – lime columns, ground improvement, soft clay

Procedia PDF Downloads 191
3318 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network

Authors: Nasrin Bakhshizadeh, Ashkan Forootan

Abstract:

A major problem affecting quality control in industrial polymerization is the lack of suitable on-line measurement tools to evaluate polymer properties such as the melt and density indices. In the conventional method, the polymerization is controlled manually by taking samples, measuring the quality of the polymer in the laboratory, and recording the results. This method is highly time-consuming and leads to producing a large number of incompatible products. The online application for estimating the melt index and density proposed in this study is a neural network based on the input-output data of the polyethylene production plant. The temperature, the reactor bed level, the mass flow rates of ethylene, hydrogen, and butene-1, and the molar concentrations of ethylene, hydrogen, and butene-1 are used to establish the neural model. The neural network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulated results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, is able to successfully predict those quality properties.

Keywords: polyethylene, polymerization, density, melt index, neural network
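A small sketch of the soft-sensor idea on synthetic data: a feed-forward network mapping reactor conditions to the melt index and density. The input-output relationships below are invented for illustration, and since scikit-learn does not provide Levenberg-Marquardt training, its LBFGS solver stands in for the back-propagation/Levenberg-Marquardt scheme used in the paper.

```python
# Feed-forward regression soft sensor on synthetic plant-like data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 400
# columns: temperature, bed level, ethylene flow, hydrogen flow, butene-1 flow
X = rng.uniform([70, 10, 5, 0.1, 0.5], [110, 20, 15, 1.0, 3.0], size=(n, 5))
melt_index = 0.05 * X[:, 3] / X[:, 2] * X[:, 0] + rng.normal(0, 0.01, n)
density = 0.930 - 0.004 * X[:, 4] + 0.0001 * X[:, 0] + rng.normal(0, 0.0005, n)
Y = np.column_stack([melt_index, density])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                 max_iter=2000, random_state=1),
)
model.fit(X[:300], Y[:300])
print("R^2 on held-out data:", round(model.score(X[300:], Y[300:]), 3))
```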

Procedia PDF Downloads 134
3317 Providing a Secure, Reliable and Decentralized Document Management Solution Using Blockchain by a Virtual Identity Card

Authors: Meet Shah, Ankita Aditya, Dhruv Bindra, V. S. Omkar, Aashruti Seervi

Abstract:

In today's world, we need documents everywhere for a smooth workflow in identification processes and other security aspects. The current systems and techniques used for identification require one thing, namely ‘proof of existence’, which involves valid documents, for example, educational, financial, etc. The main issue with the current identity access management system and digital identification process is that the system is centralized, which makes it inefficient. This paper presents a system that resolves all these cited issues. It is based on ‘blockchain’ technology, a decentralized system that allows transactions in a decentralized and immutable manner. The primary notion of the model is to ‘have everything with nothing’. It involves inter-linking the required documents of a person with a single identity card so that a person can go anywhere without carrying the required documents. The person just needs to be physically present at a place where documents are necessary, and the rest of the verification proceeds using a fingerprint impression and an iris scan. Furthermore, some technical overheads and advancements are listed. This paper also aims to lay out a far-vision scenario of blockchain and its impact on future trends.

Keywords: blockchain, decentralized system, fingerprint impression, identity management, iris scan

Procedia PDF Downloads 119
3316 Near Optimal Closed-Loop Guidance Gains Determination for Vector Guidance Law, from Impact Angle Errors and Miss Distance Considerations

Authors: Karthikeyan Kalirajan, Ashok Joshi

Abstract:

An optimization problem is set up to maximize the terminal kinetic energy of a maneuverable reentry vehicle (MaRV). The target location and the impact angle are given as constraints. The MaRV uses an explicit guidance law called vector guidance. This law has two gains, which are taken as decision variables. The problem is to find the optimal values of these gains that will result in minimum miss distance and impact angle error. Using a simple 3DOF non-rotating flat-earth model and the Lockheed Martin HP-MaRV as the reentry vehicle, the nature of the solutions of the optimization problem is studied. This is achieved by carrying out a parametric study over a range of closed-loop gain values and generating the corresponding impact angle error and miss distance values. The results show that there are well-defined lower and upper bounds on the gains that result in a near-optimal terminal guidance solution. It is found from this study that there exist common permissible regions (values of gains) where all constraints are met. Moreover, the permissible region lies between flat regions, and hence the optimization algorithm has to be chosen carefully. It is also found that only one of the gain values is independent and that the other, dependent gain value is related to it through a simple straight-line expression. Moreover, to reduce the computational burden of finding the optimal values of two gains, a guidance law called Diveline guidance, which uses a single gain, is discussed. The derivation of the Diveline guidance law from the vector guidance law is presented in this paper.

Keywords: Marv guidance, reentry trajectory, trajectory optimization, guidance gain selection

Procedia PDF Downloads 413
3315 Recent Advances of Photo-Detectors in Single Photon Emission Computed Tomography Imaging System

Authors: Qasem A. Alyazji

Abstract:

One of the main drivers of progress in positron emission tomography (PET) and single photon emission computed tomography (SPECT) is the development of radiation detectors. The NaI(Tl) scintillator crystal coupled to an array of photomultiplier tubes, known as the Anger camera, is the most dominant detector system in PET and SPECT devices. Technological advances in many materials, in addition to the emerging importance of specialized applications such as preclinical imaging and cardiac imaging, have encouraged innovation, so that alternatives to the Anger camera are now part of alternative imaging systems. In this paper, we discuss the main performance characteristics of detector devices and scanning developments in scintillation detectors, semiconductor (solid-state) detectors, and photon transducers such as photomultiplier tubes (PMTs), position-sensitive photomultiplier tubes (PSPMTs), avalanche photodiodes (APDs), and silicon photomultipliers (SiPMTs). The detectors that showed promising results are discussed. This study is a review of recent developments in the detectors used in single photon emission computed tomography (SPECT) imaging systems.

Keywords: SPECT, scintillation, PMTs, SiPMT, PSPMTs, APDs, semiconductor (solid state)

Procedia PDF Downloads 148
3314 Preparation of Zno/Ag Nanocomposite and Coating on Polymers for Anti-Infection Biomaterial Application

Authors: Babak Sadeghi, Parisa Ghayomipour

Abstract:

ZnO/Ag nanocomposites coated onto polyvinyl chloride (PVC) were prepared by a chemical reduction method for anti-infection biomaterial applications. There is growing interest in using biomolecules as templates to grow inorganic nanocomposites with controlled morphology and structure. By optimizing the experimental conditions, we successfully fabricated a high yield of ZnO/Ag nanocomposite with full coverage of the high-density polyvinyl chloride (PVC) coating. More importantly, the ZnO/Ag nanocomposites were shown to significantly inhibit the growth of S. aureus in solution. It was further shown that the ZnO/Ag nanocomposites induced thiol depletion that caused the death of S. aureus. The coatings were fully characterized using techniques such as scanning electron microscopy (SEM), transmission electron microscopy (TEM), and X-ray diffraction (XRD). Most importantly, compared to uncoated metals, the coatings on PVC promoted antibacterial activity. Compared to PVC not coated with ZnO/Ag, the ZnO/Ag nanocomposite coating was approximately three times more effective in preventing bacterial attachment. The result of thermogravimetric analysis (TGA) indicates that the ZnO/Ag nanocomposites are chemically stable in the temperature range from 50 to 900 ºC. This result, for the first time, demonstrates the potential of using ZnO/Ag nanocomposites as a coating material for numerous antibacterial applications.

Keywords: nanocomposites, antibacterial activity, scanning electron microscopy (SEM), x-ray diffraction (XRD)

Procedia PDF Downloads 462
3313 A Rationale to Describe Ambident Reactivity

Authors: David Ryan, Martin Breugst, Turlough Downes, Peter A. Byrne, Gerard P. McGlacken

Abstract:

An ambident nucleophile is a nucleophile that possesses two or more distinct nucleophilic sites that are linked through resonance and are effectively “in competition” for reaction with an electrophile. Examples include enolates, pyridone anions, and nitrite anions, among many others. Reactions of ambident nucleophiles and electrophiles are extremely prevalent at all levels of organic synthesis. The principle of hard and soft acids and bases (the “HSAB principle”) is most commonly cited in the explanation of selectivities in such reactions. Although this rationale is pervasive in any discussion on ambident reactivity, the HSAB principle has received considerable criticism. As a result, the principle’s supplantation has become an area of active interest in recent years. This project focuses on developing a model for rationalizing ambident reactivity. Presented here is an approach that incorporates computational calculations and experimental kinetic data to construct Gibbs energy profile diagrams. The preferred site of alkylation of nitrite anion with a range of ‘hard’ and ‘soft’ alkylating agents was established by ¹H NMR spectroscopy. Pseudo-first-order rate constants were measured directly by ¹H NMR reaction monitoring, and the corresponding second-order constants and Gibbs energies of activation were derived. These, in combination with computationally derived standard Gibbs energies of reaction, were sufficient to construct Gibbs energy wells. By representing the ambident system as a series of overlapping Gibbs energy wells, a more intuitive picture of ambident reactivity emerges. Here, previously unexplained switches in reactivity in reactions involving closely related electrophiles are elucidated.

Keywords: ambident, Gibbs, nucleophile, rates
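As an illustration of how a measured rate constant becomes a point on such a Gibbs energy profile, the sketch below applies the Eyring equation, k = (k_B T / h) exp(-ΔG‡ / RT), to an invented rate constant; the value, temperature, and 1 mol/L standard-state assumption are placeholders, not data from the study.

```python
# Convert a rate constant into a Gibbs energy of activation via Eyring:
#     ΔG‡ = R * T * ln(k_B * T / (h * k))
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
h   = 6.62607015e-34    # Planck constant, J*s
R   = 8.314462618       # gas constant, J/(mol*K)

def gibbs_activation(k, T=298.15):
    """Return ΔG‡ in kJ/mol for a rate constant k (s^-1 equivalent)."""
    return R * T * math.log(k_B * T / (h * k)) / 1000.0

if __name__ == "__main__":
    k2 = 3.5e-4  # illustrative second-order rate constant, M^-1 s^-1
    print(f"ΔG‡ ≈ {gibbs_activation(k2):.1f} kJ/mol at 298 K")
```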

Procedia PDF Downloads 73
3312 Digital Forensics Analysis Focusing on the Onion Router Browser Artifacts in Windows 10

Authors: Zainurrasyid Abdullah, Mohamed Fadzlee Sulaiman, Muhammad Fadzlan Zainal, M. Zabri Adil Talib, Aswami Fadillah M. Ariffin

Abstract:

The Onion Router (Tor) browser is a well-known tool widely used by people seeking web anonymity when browsing the internet. Criminals take advantage of this to remain anonymous over the internet; accessing the dark web allows them to perform illegal activities while maintaining their anonymity. For a digital forensic analyst, it is crucial to extract the trail of evidence proving that a criminal's computer has used the Tor browser to conduct such illegal activities. By applying the digital forensic methodology, several techniques can be performed, including application analysis, memory analysis, and registry analysis. Since Windows 10 is the latest operating system released by Microsoft Corporation, this study uses Windows 10 as the operating system platform running the Tor browser. From the analysis, significant artifacts left by the Tor browser were discovered, such as the execution date, application installation date, and browsing history, which can be used as evidence. Although the Tor browser was designed to achieve anonymity, some trails of evidence can still be found on the Windows 10 platform that can be useful for an investigation.

Keywords: artifacts analysis, digital forensics, forensic analysis, memory analysis, registry analysis, tor browser, Windows 10

Procedia PDF Downloads 162
3311 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software. However, because static software only operates under strictly regulated rules, it cannot aid customers beyond those limitations. The application of machine learning (ML) techniques is therefore required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required machine learning (ML) libraries, the models are created and evaluated with the appropriate measurement metrics. From the findings, the random forest performs with the highest accuracy of 80.17% and was later implemented into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four largest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper that combines the model development with the Django framework and deployment to the Alibaba cloud computing platform.

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud
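A minimal sketch of the four-model comparison step on a synthetic stand-in dataset; the real loan features, preprocessing, and the reported 80.17% accuracy come from the paper's own data, and the Django integration and Alibaba Cloud deployment are outside the scope of this snippet.

```python
# Compare RF, DT, KNN, and logistic regression on a synthetic binary dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

models = {
    "random forest": RandomForestClassifier(random_state=42),
    "decision tree": DecisionTreeClassifier(random_state=42),
    "k-nearest neighbour": KNeighborsClassifier(n_neighbors=5),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:>20}: {accuracy_score(y_te, model.predict(X_te)):.3f}")
```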

Procedia PDF Downloads 119
3310 Pareto System of Optimal Placement and Sizing of Distributed Generation in Radial Distribution Networks Using Particle Swarm Optimization

Authors: Sani M. Lawal, Idris Musa, Aliyu D. Usman

Abstract:

The Pareto approach to optimal solutions, which arises in multi-objective optimization problems and denotes a set of non-dominated solutions in the search space, is adopted in this paper. The paper presents the optimal placement of distributed generation (DG) in radial distribution networks with an optimal size, minimizing power loss and voltage deviation while maximizing the voltage profile of the networks. These problems are formulated as a constrained nonlinear optimization problem solved using particle swarm optimization (PSO), with both the locations and sizes of the DG being continuous. The objective functions adopted are the total active power loss and the voltage deviation. The multi-objective nature of the problem made it necessary to form a combined objective function whose solution consists of both the DG location and size. The proposed PSO algorithm is used to determine the optimal placement and size of DG in a distribution network. The output indicates that the PSO technique has an edge over other types of search methods due to its effectiveness and computational efficiency. The proposed method is tested on the standard IEEE 34-bus distribution network and validated on the 33-bus test system. Results indicate that the sizing and location of DG are system dependent and should be optimally selected before installing the distributed generators in the system, and that an improvement in the voltage profile and a reduction in power loss have been achieved.

Keywords: distributed generation, pareto, particle swarm optimization, power loss, voltage deviation
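A bare-bones PSO sketch searching the continuous (location, size) space for a single DG unit. The quadratic surrogate objective stands in for the power-loss and voltage-deviation evaluation that would normally require a load-flow solution on the IEEE 33/34-bus feeders, so the numbers are illustrative only.

```python
import numpy as np

def objective(x):
    loc, size = x                       # bus position (continuous) and DG MW
    return (loc - 18.0) ** 2 / 100.0 + (size - 2.5) ** 2   # surrogate "loss"

rng = np.random.default_rng(7)
n_particles, n_iter = 30, 100
lower, upper = np.array([2.0, 0.0]), np.array([33.0, 5.0])

pos = rng.uniform(lower, upper, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5               # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lower, upper)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best DG bus ~", round(gbest[0], 1), " size ~", round(gbest[1], 2), "MW")
```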

Procedia PDF Downloads 355
3309 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

In recent years, demand for healthcare services has dramatically increased. As the demand for healthcare services increases, so does the necessity of constructing new healthcare buildings and redesigning and renovating existing ones. Increasing demand necessitates the use of optimization techniques to improve overall service efficiency in healthcare settings. However, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department. The ProM framework is used to discover clinical pathway patterns and the relationships between activities. The sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results served as input for the next phase, which consists of the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicated that a carefully designed layout can significantly decrease the distances that patients must travel.

Keywords: mixed integer programming, facility layout problem, process mining, healthcare operation management
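A small sketch of the process-mining step that feeds the layout model: estimating activity-to-activity transition frequencies (a Markov-chain view of patient flow) from an event log. The toy log and activity names are invented; in the study this role is played by the ProM sequence-clustering plug-in on real ED data.

```python
# Derive a row-normalised transition matrix from a toy patient event log.
import pandas as pd

log = pd.DataFrame({
    "case":     [1, 1, 1, 2, 2, 2, 2, 3, 3],
    "activity": ["triage", "x-ray", "discharge",
                 "triage", "lab", "x-ray", "discharge",
                 "triage", "discharge"],
})

pairs = []
for _, events in log.groupby("case"):          # events stay in recorded order
    acts = events["activity"].tolist()
    pairs.extend(zip(acts[:-1], acts[1:]))

froms, tos = zip(*pairs)
counts = pd.crosstab(pd.Series(froms, name="from"), pd.Series(tos, name="to"))
transition_prob = counts.div(counts.sum(axis=1), axis=0)  # Markov-chain estimate
print(transition_prob.round(2))
```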

Procedia PDF Downloads 332
3308 An Optimized Association Rule Mining Algorithm

Authors: Archana Singh, Jyoti Agarwal, Ajay Rana

Abstract:

Data mining is an efficient technology for discovering patterns in large databases. Association rule mining techniques are used to find the correlations between the various item sets in a database, and these correlations are used in decision making and pattern analysis. In recent years, the problem of finding association rules from large datasets has been addressed by many researchers. Various research papers on association rule mining (ARM) were first studied and analyzed to understand the existing algorithms. The Apriori algorithm is the basic ARM algorithm, but it requires many database scans. The DIC algorithm needs fewer database scans but uses a complex lattice data structure. The main focus of this paper is to propose a new optimized algorithm (the Friendly Algorithm) and compare its performance with the existing algorithms. A data set is used to find frequent itemsets and association rules with the help of the existing and proposed algorithms, and it has been observed that the proposed algorithm finds all the frequent itemsets and essential association rules from databases with fewer database scans than the existing algorithms. In the proposed algorithm, an optimized data structure is used, i.e., a graph and an adjacency matrix.

Keywords: association rules, data mining, dynamic item set counting, FP-growth, friendly algorithm, graph
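To make the database-scan cost being compared concrete, here is a compact, level-wise Apriori-style counting sketch in which each pass over the transactions is one scan; the proposed Friendly Algorithm's graph and adjacency-matrix structure is not reproduced here.

```python
# Level-wise frequent-itemset mining: one database scan per candidate level.
from itertools import combinations

def apriori(transactions, min_support=2):
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    candidates = [frozenset([i]) for i in items]
    frequent, k = {}, 1
    while candidates:
        # one full database scan per level k
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        k += 1
        prev = list(level)
        candidates = list({a | b for a, b in combinations(prev, 2)
                           if len(a | b) == k})
    return frequent

if __name__ == "__main__":
    db = [{"bread", "milk"}, {"bread", "butter"},
          {"bread", "milk", "butter"}, {"milk", "butter"}]
    for itemset, support in apriori(db).items():
        print(sorted(itemset), support)
```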

Procedia PDF Downloads 411
3307 Computational Investigation of Secondary Flow Losses in Linear Turbine Cascade by Modified Leading Edge Fence

Authors: K. N. Kiran, S. Anish

Abstract:

It is well known that secondary flow losses account for about one third of the total loss in any axial turbine. Modern gas turbine blades have smaller heights and longer chord lengths, which might lead to an increase in secondary flow. In order to improve the efficiency of the turbine, it is important to understand the behavior of secondary flow and devise mechanisms to curtail these losses. The objective of the present work is to understand the effect of a streamwise end-wall fence on the aerodynamics of a linear turbine cascade. The study is carried out computationally using the commercial software ANSYS CFX. The effects of the end-wall fence on the flow field are calculated from RANS simulations using the SST transition turbulence model. The Durham cascade, which is similar to a high-pressure axial flow turbine, is used for the simulation. The aim of fencing in the blade passage is to obtain the maximum benefit from flow deviation and to destroy the passage vortex in terms of loss reduction. It is observed that, for the present analysis, a fence in the blade passage helps reduce the strength of the horseshoe vortex and is capable of restraining the flow along the blade passage. The fence in the blade passage helps reduce the underturning by 70 in comparison with the base case. A fence on the end-wall is effective in preventing the movement of the pressure-side leg of the horseshoe vortex and helps break the passage vortex. Computations are carried out for different fence heights whose curvature differs from the blade camber. The optimum fence geometry and location reduce the loss coefficient by 15.6% in comparison with the base case.

Keywords: boundary layer fence, horseshoe vortex, linear cascade, passage vortex, secondary flow

Procedia PDF Downloads 342
3306 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis

Authors: Shriya Shukla, Lachin Fernando

Abstract:

Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.

Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning
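A sketch of the MobileNetV2 transfer-learning classifier in TensorFlow/Keras. The directory path, image size, and training settings are assumptions for illustration, and the VAE-GAN used to synthesise additional normal CXRs is a separate model that is not reproduced here.

```python
# Transfer learning with a frozen MobileNetV2 backbone for binary CXR
# classification (pneumonia vs normal). "cxr/train" is a hypothetical path.
import tensorflow as tf

IMG_SIZE = (224, 224)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cxr/train", image_size=IMG_SIZE, batch_size=32)

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False                      # freeze the pretrained backbone

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),      # pneumonia vs normal
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```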

Procedia PDF Downloads 74
3305 Using an Epidemiological Model to Study the Spread of Misinformation during the Black Lives Matter Movement

Authors: Maryam Maleki, Esther Mead, Mohammad Arani, Nitin Agarwal

Abstract:

The proliferation of social media platforms like Twitter has heightened the consequences of the spread of misinformation. To understand and model the spread of misinformation, in this paper, we leveraged the SEIZ (Susceptible, Exposed, Infected, Skeptics) epidemiological model to describe the underlying process that delineates the spread of misinformation on Twitter. Compared to the other epidemiological models, this model produces broader results because it includes the additional Skeptics (Z) compartment, wherein a user may be Exposed to an item of misinformation but not engage in any reaction to it, and the additional Exposed (E) compartment, wherein the user may need some time before deciding to spread a misinformation item. We analyzed misinformation regarding the unrest in Washington, D.C. in the month of March 2020, which was propagated by the use of the #DCblackout hashtag by different users across the U.S. on Twitter. Our analysis shows that misinformation can be modeled using the concept of epidemiology. To the best of our knowledge, this research is the first to attempt to apply the SEIZ epidemiological model to the spread of a specific item of misinformation, which is a category distinct from that of rumor and hoax on online social media platforms. Applying a mathematical model can help to understand the trends and dynamics of the spread of misinformation on Twitter and ultimately help to develop techniques to quickly identify and control it.

Keywords: Black Lives Matter, epidemiological model, mathematical modeling, misinformation, SEIZ model, Twitter
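A minimal sketch of the SEIZ compartmental dynamics integrated with SciPy; the transition rates, initial populations, and time horizon are illustrative placeholders rather than the values fitted to the #DCblackout Twitter data.

```python
# SEIZ (Susceptible, Exposed, Infected, Skeptic) dynamics as an ODE system.
import numpy as np
from scipy.integrate import odeint

def seiz(y, t, beta, b, rho, epsilon, l, p):
    S, E, I, Z = y
    N = S + E + I + Z
    dS = -beta * S * I / N - b * S * Z / N
    dE = (1 - p) * beta * S * I / N + (1 - l) * b * S * Z / N \
         - rho * E * I / N - epsilon * E
    dI = p * beta * S * I / N + rho * E * I / N + epsilon * E
    dZ = l * b * S * Z / N
    return [dS, dE, dI, dZ]

t = np.linspace(0, 30, 300)                       # days
y0 = [10000, 10, 5, 50]                           # S, E, I, Z at t = 0
params = (0.3, 0.2, 0.1, 0.05, 0.4, 0.3)          # beta, b, rho, epsilon, l, p
S, E, I, Z = odeint(seiz, y0, t, args=params).T
print("infected (spreaders) at day 30:", int(I[-1]))
```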

Procedia PDF Downloads 152
3304 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Therefore, automatic biometric recognition systems are the need of the hour, and ECG-based systems are unquestionably the best choice due to their appealing inherent characteristics. CNNs are the recent state-of-the-art techniques for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using CNN's dense learning framework. The proposed research explicitly explores the calibre of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs trained on a dataset of recordings collected from eight popular ECG databases. With the highest FAR of 0.04 percent and the highest FRR of 5 percent, the best-performing network achieved an identification accuracy of 99.94 percent. The best network is also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.

Keywords: biometrics, dense networks, identification rate, train/test split ratio

Procedia PDF Downloads 155
3303 Carbon Nitride Growth on ZnO Architectures for Enhanced Photoelectrochemical Water Splitting Application

Authors: Špela Hajduk, Sean P. Berglund, Matejka Podlogar, Goran Dražić, Fatwa F. Abdi, Zorica C. Orel, Menny Shalom

Abstract:

Graphitic carbon nitride (g-CN) materials have emerged as attractive photocatalysts and electrocatalysts for the photo- and electrochemical water splitting reaction, due to their environmentally benign nature and suitable band gap. Many approaches have been introduced to enhance the photoactivity and electronic properties of g-CN, resulting in significant changes in its electronic and catalytic properties. Here we demonstrate the synthesis of a thin and homogeneous g-CN layer on a highly ordered ZnO nanowire (NW) substrate by growing a seeding layer of small supramolecular assemblies on the nanowires. The new synthetic approach leads to the formation of a thin g-CN layer (~3 nm) without blocking the entire structure. Two different deposition methods for the carbon nitride were investigated and will be presented. The amount of loaded carbon nitride significantly influences the PEC activity of the hybrid material, and all the ZnO/g-CNx electrodes show great improvement in photoactivity. The chemical structure, morphology, and optical properties of the deposited g-CN were fully characterized by various techniques, such as X-ray powder diffraction (XRD), scanning electron microscopy (SEM), focused ion beam scanning electron microscopy (FIB-SEM), high-resolution transmission electron microscopy (HR-TEM), and X-ray photoelectron spectroscopy (XPS).

Keywords: carbon nitride, photoanode, solar water splitting, zinc oxide

Procedia PDF Downloads 185
3302 Online Graduate Students’ Perspective on Engagement in Active Learning in the United States

Authors: Ehi E. Aimiuwu

Abstract:

As of 2017, many researchers in educational journals were still wondering whether students are effectively and efficiently engaged in active learning in the online learning environment. The goal of this qualitative single case study and narrative research is to explore whether students are actively engaged in their online learning. Seven online students in the United States, recruited from LinkedIn and residencies, were interviewed for this study. Eleven online learning techniques from the research literature were used as a framework. Data collection tools for the study included a digital audiotape, an observation sheet, an interview protocol, transcription, and the NVivo 12 Plus qualitative software. A data analysis process, member checking, and key themes were used to reach saturation. About 85.7% of students preferred individual grading. About 71.4% of students valued the professor interacting 2-3 times weekly, participating through posts and responses, having good internet access, and using email. Also, about 57.1% said that students log in 2-3 times weekly to daily, that the professor's social presence helps, and that regular punctuality in work submission matters, and they preferred assessment styles of research, essay, and case study. About 42.9% appreciated the usefulness of the syllabus and the professor's expertise.

Keywords: class facilitation, course management, online teaching, online education, student engagement

Procedia PDF Downloads 122
3301 Mixed Traffic Speed–Flow Behavior under Influence of Road Side Friction and Non-Motorized Vehicles: A Comparative Study of Arterial Roads in India

Authors: Chetan R. Patel, G. J. Joshi

Abstract:

The present study is carried out on six-lane divided urban arterial roads in the cities of Patna and Pune, India. The two roads have distinct differences in terms of vehicle composition and roadside parking. The arterial road in Patna has 33% non-motorized modes, whereas the Pune arterial road is dominated by two-wheelers at 65%. Roadside parking is also observed in Patna. Field studies using videographic techniques are carried out for traffic data collection. Data are extracted at one-minute intervals for vehicle composition, speed variation, and flow rate on the selected arterial roads of the two cities. The speed-flow relationship is developed and the capacity is determined. An equivalency factor in terms of dynamic car units is determined to represent each vehicle as a single unit. The variation in capacity due to side friction, the presence of non-motorized traffic, and the effective utilization of lane width is compared in the concluding remarks.

Keywords: arterial road, capacity, dynamic equivalency factor, effect of non motorized mode, side friction

Procedia PDF Downloads 343