Search results for: web usage mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2908


898 Aerodynamic Heating and Drag Reduction of Pegasus-XL Satellite Launch Vehicle

Authors: Syed Muhammad Awais Tahir, Syed Hossein Raza Hamdani

Abstract:

In the last two years, there has been a substantial increase in the rate of satellite launches. To keep up with this pace, launch costs must be made affordable, especially for developing and underdeveloped countries. Launch cost is directly affected by the launch vehicle’s aerodynamic performance. The Pegasus-XL SLV (Satellite Launch Vehicle) has served as a commercial SLV for the last 26 years, operating from six sites across the US, Europe, and the Marshall Islands. Aerodynamic heating and drag contribute largely to Pegasus’s flight performance. The objective of this study is to significantly reduce the aerodynamic heating and drag on Pegasus’s body in the supersonic and hypersonic flight regimes. Aerodynamic data for Pegasus’s first flight has been validated through CFD (Computational Fluid Dynamics), and drag and aerodynamic heating are then reduced by using a combination of a forward-facing cylindrical spike and a conical aero-disk at the actual operational flight conditions. CFD analysis using ANSYS Fluent will be carried out for Mach numbers ranging from 0.83 to 7.8 and AoA (Angle of Attack) ranging from -4 to +24 degrees for both the simple and spiked configurations, and comparisons will be drawn using a variety of graphs and contours. Drag reduction is expected to be around 15% to 25% for supersonic flight and around 30% to 50% for hypersonic flight, especially for AoA < 15°. A 5% to 10% reduction in aerodynamic heating is expected for the hypersonic regime. In conclusion, the aerodynamic performance of the air-launched Pegasus-XL SLV can be further enhanced, leading to optimal fuel usage and a more economical orbital flight.

Keywords: aerodynamics, pegasus-XL, drag reduction, aerodynamic heating, satellite launch vehicle, SLV, spike, aero-disk

Procedia PDF Downloads 97
897 The Translation of Code-Switching in African Literature: Comparing the Two German Translations of Ngugi wa Thiong’o’s "Petals of Blood"

Authors: Omotayo Olalere

Abstract:

The relevance of code-switching for intercultural communication through literary translation cannot be overemphasized. The translation of code-switching and its implications for translation studies have been examined in the context of African literature, but in such cases code-switching was considered in the general terms of its usage in source texts, not specifically in Ngugi’s novels and their translations. In addition, the functions of translation and code-switching in the lyrics of some popular African songs have been studied, but that work relates more to oral performance than to written literature. As such, little has been done on the German translation of code-switching in African works. This study intends to fill this lacuna by examining the concept of code-switching in the two German translations of Ngugi’s Petals of Blood. The aim is to highlight the significance of code-switching as a phenomenon in this African (Ngugi’s) novel written in English and to focus on its representation in the two German translations, Verbrannte Blueten and Land der flammenden Blueten. “Abrogation” as a concept will play an important role in the analysis of the data. Findings will show that the ideology of a translator plays a huge role in representing the concept of “abrogation” in the translation of code-switching in the selected source text. The study will contribute to knowledge in translation studies by bringing to the limelight the need to foreground aspects of language contact in translation theory and practice, particularly in the African context. Relevant translation theories adopted for the study include Bandia’s (2008) postcolonial theory of translation and Snell-Hornby’s (1988) cultural translation theory.

Keywords: code switching, german translation, ngugi wa thiong’o, petals of blood

Procedia PDF Downloads 74
896 Influence of Gender, Race, and Psychiatric Disorders on Sun Protective Behavior and Outcomes: A Population-Based Study

Authors: Holly D. Shan, Monique L. Bautista Neughebauer

Abstract:

Sunscreen usage is emphasized in public health strategy as it reduces the risk of sunburns and skin cancers. This study aims to explore factors that influence sun protective behavior and outcomes. Data were obtained from the National Health Interview Survey (NHIS) 2020, in which adults were asked how often they wore sunscreen when outside on a sunny day. Consistent use (“always”) of sunscreen, the incidence of sunburn within a year, and ever having a diagnosis of skin melanoma were compared by gender, race, and diagnoses of anxiety, depression, and dementia. Individuals identifying as mixed race were excluded. Statistical analysis was adjusted for large-scale surveys using STATA VSN 7.0, and a two-sided p<0.05 was considered significant. Of the 37,352 participants (53.18% female, 75.01% white, 10.49% black, 0.76% Indian American, 5.60% Asian), 13.11% had a diagnosis of anxiety, 14.78% depression, and 0.84% dementia. Females wore sunscreen more often than males (24.72% vs. 10.91%, p<0.001). White individuals wore sunscreen most frequently and black individuals the least (17.37% vs. 6.49%, p<0.001). White individuals had the highest rate of sunburn (25.61%, p<0.001) and of a history of skin melanoma (3.38%, p<0.001). Participants with anxiety, depression, or dementia all had statistically significantly decreased sunscreen use and increased frequency of sunburn compared to the general population. Only those with dementia had an increased incidence of skin melanoma (2.85% vs. 1.22%, p=0.009). Dermatologists and public health professionals should consider gender, race, and psychiatric comorbidities when counseling patients on sun protection.
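The gender comparison above (24.72% vs. 10.91% consistent sunscreen use) can be illustrated with a simple two-proportion z-test. The sketch below reconstructs hypothetical counts from the reported percentages and ignores the survey weighting applied in the actual Stata analysis, so it only approximates the published result.

```python
from math import sqrt
from scipy.stats import norm

# Hypothetical counts reconstructed from the reported percentages
# (37,352 participants, 53.18% female; 24.72% of females and 10.91%
# of males reported always wearing sunscreen).
n_f = round(37352 * 0.5318)   # females
n_m = 37352 - n_f             # males
x_f = round(n_f * 0.2472)     # female consistent users
x_m = round(n_m * 0.1091)     # male consistent users

# Two-proportion z-test with a pooled standard error.
p_f, p_m = x_f / n_f, x_m / n_m
p_pool = (x_f + x_m) / (n_f + n_m)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_f + 1 / n_m))
z = (p_f - p_m) / se
p_value = 2 * norm.sf(abs(z))

print(f"z = {z:.1f}, p = {p_value:.3g}")  # p far below 0.001
```

With these counts the difference is many standard errors wide, consistent with the reported p<0.001.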

Keywords: sun protective behavior, psychiatric disorder, melanoma, sunburn

Procedia PDF Downloads 83
895 Comparative Study Using WEKA for Red Blood Cells Classification

Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are the most common type of blood cell and the most intensively studied in cell biology. A deficiency of RBCs, in which the hemoglobin level is lower than normal, is referred to as “anemia”; abnormalities in RBCs affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying RBCs as normal or abnormal (anemic) using WEKA, an open-source suite of machine learning algorithms for data mining applications. The algorithms tested are the Radial Basis Function neural network, the Support Vector Machine, and the K-Nearest Neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether a tested blood cell is spherical or non-spherical; the second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, which was obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the Support Vector Machine, the Radial Basis Function neural network, and the K-Nearest Neighbors algorithm, respectively.
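A comparison of this kind can be sketched outside WEKA as well. The snippet below is a minimal stand-in using scikit-learn on synthetic feature vectors, since the Serdang Hospital RBC dataset is not public; WEKA's RBFNetwork has no direct scikit-learn counterpart, so only SVM and KNN analogues are shown.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the combined geometrical + textural RBC features.
X, y = make_classification(n_samples=400, n_features=12,
                           n_informative=6, random_state=0)

classifiers = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}

# 5-fold cross-validated accuracy, analogous to WEKA's evaluation output.
scores = {}
for name, clf in classifiers.items():
    scores[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {scores[name]:.2%}")
```

Scaling inside a pipeline matters here: both SVMs and KNN are distance-based, so unscaled geometrical and textural features on different ranges would dominate each other.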

Keywords: K-nearest neighbors algorithm, radial basis function neural network, red blood cells, support vector machine

Procedia PDF Downloads 400
894 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for the Extraction and Recovery of Metal from Ores

Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter

Abstract:

Over the past few decades, the demand for metals has increased significantly. This has led to a decline in high-grade ore over time and an increase in mineral complexity and matrix heterogeneity, alongside rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that can economically process and recover metallic values from low-grade ores, from materials with metal content locked up in industrially processed residues (tailings and slag), and from complex-matrix mineral deposits. Several methods to address these issues have been developed, among which are ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced using an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas phase extraction has been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all the promises of a “greener” process.

Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment

Procedia PDF Downloads 120
893 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations

Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar

Abstract:

Fast simulations are critical in reducing time to market for CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip, and researchers and designers rely upon them for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce cache misses, we apply software optimization techniques such as removal of unused functions, loop interchange, and replacement of the post-increment operator with the pre-increment operator for non-primitive data types; these reduced cache misses by 18.52%, 5.34%, and 3.91%, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0, using the OpenMP programming model and SIMD for the more time-consuming portions of the code. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.

Keywords: cache behaviour, network-on-chip, performance profiling, vectorization

Procedia PDF Downloads 186
892 Impact of Artificial Intelligence Technologies on Information-Seeking Behaviors and the Need for a New Information Seeking Model

Authors: Mohammed Nasser Al-Suqri

Abstract:

Existing information-seeking models were proposed more than two decades ago, prior to the digital information era and Artificial Intelligence (AI) technologies. The lack of current information-seeking models within Library and Information Studies (LIS) has limited advancements in teaching students about information-seeking behaviors and in the design of library tools and services. To better address these concerns, this study aims to propose a state-of-the-art model focused on the information-seeking behavior of library users in the Sultanate of Oman. Specifically, it aims to develop, design, and contextualize a real-time, user-centric information-seeking model capable of enhancing information needs and information usage while incorporating critical insights for digital library practices, and to establish a far-sighted frame of reference covering AI while synthesizing digital resources and information for optimizing information-seeking behavior. The study is empirically designed around a mixed-methods process flow comprising technical surveys, in-depth interviews, focus group evaluations, and stakeholder investigations. The study data pool consists of users and specialist LIS staff at 4 public libraries and 26 academic libraries in Oman. The designed research model is expected to facilitate LIS by providing multi-dimensional insights with AI integration, redefining the information-seeking process, and developing a technology-rich model.

Keywords: artificial intelligence, information seeking, information behavior, information seeking models, libraries, Sultanate of Oman

Procedia PDF Downloads 106
891 'Typical' Criminals: A Schutzian Influenced Theoretical Framework Exploring Type and Stereotype Formation

Authors: Mariam Shah

Abstract:

The way the human mind interprets and comprehends the world it occupies has long been a topic of discussion amongst philosophers and phenomenologists. This paper will focus predominantly on the ideas espoused by the phenomenologist Alfred Schutz and will investigate how we attribute meaning to an event through the process of typification and the production and usage of ‘types’ and ‘stereotypes.’ It will then discuss how subjective ideologies innate within us result in unique and subjective decision outcomes, based on a phenomenologically influenced theoretical framework which illustrates how we form ‘types’ in order to ‘typecast’ and form judgements of everything and everyone we experience. The framework, founded in theory espoused by Alfred Schutz, will review the different kinds of knowledge we rely on innately to inform our judgements, the relevance we attribute to the information we acquire, and how we consciously and unconsciously apply this framework to everyday situations. An assessment will then be made of the potential impact that these subjective meaning structures can have when dispensing justice in criminal courts. This paper will investigate how these meaning structures can influence judgement on both a conscious and an unconscious level, and how this could potentially result in biased judicial outcomes due to negative ‘types’ or ‘stereotypes.’ This paper will ultimately illustrate that we unconsciously and unreflexively use pre-formed types and stereotypes to inform our judgements and give meaning to what we have just experienced.

Keywords: Alfred Schutz, criminal courts, decision making, judicial decision making, phenomenology, Schutzian stereotypes, types, typification

Procedia PDF Downloads 218
890 Sustainable Management of Water and Soil Resources for Agriculture in Dry Areas

Authors: Alireza Nejadmohammad Namaghi

Abstract:

Investigators have reported that mulches increase production potential in arid and semi-arid lands. Mulches are covering materials used on the soil surface for irrigation efficiency, erosion control, weed control, reduced evaporation, and improved water penetration; the aim and the local conditions determine the kind of material that can be used. In this research, we tested different mulches, including chemical mulch (Aquasorb polymer, M1), manure mulch (M2), residue mulch (M3), and polyethylene mulch (M4), against a no-mulch control (M0), for their effects on germination, biomass dry matter, and cottonseed yield (Varamin variety) in the Kashan area. A randomized complete block (RCB) design was used to measure cotton yield, with 3 replications for biomass dry matter and 4 replications for yield, under two irrigation intervals of 7 and 14 days. Germination percentages for the M0, M1, M2, M3, and M4 treatments were 64, 65, 76, 57, and 72%, respectively. Average biomass dry matter for the M0, M1, M2, M3, and M4 treatments was 276, 306, 426, 403, and 476 grams per plot, respectively. The M4 treatment (polyethylene mulch) had the greatest effect; M2 and M3 did not differ significantly, nor did M0 and M1. Average total yield under 7-day irrigation for the M0, M1, M2, M3, and M4 treatments was 700, 725, 857, 1057, and 1273 grams per plot, respectively. Duncan's multiple range test showed no significant difference among M0, M1, M2, and M3, but M4 had the greatest effect on yield. Average total yield under 14-day irrigation for the M0, M1, M2, M3, and M4 treatments was 535, 507, 690, 957, and 1047 grams per plot, respectively; here there were significant differences between all treatments and the control. Results showed that using mulches under reduced irrigation in dry conditions can increase yield significantly.

Keywords: mulch, cotton, arid land management, irrigation systems

Procedia PDF Downloads 75
889 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It is frequently observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test itself performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes, and intra-cluster correlations. The test maintained the desired Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.
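The simulation design described, generating data under the null and checking whether a GOF test rejects at roughly its nominal rate, can be illustrated with a much-simplified sketch. The code below uses a plain Pearson chi-square test of homogeneity on cluster-level binomial counts rather than the multilevel GOF test from the paper, and all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

def type1_error(n_clusters=50, cluster_size=20, p=0.3,
                n_sim=2000, alpha=0.05):
    """Empirical Type-I error of a chi-square homogeneity statistic on
    cluster-level success counts generated under the null (no cluster
    effect) -- a toy analogue of the paper's simulation design."""
    rejections = 0
    crit = chi2.ppf(1 - alpha, df=n_clusters - 1)
    for _ in range(n_sim):
        successes = rng.binomial(cluster_size, p, size=n_clusters)
        p_hat = successes.sum() / (n_clusters * cluster_size)
        expected = cluster_size * p_hat
        # Pearson statistic across clusters, df = n_clusters - 1.
        stat = np.sum((successes - expected) ** 2
                      / (expected * (1 - p_hat)))
        if stat > crit:
            rejections += 1
    return rejections / n_sim

rate = type1_error()
print(f"empirical Type-I error ~ {rate:.3f}")  # close to nominal 0.05
```

A well-calibrated test should reject close to 5% of the time here; a rate far above or below the nominal level is the kind of failure the study detects for the MQL estimators.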

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 134
888 Network Analysis of Genes Involved in the Biosynthesis of Medicinally Important Naphthodianthrone Derivatives of Hypericum perforatum

Authors: Nafiseh Noormohammadi, Ahmad Sobhani Najafabadi

Abstract:

Hypericins (hypericin and pseudohypericin) are natural naphthodianthrone derivatives produced by Hypericum perforatum (St. John’s Wort), which have many medicinal properties, including antitumor, antineoplastic, antiviral, and antidepressant activities. Production and accumulation of hypericin in the plant are influenced by both genetic and environmental conditions. Despite the existence of different high-throughput data on the plant, the genetic dimensions of hypericin biosynthesis have not yet been completely understood. In this research, 21 high-quality RNA-seq datasets on different parts of the plant were integrated with metabolic data to reconstruct a coexpression network. Results showed that a cluster of 30 transcripts was correlated with total hypericin. The identified transcripts were divided into main groups based on their functions: hypericin biosynthesis genes, transporters, detoxification genes, and transcription factors (TFs). In the biosynthetic group, different isoforms of polyketide synthases (PKSs) and phenolic oxidative coupling proteins (POCPs) were identified. Phylogenetic analysis of protein sequences integrated with gene expression analysis showed that some of the POCPs seem to be very important in the biosynthetic pathway of hypericin. In the TFs group, six TFs were correlated with total hypericin, and qPCR analysis confirmed that three of them were highly correlated. The genes identified in this research are a rich resource for further studies on the molecular breeding of H. perforatum in order to obtain varieties with high hypericin production.
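The core of the coexpression step, correlating each transcript's expression with total hypericin and keeping those above a cutoff, can be sketched as follows. The data, the transcript names, and the |r| > 0.6 cutoff are all hypothetical stand-ins, since the actual RNA-seq and metabolite data are not shown.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Toy stand-in: expression of 200 transcripts across 21 samples, plus a
# total-hypericin measurement per sample.
n_samples, n_transcripts = 21, 200
expr = pd.DataFrame(rng.normal(size=(n_samples, n_transcripts)),
                    columns=[f"T{i}" for i in range(n_transcripts)])
# Plant "T0" as a transcript strongly driving hypericin content.
hypericin = expr["T0"] * 2.0 + rng.normal(scale=0.3, size=n_samples)

# Pearson correlation of every transcript with total hypericin; keep
# transcripts above an arbitrary |r| > 0.6 cutoff as cluster candidates.
r = expr.corrwith(hypericin)
candidates = r[r.abs() > 0.6].sort_values(ascending=False)
print(candidates.head())
```

The planted driver transcript should appear at the top of the candidate list; in real data, such candidates would then be grouped by annotated function (biosynthesis genes, transporters, TFs) as in the abstract.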

Keywords: hypericin, St. John’s Wort, data mining, transcription factors, secondary metabolites

Procedia PDF Downloads 80
887 A Fast Community Detection Algorithm

Authors: Chung-Yuan Huang, Yu-Hsiang Fu, Chuen-Tsai Sun

Abstract:

Community detection represents an important data-mining tool for analyzing and understanding real-world complex network structures and functions. We believe that at least four criteria determine the appropriateness of a community detection algorithm: (a) it produces useable normalized mutual information (NMI) and modularity results for social networks, (b) it overcomes resolution limitation problems associated with synthetic networks, (c) it produces good NMI results and performance efficiency for Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks, and (d) it produces good modularity and performance efficiency for large-scale real-world complex networks. To our knowledge, no existing community detection algorithm meets all four criteria. In this paper, we describe a simple hierarchical arc-merging (HAM) algorithm that uses network topologies and rule-based arc-merging strategies to identify community structures that satisfy the criteria. We used five well-studied social network datasets and eight sets of LFR benchmark networks to validate the ground-truth community correctness of HAM, eight large-scale real-world complex networks to measure its performance efficiency, and two synthetic networks to determine its susceptibility to resolution limitation problems. Our results indicate that the proposed HAM algorithm is capable of providing satisfactory performance efficiency and that HAM-identified communities were close to ground-truth communities in social and LFR benchmark networks while overcoming resolution limitation problems.
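The evaluation workflow described, running a detector on a ground-truth network and scoring it with NMI and modularity, can be sketched as follows. HAM itself is not publicly packaged, so NetworkX's greedy modularity algorithm stands in for it on Zachary's karate club, a standard small benchmark with known factions.

```python
import networkx as nx
from networkx.algorithms import community
from sklearn.metrics import normalized_mutual_info_score

# Zachary's karate club with its known two-faction ground truth.
G = nx.karate_club_graph()
truth = [0 if G.nodes[n]["club"] == "Mr. Hi" else 1 for n in G]

# Stand-in detector: Clauset-Newman-Moore greedy modularity maximization.
communities = community.greedy_modularity_communities(G)
labels = [0] * len(G)
for cid, members in enumerate(communities):
    for n in members:
        labels[n] = cid

# Score against ground truth (NMI) and by partition quality (modularity).
nmi = normalized_mutual_info_score(truth, labels)
Q = community.modularity(G, communities)
print(f"NMI = {nmi:.2f}, modularity = {Q:.2f}")
```

The same two scores, NMI against ground-truth communities and modularity of the detected partition, are the first two of the four criteria the abstract lists.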

Keywords: complex network, social network, community detection, network hierarchy

Procedia PDF Downloads 217
886 Oral Administration of Azithromycin Ameliorates Trypanosomosis in Trypanosoma congolense and T. Brucei Brucei Infected Mice

Authors: Nthatisi I. Molefe-Nyembe, Keisuke Suganuma, Oriel M. M. Thekisoe, Xuan Xuenan, Noboru Inoue

Abstract:

African trypanosomosis is a devastating disease of animals caused by parasites of the genus Trypanosoma, negatively affecting the economic status of more than 36 African countries. The few available drugs for the treatment of trypanosomosis remain inaccessible in remote areas and are associated with severe toxicity; most importantly, resistance to them has become widespread. Therefore, safe, effective, and easily administrable drugs are urgently needed. The objective of the current study was to determine the efficacy of azithromycin (AZM) against T. congolense and T. brucei brucei in vitro and in vivo. A 96-well luciferase assay was conducted to determine the trypanocidal effect of AZM on T. congolense, T. b. brucei, and T. evansi, as well as its cytotoxic effect on MDBK and NIH 3T3 cells. Additionally, TEM analysis was conducted to determine morphological alterations in the AZM-treated samples. Mice were infected with T. congolense and T. b. brucei and orally treated with AZM for 7 and 28 days, referred to as the short- and long-term treatments. The in vitro IC50 values of AZM on T. congolense, T. b. brucei, and T. evansi were 0.19 ± 0.17, 3.69 ± 2.26, and 1.81 ± 1.82 μg/mL, respectively, while the cytotoxicity values were greater than 25 μg/mL. A vacuole-like structure was observed in TEM imaging of AZM-treated T. congolense, while glycosomes and acidocalcisome-like structures were detected in the T. b. brucei samples. In vivo, AZM was more effective against T. congolense-infected mice than against T. b. brucei-infected mice. In conclusion, AZM exhibited trypanocidal effects in T. congolense- and T. b. brucei-infected mice; however, further studies are necessary to determine the metabolic pathway responsible for the observed efficacy.

Keywords: animal trypanosomosis, azithromycin, oral administration, trypanosoma congolense

Procedia PDF Downloads 59
885 Social Media Factor in Security Environment

Authors: Cetin Arslan, Senol Tayan

Abstract:

Social media is one of the most important and effective means of social interaction, allowing people to create, share, and exchange ideas via photos, videos, or voice messages. Although there are many communication tools, social media sites are the most prominent, allowing users to articulate themselves to the whole world in a matter of seconds at almost no expense; thus, they became very popular and widespread soon after their emergence. As the usage of social media increases, it becomes an effective instrument in social matters. While it is possible to use social media to emphasize basic human rights and protest the failures of a government, as in the “Arab Spring”, it is also possible to spread propaganda and misinformation to cause long-lasting insurgency, upheaval, turmoil, or disorder as an instrument of intervention in internal affairs and state sovereignty by hostile groups or countries. It is certain that social media has positive effects on democracies, letting people express themselves and organize, but it is also obvious that its misuse is very common; even a five-minute-long video can be used to wage a campaign against a country. Although it may look anti-democratic, when one considers the catastrophic effects of the misuse of social media, it is an area where serious precautions must be taken without limiting democratic rights, allowing constant and perpetual sharing while preventing criminal activity. This article begins with current developments in social media and gives some examples of its misuse. The second part emphasizes the legal basis for preventing criminal activities, upheavals, and insurgencies against state security. The last part compares democratic countries’ and international organizations’ actions against such activities and proposes further actions compatible with democratic norms.

Keywords: democracy, disorder, security, social media

Procedia PDF Downloads 358
884 History of Textiles and Fashion: Gender Symbolism in the Context of Colour

Authors: Damayanthie Eluwawalage

Abstract:

Historically, color-coded attire demarcated differences, for example, in social position and gender. Distinctive colors were worn by different classes in medieval England. By the twentieth century, Western society firmly associated certain colors with a specific gender: pink for girls and blue for boys. The color-coded gender phenomenon was a novelty at the turn of the twentieth century and became widely practiced after World War II. Prior to that era, there were no distinctions in the dress of younger children in relation to their gender. In the nineteenth century, pink suits were highly acceptable as gentlemen’s attire. Frenchmen in the eighteenth century wore colors with an infinite range of hues, such as pink, plum, white, cream, blue, yellow, puce, and sea green. Nineteenth-century European male austerity, expressed primarily through the use of sombre colors such as black, white, and grey, has been described as an element of dignity, control, and morality. In the nineteenth century, there were many color-associated distinctions, as certain colors were reserved for the unmarried or the aged. Two luminous colors in one dress were considered ‘vulgar’, and yellow was generally regarded as unladylike; yellow was the color utilized for most correctional attire, and orange was prohibited for the unmarried. Fashionable dressing in the nineteenth century was more gender-differentiated than in previous centuries. Masculine austerity emphasized a shift in class relations; as a result of that shift, male attire became more uniform, homogeneous, and integrated amongst the classes than in its traditional hierarchical approach.

Keywords: textiles, fashion, gender symbolism, color

Procedia PDF Downloads 478
883 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network

Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi

Abstract:

Energy, delay, and bandwidth are the prime concerns of a wireless sensor network (WSN); optimizing energy usage and utilizing bandwidth efficiently are therefore important issues. Event-triggered data aggregation facilitates such optimization for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from the sensor nodes within the cluster. (3) The cluster head identifies and classifies the events in the collected data using a Bayesian classifier. (4) Data are aggregated using a statistical method. (5) The cluster head discovers paths to the sink node using residual energy, path distance, and bandwidth. (6) If the aggregated data are critical, the cluster head sends them over multiple paths for reliable data communication. (7) Otherwise, the aggregated data are transmitted towards the sink node over the single path that has the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
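The numbered steps can be sketched in miniature as follows. All node values, thresholds, and the energy-based head election rule shown are illustrative assumptions; the paper's Bayesian classification and multipath routing steps are reduced to stubs.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Node:
    node_id: int
    residual_energy: float
    readings: list

# Hypothetical event-affected cluster; all numbers are illustrative.
cluster = [
    Node(1, 0.9, [30.1, 30.3]),
    Node(2, 0.4, [29.8, 30.0]),
    Node(3, 0.7, [30.2, 30.4]),
]

# Step 1: elect the member with the highest residual energy as cluster
# head (one plausible election rule; the paper does not fix one).
head = max(cluster, key=lambda n: n.residual_energy)

# Steps 2 and 4: the head gathers readings and aggregates them with a
# simple statistical summary (mean), suppressing redundant transmissions.
aggregate = mean(r for n in cluster for r in n.readings)

# Steps 5-7: route choice stub -- multipath for critical data, otherwise
# the single best path by bandwidth and residual energy.
CRITICAL_THRESHOLD = 35.0
route = "multipath" if aggregate > CRITICAL_THRESHOLD else "single-path"
print(head.node_id, round(aggregate, 2), route)
```

Aggregating at the head means one summary value crosses the network instead of six raw readings, which is the redundancy reduction the scheme relies on.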

Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication

Procedia PDF Downloads 441
882 Characterization of Brewery Wastewater Composition

Authors: Abimbola M. Enitan, Josiah Adeyemo, Sheena Kumari, Feroz M. Swalaha, Faizal Bux

Abstract:

With the competing demands on water resources and water reuse, the discharge of industrial effluents into the aquatic environment has become an important issue. Much attention has been paid to the impact of industrial wastewater on water bodies worldwide due to the accumulation of organic and inorganic matter in the receiving waters. The scope of the present work is to assess the physico-chemical composition of the wastewater produced by one of the breweries in South Africa, in order to estimate the environmental impact of its discharge into the receiving water bodies or the municipal treatment plant. The parameters monitored for the quantitative analysis of the brewery wastewater include biological oxygen demand (BOD5), chemical oxygen demand (COD), total suspended solids, volatile suspended solids, ammonia, total oxidized nitrogen, nitrate, nitrite, phosphorus, and alkalinity content. On average, the COD concentration of the brewery effluent was 5340.97 mg/l, with average pH values of 4.0 to 6.7. The BOD and the solids content of the wastewater were high, which means the effluent is very rich in organic content, and its discharge into water bodies or the municipal treatment plant could cause environmental pollution or damage the treatment plant. In addition, there were variations in the wastewater composition throughout the monitoring period. This might be a result of the different activities that take place during the production process, as well as the effects of the peak period of beer production on water usage.

Keywords: brewery wastewater, environmental pollution, industrial effluents, physico-chemical composition

Procedia PDF Downloads 444
881 Molecular Engineering of High-Performance Nanofiltration Membranes from Intrinsically Microporous Poly (Ether-Ether-Ketone)

Authors: Mahmoud A. Abdulhamid

Abstract:

Poly(ether-ether-ketone) (PEEK) has received increased attention due to its outstanding performance in membrane applications including gas and liquid separation. However, it suffers from a semi-crystalline morphology, poor solubility, and low porosity. To fabricate membranes from PEEK, harsh acids such as sulfuric acid must be used, despite their hazardous properties. In this work, we report the molecular design of poly(ether-ether-ketones) (iPEEKs) with intrinsic microporosity, obtained by incorporating kinked units such as spirobisindane, Tröger's base, and triptycene into the PEEK backbone. The porous polymers were used to fabricate stable membranes for organic solvent nanofiltration. To better understand the mechanism, we conducted molecular dynamics simulations to evaluate the possible interactions between the polymers and the solvents. A notable enhancement in separation performance was observed, confirming the importance of molecular engineering of high-performance polymers. The iPEEKs demonstrated good solubility in polar aprotic solvents, a high surface area of 205–250 m² g⁻¹, and excellent thermal stability. Mechanically flexible nanofiltration membranes were prepared from N-methyl-2-pyrrolidone dope solutions at iPEEK concentrations of 19–35 wt%. The molecular weight cut-off of the membranes was fine-tuned in the range of 450–845 g mol⁻¹, displaying 2–6-fold higher permeance (3.57–11.09 L m⁻² h⁻¹ bar⁻¹) than previous reports. Long-term stability was demonstrated by a 7-day continuous cross-flow filtration.

Keywords: molecular engineering, polymer synthesis, membrane fabrication, liquid separation

Procedia PDF Downloads 88
880 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates

Authors: Bongs Lainjo

Abstract:

Educational institutions and authorities mandated to run education systems in various countries need to implement curricula that take into account the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate predictive models applied globally to selected elementary schools; the study compares classical case studies from Africa, North America, South America, Asia, and Europe. Among the reasons put forward for children dropping out is the notion that one can be successful in life without necessarily going through the education process. Such a mentality, coupled with a demanding curriculum that does not accommodate all students, leads to poor school attendance (truancy), which in turn continually feeds dropout rates. The focus of this study is on developing a model that school administrations can systematically implement to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can still be changed relatively easily so that they focus on the better future their parents desire. To deal effectively with the elementary school dropout problem, the strategies put in place need to be studied, and predictive models installed in every educational system with a view to preventing an imminent dropout just before it happens. In the competency-based curricula that most advanced nations are trying to implement, education systems take a more holistic view of learning that reduces the dropout rate.
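The kind of early-warning model the abstract envisages can be sketched as a simple risk score over attendance and performance features. The features, weights, and intervention threshold below are illustrative assumptions, not the study's fitted classifier.

```python
def dropout_risk(attendance_rate, grade_avg, prior_repeats):
    """Return a dropout-risk score in [0, 1]; higher means higher risk.
    All inputs except prior_repeats are normalized to [0, 1]."""
    score = (
        0.5 * (1.0 - attendance_rate)          # truancy: the strongest signal
        + 0.3 * (1.0 - grade_avg)              # low academic performance
        + 0.2 * min(prior_repeats / 2.0, 1.0)  # repeated grades, capped
    )
    return round(score, 3)

def flag_for_intervention(score, threshold=0.4):
    """School administration flags a pupil once the score crosses the threshold."""
    return score >= threshold
```

A pupil attending 95% of classes with good grades scores low, while one attending half the time with weak grades and a repeated year is flagged for intervention.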

Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum

Procedia PDF Downloads 161
879 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance; sled tests and other types of tests are then carried out by carmakers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests conducted according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, with the aim of reducing the number of tests.
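The missing-data step of a KDD pipeline like this one can be sketched as per-parameter mean imputation over the observed sled tests. This is a minimal stand-in for illustration; the validated procedure in the study may use a more sophisticated strategy, and the parameter names and values below are invented.

```python
def impute_means(records):
    """Fill None entries with the column mean over the observed values.
    records: list of dicts sharing the same keys (test parameters)."""
    keys = records[0].keys()
    means = {}
    for k in keys:
        observed = [r[k] for r in records if r[k] is not None]
        means[k] = sum(observed) / len(observed)
    return [
        {k: (r[k] if r[k] is not None else means[k]) for k in keys}
        for r in records
    ]

# Hypothetical sled-test records with gaps (units omitted for brevity).
sled_tests = [
    {"head_accel": 52.0, "chest_defl": 31.0},
    {"head_accel": None, "chest_defl": 29.0},
    {"head_accel": 48.0, "chest_defl": None},
]
```

After imputation, every record is complete and can feed the downstream parameter-estimation step.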

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

Procedia PDF Downloads 374
878 Green Extraction of Patchoulol from Patchouli Leaves Using Ultrasound-Assisted Ionic Liquids

Authors: G. C. Jadeja, M. A. Desai, D. R. Bhatt, J. K. Parikh

Abstract:

Green extraction techniques are fast making their way into various industrial sectors, driven by stringent governmental regulations banning the use of toxic chemicals and by increasing health and environmental awareness. The present work describes an ionic-liquid-based sonication method for selectively extracting patchoulol from patchouli leaves. 1-Butyl-3-methylimidazolium tetrafluoroborate ([Bmim]BF4) and N,N,N,N',N',N'-hexaethyl-butane-1,4-diammonium dibromide (a dicationic ionic liquid, DIL) were selected for extraction. Ultrasound-assisted ionic liquid extraction was employed, considering the concentration of ionic liquid (4–8 % w/w), ultrasound power (50–150 W for [Bmim]BF4 and 20–80 W for DIL), temperature (30–50 °C), and extraction time (30–50 min) as the major parameters influencing the yield of patchoulol. The parameters were optimized using the Taguchi method, and analysis of variance (ANOVA) was performed to find the most influential factor in the selected extraction method. For [Bmim]BF4, the optimum conditions were found to be 4 % (w/w) ionic liquid concentration, 50 W power, 30 °C, and an extraction time of 30 min, giving a yield of 3.99 mg/g. For the DIL, the optimum conditions were 6 % (w/w) ionic liquid concentration, 80 W power, 30 °C, and an extraction time of 40 min, giving a yield of 4.03 mg/g. Temperature was the most significant factor in both cases. Extraction time was an insignificant parameter when extracting with [Bmim]BF4, while for the DIL, power was the least significant factor affecting the process. Thus, a green method of recovering patchoulol is proposed.
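Taguchi optimization of a yield, as used above, ranks factor levels by a "larger-the-better" signal-to-noise (S/N) ratio computed from the replicate yields at each level. A minimal sketch of that statistic follows; the replicate values are illustrative, not the study's data.

```python
import math

def sn_larger_better(yields):
    """Taguchi larger-the-better S/N ratio:
    S/N = -10 * log10( mean(1 / y^2) ); a higher S/N means a better level."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in yields) / len(yields))
```

Levels are then ranked by S/N, so a level yielding 4.03 mg/g outranks one yielding 3.99 mg/g, and the per-factor S/N ranges feed the ANOVA significance ranking.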

Keywords: green extraction, ultrasound, patchoulol, ionic liquids

Procedia PDF Downloads 349
877 A Discrete Event Simulation Model to Manage Bed Usage for Non-Elective Admissions in a Geriatric Medicine Speciality

Authors: Muhammed Ordu, Eren Demir, Chris Tofallis

Abstract:

Over the past decade, non-elective admissions in the UK have increased significantly. Given limited resources (i.e., beds), service managers are obliged to manage them effectively, since non-elective admissions are mostly admitted to inpatient specialities via A&E departments. Geriatric medicine is one of the specialities with long lengths of stay for non-elective admissions. This study aims to develop a discrete event simulation model to understand how possible increases in non-elective demand over the next 12 months affect the bed occupancy rate, and to determine the required number of beds in a geriatric medicine speciality in a UK hospital. Our validated simulation model takes into account observed frequency distributions for non-elective admissions and length of stay, derived from a large dataset covering April 2009 to January 2013. An experimental analysis consisting of 16 experiments is carried out to better understand the effects of scenarios involving increases in demand and in the number of beds. As a result, the speciality does not achieve the target level in the base model, even though the bed occupancy rate decreases from 125.94% to 96.41% when the number of beds is increased by 30%; meeting the bed requirement would take more beds than any scenario considered in the analysis. This paper sheds light on bed management for service managers in geriatric medicine specialities.
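The core of such a model is a loss-style discrete event simulation: admissions arrive at random, occupy a bed for a sampled length of stay, and are turned away when all beds are full. The sketch below uses exponential inter-arrival and length-of-stay distributions with illustrative parameters; the study's model is fitted to observed frequency distributions instead.

```python
import heapq
import random

def simulate_beds(n_beds, arrival_rate, mean_los, horizon, seed=42):
    """Simulate non-elective admissions to a ward of n_beds over `horizon`
    days. arrival_rate: admissions/day; mean_los: mean length of stay (days).
    Returns (admitted, refused) counts."""
    rng = random.Random(seed)
    discharges = []            # min-heap of discharge times for occupied beds
    t, admitted, refused = 0.0, 0, 0
    while True:
        t += rng.expovariate(arrival_rate)      # next admission time
        if t >= horizon:
            break
        while discharges and discharges[0] <= t:
            heapq.heappop(discharges)           # free beds discharged by now
        if len(discharges) < n_beds:
            heapq.heappush(discharges, t + rng.expovariate(1.0 / mean_los))
            admitted += 1
        else:
            refused += 1                        # no bed available
    return admitted, refused
```

An overloaded ward (5 beds against an offered load of ~20 bed-days/day) refuses many patients, while a ward with ample beds refuses none; scenario analysis then varies demand and bed counts exactly as in the 16-experiment design.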

Keywords: bed management, bed occupancy rate, discrete event simulation, geriatric medicine, non-elective admission

Procedia PDF Downloads 213
876 Efficacy of Technology for Successful Learning Experience; Technology Supported Model for Distance Learning: Case Study of Botho University, Botswana

Authors: Ivy Rose Mathew

Abstract:

The purpose of this study is to outline the efficacy of technology and the opportunities it can bring to implementing a successful delivery model in distance learning. Distance learning has proliferated over the past few years across the world. Challenges faced by current distance education students include a lack of motivation, a sense of isolation, and a need for greater and improved communication. The author therefore proposes a creative, technology-supported model for distance learning that closely mirrors traditional face-to-face learning and can be adopted by distance learning providers. This model suggests using a range of technologies and social networking facilities, with the aim of creating a more engaging and sustaining learning environment to help overcome the isolation often noted by distance learners. While discussing the possibilities, the author also highlights the complexity and practical challenges of implementing such a model. Design/methodology/approach: Theoretical issues from previous research on successful models for distance learning providers are considered, together with the analysis of a case study from one of the largest private tertiary institutions in Botswana, Botho University. This case study illustrates important aspects of the distance learning delivery model and provides insights into how curriculum development is planned, quality assurance is carried out, and learner support is assured for a successful distance learning experience. Research limitations/implications: While some aspects of this study may not be applicable to other contexts, new providers of distance learning can adapt the key principles of this delivery model.

Keywords: distance learning, efficacy, learning experience, technology supported model

Procedia PDF Downloads 237
875 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit

Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana

Abstract:

Internet of Things (IoT) devices and edge computing have become two of the most discussed innovations, with the potential to improve and disrupt traditional business and industry alike. New challenges such as the COVID-19 pandemic have endangered workforces and business processes, and the drastically changed post-pandemic landscape now faces a looming global energy crisis, global warming, and intensifying geopolitical tensions that threaten a new Cold War. Emerging technologies such as edge computing, together with purpose-designed visual processing units (VPUs), therefore present significant opportunities for business. The literature reviewed explains how the IoT and this disruptive wave affect current business and how businesses will need to adapt to changes in the market and the wider world. Benchmark tests of consumer IoT devices equipped with edge computing hardware illustrate how efficiency can be increased, reducing exposure to current and looming crises. Throughout the paper, we explain the technologies leading today's developments and why they will change traditional practice, through brief introductions to cloud computing, edge computing, and the Internet of Things, and how they lead into the future.
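Benchmarking image recognition on an edge device typically reduces to measuring per-inference latency and throughput. The harness below is a minimal, device-agnostic sketch: `infer` is a stand-in for a real model invocation (for instance one offloaded to a VPU), and the warm-up runs, sample counts, and percentile choice are conventional assumptions rather than the paper's protocol.

```python
import statistics
import time

def benchmark(infer, image, runs=50, warmup=5):
    """Time `infer(image)` over `runs` calls and report latency statistics."""
    for _ in range(warmup):
        infer(image)                     # warm caches / lazy initialisation
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        infer(image)
        samples.append((time.perf_counter() - t0) * 1000.0)  # milliseconds
    samples.sort()
    mean_ms = statistics.mean(samples)
    return {
        "mean_ms": mean_ms,
        "p95_ms": samples[int(0.95 * len(samples)) - 1],  # tail latency
        "fps": 1000.0 / mean_ms,                          # throughput
    }
```

Running the same harness against a CPU-only and a VPU-accelerated classifier gives directly comparable mean, tail-latency, and frames-per-second figures.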

Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification

Procedia PDF Downloads 147
874 A Comparative Study for Various Techniques Using WEKA for Red Blood Cells Classification

Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are the most common type of blood cell and the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as anemia; abnormalities in RBCs affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying red blood cells as normal or abnormal (anemic) using WEKA, an open-source collection of machine learning algorithms for data mining applications. The algorithms tested are the radial basis function neural network, the support vector machine, and the k-nearest neighbours algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether a tested blood cell is spherical or non-spherical, while the second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for support vector machines, the radial basis function neural network, and the k-nearest neighbours algorithm, respectively.
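The first classification stage, deciding spherical versus non-spherical from geometrical features, can be sketched with a from-scratch k-nearest-neighbours classifier. The two features (circularity, aspect ratio) and the training points below are invented for illustration, not drawn from the hospital dataset, and the study itself ran the algorithm through WEKA.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbours vote.
    train: list of (feature_vector, label) pairs; query: feature vector."""
    neighbours = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in neighbours[:k])
    return votes.most_common(1)[0][0]

# Hypothetical geometrical features: (circularity, aspect ratio).
# Spherical cells sit near circularity 1 and aspect ratio 1.
train = [
    ((0.95, 1.05), "spherical"), ((0.90, 1.10), "spherical"),
    ((0.92, 1.00), "spherical"),
    ((0.55, 1.90), "non-spherical"), ((0.60, 1.75), "non-spherical"),
]
```

A second classifier over textural features would then subtype the cells flagged as spherical, mirroring the paper's two-stage feature design.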

Keywords: red blood cells, classification, radial basis function neural networks, support vector machine, k-nearest neighbors algorithm

Procedia PDF Downloads 472
873 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale, and fully encrypting large volumes of messages causes both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are then processed at a later stage using element-wise encryption. The classification algorithms were used to identify the XML transaction rules and factors, classifying the message content and identifying the important elements within it. We implemented four classification algorithms to determine the importance-level value within each XML document. The classified content is processed using element-wise encryption for the selected parts with "High", "Medium", or "Low" importance-level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES designed to overcome the problem of computational overhead, in which the substitute-byte and shift-row steps remain as in the original AES, while the mix-column operation is replaced by a 128 permutation operation followed by the add-round-key operation. An implementation has been conducted using a data set fetched from an e-banking service to demonstrate the system's functionality and efficiency. Results from our implementation showed a clear improvement in the processing time for encrypting XML documents.
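The element-wise idea, encrypting only the fields a classifier marks as important instead of the whole document, can be sketched with the standard library. Here the importance labels are a hard-coded stand-in for the classifier's output, the element names are invented, and the SHA-256 keystream XOR is only a self-contained placeholder for the paper's modified AES.

```python
import hashlib
import xml.etree.ElementTree as ET

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher (counter-mode keystream from SHA-256).
    Placeholder for AES; applying it twice with the same key decrypts."""
    stream, counter = bytearray(), 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Stand-in for the classifier's per-element importance levels.
IMPORTANCE = {"account": "High", "amount": "High", "memo": "Low"}

def encrypt_important(xml_text: str, key: bytes) -> str:
    """Element-wise protection: only 'High'-importance fields are encrypted."""
    root = ET.fromstring(xml_text)
    for el in root.iter():
        if IMPORTANCE.get(el.tag) == "High" and el.text:
            el.text = keystream_xor(el.text.encode(), key).hex()
    return ET.tostring(root, encoding="unicode")
```

Low-importance fields pass through in the clear, which is exactly where the processing-time savings over full-document encryption come from.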

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 400
872 Evaluation of Liquid Fermentation Strategies to Obtain a Biofertilizer Based on Rhizobium sp.

Authors: Andres Diaz Garcia, Ana Maria Ceballos Rojas, Duvan Albeiro Millan Montano

Abstract:

This paper describes the initial technological development stages, in the area of liquid fermentation, required to produce the quantities of biomass of the biofertilizer microorganism Rhizobium sp. strain B02 needed for the downstream unit operations at laboratory scale. In the first stage, the fermentation process was adjusted and standardized in conventional batch mode. In the second stage, various fed-batch and continuous fermentation strategies were evaluated in a 10 L bioreactor in order to optimize the yields in concentration (colony-forming units/ml·h) and biomass (g/l·h) and thereby make the downstream unit operations feasible. The growth kinetics, the evolution of dissolved oxygen, and the pH profile generated by each strategy were monitored and used to make sequential adjustments. Once fermentation was finished, the final concentration and viability of the biomass were determined, and performance parameters were calculated in order to select the optimal operating conditions that significantly improved on the baseline results. Under the conditions adjusted and standardized in batch mode, concentrations of 6.67E9 CFU/ml were reached after 27 hours of fermentation, followed by a noticeable decrease associated with basification of the culture medium. By applying fed-batch and continuous strategies, significant increases in yields were achieved, but at similar concentration levels, which led to the design of several production scenarios based on the availability of equipment usage time and the required batch volume.
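The performance parameters used to compare the strategies are essentially volumetric productivities, the gain in concentration or biomass per unit time. A minimal sketch follows; the final concentration and fermentation time come from the abstract, while the inoculum concentration of 1e7 CFU/ml is an assumption for illustration.

```python
def productivity(final, initial, hours):
    """Volumetric productivity, e.g. CFU/ml.h or g/l.h."""
    return (final - initial) / hours

# Batch mode reached 6.67e9 CFU/ml after 27 h (from the abstract);
# the 1e7 CFU/ml inoculum is an assumed starting point.
cfu_productivity = productivity(6.67e9, 1e7, 27.0)
```

Comparing this figure across batch, fed-batch, and continuous runs is what drives the choice of operating conditions and production scenarios.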

Keywords: biofertilizer, liquid fermentation, Rhizobium sp., standardization of processes

Procedia PDF Downloads 172
871 Analyzing Factors Impacting COVID-19 Vaccination Rates

Authors: Dongseok Cho, Mitchell Driedger, Sera Han, Noman Khan, Mohammed Elmorsy, Mohamad El-Hajj

Abstract:

Since the approval of the COVID-19 vaccine in late 2020, vaccination rates have varied around the globe; access to vaccine supply, mandated vaccination policy, and vaccine hesitancy all contribute to these rates. This study used COVID-19 vaccination data from Our World in Data and the Multilateral Leaders Task Force on COVID-19 to create two COVID-19 vaccination indices. The first is the Vaccine Utilization Index (VUI), which measures how effectively each country has utilized its vaccine supply to doubly vaccinate its population. The second is the Vaccination Acceleration Index (VAI), which evaluates how efficiently each country vaccinated its population within its first 150 days. Pearson correlations were computed between these indices and country indicators obtained from the World Bank. The results show that countries with stronger health indicators, such as lower mortality rates, lower age dependency ratios, and higher rates of immunization against other diseases, display higher VUI and VAI scores than countries with weaker indicators. VAI scores are also positively correlated with governance and economic indicators, such as regulatory quality, control of corruption, and GDP per capita. As represented by the VUI, proper utilization of the COVID-19 vaccine supply is observed in countries that display excellence in health practices. A country's ability to accelerate its vaccination rates within the first 150 days, as represented by the VAI, was largely a product of the governing body's effectiveness and economic status, as well as overall excellence in health practices.
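The index-versus-indicator analysis reduces to Pearson's correlation coefficient between two country-level series. A from-scratch sketch follows; the VUI values and GDP-per-capita figures are invented for illustration, not the study's data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient r = cov(x, y) / (sd(x) * sd(y))."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical five-country sample: VUI scores versus GDP per capita (USD).
vui = [0.9, 0.8, 0.6, 0.4, 0.3]
gdp_per_capita = [60_000, 45_000, 30_000, 12_000, 8_000]
```

A strongly positive r on such a sample is the kind of evidence behind the study's conclusion that economic status tracks vaccination performance.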

Keywords: data mining, Pearson correlation, COVID-19, vaccination rates and hesitancy

Procedia PDF Downloads 107
870 Phytoextraction of Copper and Zinc by Willow Varieties in a Pot Experiment

Authors: Muhammad Mohsin, Mir Md Abdus Salam, Pertti Pulkkinen, Ari Pappinen

Abstract:

Soil and water contamination by heavy metals is a major environmental challenge. Phytoextraction is an emerging, environmentally friendly, and cost-efficient technology in which plants are used to remove pollutants from soil and water. We aimed to assess the copper (Cu) and zinc (Zn) removal efficiency of two willow varieties, Klara (S. viminalis x S. schwerinii x S. dasyclados) and Karin ((S. schwerinii x S. viminalis) x (S. viminalis x S. burjatica)), under different soil treatments (control/unpolluted, polluted, lime with polluted, wood ash with polluted). In a 180-day pot experiment, these willow varieties were grown in highly polluted soil collected from the Pyhasalmi mining area in Finland. Lime and wood ash were added to the polluted soil to improve the soil pH and to observe their effects on metal accumulation in plant biomass. An inductively coupled plasma optical emission spectrometer (ELAN 6000 ICP-OES, Perkin-Elmer Corporation) was used to assess the heavy metal concentrations in the plant biomass. The results show that both willow varieties can accumulate considerable amounts of Cu and Zn, varying from 36.95 to 314.80 mg kg⁻¹ and 260.66 to 858.70 mg kg⁻¹, respectively. The application of lime and wood ash substantially stimulated plant height and dry biomass and increased the deposition of Cu and Zn into the total plant biomass. In addition, the lime application appeared to increase Cu and Zn concentrations in the shoots and leaves of both willow varieties when planted in polluted soil, whereas wood ash application was found to be more efficient at mobilizing the metals in the roots of both varieties. The study recommends willow plantations for rehabilitating Cu- and Zn-polluted soils.

Keywords: heavy metals, lime, phytoextraction, wood ash, willow

Procedia PDF Downloads 225
869 Data Analytics in Energy Management

Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair

Abstract:

With increasing energy costs and their impact on business, sustainability has evolved from a social expectation into an economic imperative, and finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut these costs, but it has remained a challenge because it requires changing old habits and legacy systems followed for decades. Today, industries capture and store vast volumes of energy and operational data, yet they are unable to convert these structured and unstructured data sets into meaningful business intelligence. For quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps extract inferences from these data sets but is also instrumental in the transformation from old approaches to energy management to new ones, which in turn assists in effective decision-making for implementation. Organizations require an established corporate strategy for reducing operational costs through visibility and optimization of energy usage, and energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is used today in scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, improving customer insights, and gaining device data insights. It also highlights how analytics helps transform the insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.
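One of the scenarios above, predicting energy demand, can be sketched at its simplest as a moving-average forecast over recent meter readings. The hourly load figures below are illustrative, not drawn from the paper's retail, transportation, or water data, and production forecasting would use a far richer model.

```python
def moving_average_forecast(history, window=3):
    """Forecast the next reading as the mean of the last `window` readings."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical hourly consumption readings (kWh).
hourly_kwh = [120.0, 118.0, 130.0, 127.0, 125.0]
```

Even this naive baseline gives operations a next-hour demand estimate against which anomalies (a sudden spike above forecast, say) can be flagged for investigation.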

Keywords: energy analytics, energy management, operational data, business intelligence, optimization

Procedia PDF Downloads 355