Search results for: edge data centers
25004 BigCrypt: A Probable Approach of Big Data Encryption to Protect Personal and Business Privacy
Authors: Abdullah Al Mamun, Talal Alkharobi
Abstract:
As data sizes grow, people have become accustomed to storing large amounts of confidential information in cloud storage, and companies routinely need to transfer massive business files from one end to another. Transmitting such data as it is, and repeating that scenario without securing the communication mechanism through proper encryption, means losing privacy. Although asymmetric-key encryption solves the main problem of symmetric-key encryption, it can only encrypt a limited amount of data, which makes it inapplicable to large data on its own. In this paper we propose a Pretty Good Privacy style approach that encrypts big data using both symmetric and asymmetric keys. Our goal is to encrypt huge collections of information and transmit them through a secure communication channel, preserving business and personal privacy. To justify the method, an experimental dataset from three different platforms is provided. We show that our approach works efficiently and reliably for massive volumes of varied data.
Keywords: big data, cloud computing, cryptography, hadoop, public key
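As an aside for readers unfamiliar with the hybrid pattern described above (bulk data encrypted with a symmetric key, that key wrapped with the recipient's public key), the following is a minimal sketch using the Python cryptography package. It is an illustration of the general PGP-style scheme only, not the authors' BigCrypt implementation or its Hadoop integration.

```python
# Minimal sketch of PGP-style hybrid encryption (not the authors' BigCrypt code):
# the bulk data is encrypted with a symmetric key, and that key is wrapped with RSA.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Recipient's RSA key pair (in practice the public key is distributed beforehand).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def hybrid_encrypt(data: bytes, public_key):
    session_key = Fernet.generate_key()             # symmetric key for the bulk data
    ciphertext = Fernet(session_key).encrypt(data)  # fast symmetric encryption
    wrapped_key = public_key.encrypt(               # asymmetric wrap of the small key
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return wrapped_key, ciphertext

def hybrid_decrypt(wrapped_key: bytes, ciphertext: bytes, private_key) -> bytes:
    session_key = private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return Fernet(session_key).decrypt(ciphertext)

wrapped, ct = hybrid_encrypt(b"large business file contents...", public_key)
assert hybrid_decrypt(wrapped, ct, private_key) == b"large business file contents..."
```

The asymmetric step only ever touches the short session key, which is why the size limit of RSA encryption does not constrain the size of the data being protected.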
Procedia PDF Downloads 320
25003 Implementation of Big Data Concepts Led by the Business Pressures
Authors: Snezana Savoska, Blagoj Ristevski, Violeta Manevska, Zlatko Savoski, Ilija Jolevski
Abstract:
Big data is widely accepted by pharmaceutical companies as a result of business demands created through legal pressure. Pharmaceutical companies face many legal and standards-related demands and have to adapt their procedures to the legislation. To cope with these demands, they have to standardize the usage of current information technology and use the latest software tools. This paper highlights some important aspects of experience with big data project implementation in a Macedonian pharmaceutical company. These projects improved the company's business processes with the help of new software tools selected to comply with legal and business demands. The company uses IT as a strategic tool to obtain competitive advantage on the market and to reengineer its processes toward the new Internet economy and quality demands. It is required to manage vast amounts of structured as well as unstructured data. For these reasons, it implements projects around appropriate emerging software tools that can handle the big data concepts accepted in the company.
Keywords: big data, unstructured data, SAP ERP, documentum
Procedia PDF Downloads 269
25002 Saving Energy at a Wastewater Treatment Plant through Electrical and Production Data Analysis
Authors: Adriano Araujo Carvalho, Arturo Alatrista Corrales
Abstract:
This paper shows how electrical energy consumption and production data analysis were used to find opportunities to save energy at the Taboada wastewater treatment plant in Callao, Peru. To access the data, independent data networks were used for both electrical and process instruments, and the data were analyzed under an ISO 50001 energy audit that defined Energy Performance Indexes for each process, following the step-by-step guide presented in this text. Using this methodology and data mining techniques applied to information gathered through electronic multimeters (conveniently placed on substation switchboards connected to a cloud network), it was possible to characterize the performance of each process thoroughly and thus reveal saving opportunities that were previously hidden. The data analysis brought both cost and energy reductions, allowing the plant to save significant resources and to be certified under ISO 50001.
Keywords: energy and production data analysis, energy management, ISO 50001, wastewater treatment plant energy analysis
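To make the Energy Performance Index idea concrete, a per-process EnPI can be obtained by joining metered energy with production volume. The sketch below assumes an index of kWh per cubic metre treated and uses invented process names and numbers; it is not the Taboada plant's actual ISO 50001 indicator set.

```python
# Hypothetical sketch of an Energy Performance Index (EnPI) per process, assumed here
# to be kWh consumed per cubic metre of wastewater treated (columns are invented).
import pandas as pd

energy = pd.DataFrame({
    "process": ["pretreatment", "aeration", "sludge"],
    "kwh":     [1200.0, 5400.0, 2100.0],
})
production = pd.DataFrame({
    "process": ["pretreatment", "aeration", "sludge"],
    "m3_treated": [15000.0, 15000.0, 4800.0],
})

enpi = energy.merge(production, on="process")
enpi["kwh_per_m3"] = enpi["kwh"] / enpi["m3_treated"]
print(enpi.sort_values("kwh_per_m3", ascending=False))  # highest-intensity process first
```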
Procedia PDF Downloads 192
25001 Spectroscopic Characterization Approach to Study Ablation Time on Zinc Oxide Nanoparticles Synthesis by Laser Ablation Technique
Authors: Suha I. Al-Nassar, K. M. Adel, F. Zainab
Abstract:
This work was devoted to producing ZnO nanoparticles by pulsed laser ablation (PLA) of a Zn metal plate in an aqueous environment of cetyl trimethyl ammonium bromide (CTAB), using a Q-switched Nd:YAG pulsed laser (wavelength 1064 nm, repetition rate 10 Hz, pulse duration 6 ns, laser energy 50 mJ). The nanoparticle solution remains stable in colloidal form for a long time. The effect of ablation time on the optical properties and structure of ZnO was studied by UV-visible absorption. The UV-visible absorption spectrum shows four peaks at 256, 259, 265, and 322 nm for ablation times of 5, 10, 15, and 20 s, respectively. Our results show that the UV-vis spectra exhibit a blue shift in the presence of CTAB as the ablation time decreases, and the blue shift indicates a smaller nanoparticle size. The blue shift in the absorption edge indicates the quantum confinement property of nanoparticles. Also, FTIR transmittance spectra of the ZnO nanoparticles prepared under these conditions show a characteristic ZnO absorption at 435–445 cm⁻¹.
Keywords: zinc oxide nanoparticles, CTAB solution, pulsed laser ablation technique, spectroscopic characterization
Procedia PDF Downloads 377
25000 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network
Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar
Abstract:
Wireless sensor networks are among the most promising communication networks for monitoring remote environmental areas. In such a network, all sensor nodes communicate with each other via radio signals. The sensor nodes have the capability of sensing, data storage, and processing, and they collect information and relay it through neighboring nodes toward a particular node. Data collection and processing are done by data aggregation techniques. For data aggregation, a clustering technique is implemented in the sensor network using a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes. Information is aggregated at the cluster head nodes from non-cluster-head nodes and then transferred to the base station (or sink node). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network. Only clustered data is transferred to the base station instead of all the information aggregated at the cluster heads, which reduces battery consumption for managing the huge data volume and enhances the network lifetime to a great extent.
Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network
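For readers who want a feel for how a SOFM can group sensor nodes and nominate cluster heads, the minimal NumPy sketch below trains a small 1-D map on node positions and picks, as head, the node closest to each map unit. The map size, learning schedule, and head-selection heuristic are illustrative assumptions, not the network or rules used in the paper.

```python
# Minimal 1-D self-organizing feature map (SOFM) sketch: group sensor nodes by
# position and pick one cluster head per group. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
nodes = rng.uniform(0, 100, size=(60, 2))        # (x, y) positions of 60 sensor nodes

n_units, iters, lr0, sigma0 = 4, 500, 0.5, 1.5   # 4 map units -> 4 clusters
weights = rng.uniform(0, 100, size=(n_units, 2)) # SOFM unit weight vectors

for t in range(iters):
    x = nodes[rng.integers(len(nodes))]
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
    lr = lr0 * np.exp(-t / iters)
    sigma = sigma0 * np.exp(-t / iters)
    dist = np.abs(np.arange(n_units) - bmu)                # 1-D lattice distance
    h = np.exp(-(dist ** 2) / (2 * sigma ** 2))            # neighbourhood function
    weights += lr * h[:, None] * (x - weights)

labels = np.argmin(np.linalg.norm(nodes[:, None] - weights[None], axis=2), axis=1)
# Cluster head heuristic (assumed): the node closest to its cluster's SOFM unit.
for c in range(n_units):
    members = np.where(labels == c)[0]
    if members.size:
        head = members[np.argmin(np.linalg.norm(nodes[members] - weights[c], axis=1))]
        print(f"cluster {c}: {members.size} nodes, head node index {head}")
```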
Procedia PDF Downloads 516
24999 Missed Opportunities for Immunization of under Five Children in Calabar South County, Cross River State, Nigeria, the Way Forward
Authors: Celestine Odigwe, Epoke Lincoln, Rhoda-Dara Ephraim
Abstract:
Background: Immunization against the childhood killer diseases is the cardinal strategy for preventing these diseases in under-five children all over the world; the diseases include tuberculosis, measles, polio, tetanus, diphtheria, pertussis, yellow fever, hepatitis B, and Haemophilus influenzae type B. 6.9 million children die before their fifth birthday, 80% of the world's deaths in children under 5 years occur in 25 countries, most in Africa and Asia, and 2 million children could be saved each year with routine immunization; therefore, failure to achieve total immunization coverage puts many children at risk. Aim: The aim of the study was to ascertain the prevalence of missed opportunities for immunization among under-five children in a suburb of Calabar municipal county, to investigate the various reasons and causes, and to consider the possible consequences, so that efforts can be redirected toward solving the problems identified. Methods: The study was a community-based cross-sectional study. The respondents were the mothers/guardians of the sampled children, who were all aged 0-59 months. To be eligible for recruitment into the study, the parent or guardian was required to give informed consent and to reside within the Calabar South County with his/her children aged 0-59 months. We calculated our sample size using the Leslie-Kish formula and used a two-stage sampling method, first balloting for the wards to be involved and then selecting four of the most populated ones among the wards chosen. Data collection was by interviewer-administered structured questionnaire (Appendix I), and the data collected were entered and analyzed using the Statistical Package for the Social Sciences (SPSS) Version 20. Percentages were calculated and represented using charts and tables. Results: The number of children sampled was 159. We found that 150 were fully immunized and 9 were not; the prevalence of missed opportunity was 32% from the study. The reasons for missed opportunities were varied, ranging from false contraindications and logistical problems resulting from very poor access roads to health facilities, to poor organization of health centers together with negative health worker attitudes. Some of the consequences of these missed opportunities were increased susceptibility to vaccine-preventable diseases, resurgence of the above diseases, and increased morbidity and mortality of children aged less than 5 years. Conclusion: We found that ignorance on the part of both parents/guardians and health care staff, together with infrastructural inadequacies in the county such as poor roads and poor electric power supply for the storage of vaccines, was hugely responsible for most missed opportunities for immunization. The details of these findings and suggestions for improvement and the way forward are discussed.
Keywords: missed opportunity, immunization, under five, Calabar south
Procedia PDF Downloads 324
24998 Review and Comparison of Associative Classification Data Mining Approaches
Authors: Suzan Wedyan
Abstract:
Data mining is one of the main phases in Knowledge Discovery in Databases (KDD), responsible for finding hidden and useful knowledge in databases. There are many different tasks in data mining, including regression, pattern recognition, clustering, classification, and association rule mining. In recent years a promising data mining approach called associative classification (AC) has been proposed; AC integrates classification and association rule discovery to build classification models (classifiers). This paper surveys and critically compares several AC algorithms with reference to the different procedures used in each algorithm, such as rule learning, rule sorting, rule pruning, classifier building, and class allocation for test cases.
Keywords: associative classification, classification, data mining, learning, rule ranking, rule pruning, prediction
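A generic way to see the AC idea in practice is to mine ordinary association rules and keep only those whose consequent is a class label, then rank them for prediction. The sketch below uses the mlxtend library on a toy dataset; it is an illustration of the class-association-rule concept, not any specific algorithm surveyed in the paper.

```python
# Sketch of associative classification: mine association rules with mlxtend and keep
# only rules whose consequent is a class label (generic illustration).
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

transactions = [
    ["outlook=sunny", "windy=no", "class=play"],
    ["outlook=sunny", "windy=yes", "class=no_play"],
    ["outlook=rain", "windy=no", "class=play"],
    ["outlook=rain", "windy=yes", "class=no_play"],
    ["outlook=sunny", "windy=no", "class=play"],
]
te = TransactionEncoder()
df = pd.DataFrame(te.fit_transform(transactions), columns=te.columns_)

itemsets = apriori(df, min_support=0.2, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)

# Class association rules: the consequent must be exactly one class item.
car = rules[rules["consequents"].apply(
    lambda s: len(s) == 1 and next(iter(s)).startswith("class="))]
# Ranking by confidence then support mirrors the "rule sorting" step mentioned above.
print(car.sort_values(["confidence", "support"], ascending=False)
         [["antecedents", "consequents", "support", "confidence"]])
```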
Procedia PDF Downloads 534
24997 Hierarchical Checkpoint Protocol in Data Grids
Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed
Abstract:
A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purpose of computing and distributed storage. Since fault tolerance becomes complex due to the varying availability of resources in a decentralized grid environment, it can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data-replication-driven model based on clustering. The performance of the protocol is evaluated with the Omnet++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.
Keywords: data grids, fault tolerance, clustering, Chandy-Lamport
Procedia PDF Downloads 340
24996 An Observation of the Information Technology Research and Development Based on Article Data Mining: A Survey Study on Science Direct
Authors: Muhammet Dursun Kaya, Hasan Asil
Abstract:
One of the most important factors of research and development is deep insight into the evolution of scientific development. State-of-the-art tools and instruments can considerably assist researchers, and many organizations worldwide have become aware of the advantages of data mining for acquiring the knowledge hidden in unstructured data. This paper reviews articles on information technology published in the past five years with the aid of data mining. A clustering approach was used to study these articles, and the results revealed that three topics, namely health, innovation, and information systems, have captured the special attention of researchers.
Keywords: information technology, data mining, scientific development, clustering
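A generic version of this kind of article clustering is TF-IDF features over abstracts followed by k-means. The snippet below is purely illustrative: the toy abstracts are invented, and the three-cluster setting merely echoes the three topics reported above; it is not the authors' pipeline.

```python
# Illustrative clustering of article abstracts with TF-IDF + k-means (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

abstracts = [
    "electronic health records improve patient monitoring",
    "mobile health sensors support clinical decisions",
    "open innovation drives new product development",
    "innovation ecosystems and technology transfer",
    "enterprise information systems integration",
    "information systems success and user adoption",
]
X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

for label, text in zip(km.labels_, abstracts):
    print(label, text[:45])
```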
Procedia PDF Downloads 276
24995 Security in Resource Constraints: Network Energy Efficient Encryption
Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy
Abstract:
Wireless nodes in a sensor network gather and process the critical information they are designed to process and communicate. The information flooding through such a network is critical for decision making and data processing, and the integrity of that data is one of the most critical factors in wireless security, which must be preserved without compromising the processing and transmission capability of the network. This paper presents a mechanism to securely transmit data over a chain of sensor nodes without compromising network throughput, utilizing the battery resources available at each sensor node.
Keywords: hybrid protocol, data integrity, lightweight encryption, neighbor based key sharing, sensor node data processing, Z-MAC
Procedia PDF Downloads 143
24994 Data Mining Techniques for Anti-Money Laundering
Authors: M. Sai Veerendra
Abstract:
Today, money laundering (ML) poses a serious threat not only to financial institutions but also to nations. This criminal activity is becoming more and more sophisticated and seems to have moved from the cliché of drug trafficking to financing terrorism, not forgetting personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud activities. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered well-suited techniques for detecting ML activities. Within the scope of a collaboration project on developing a new data mining solution for AML units in an international investment bank in Ireland, we survey recent data mining approaches for AML. In this paper, we present not only these approaches but also give an overview of the important factors in building data mining solutions for AML activities.
Keywords: data mining, clustering, money laundering, anti-money laundering solutions
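To give a flavour of the clustering-based screening idea mentioned in the keywords, the sketch below clusters transactions on a few invented features and flags those that lie unusually far from their cluster centroid as candidates for analyst review. It is a generic illustration, not the bank's solution or any approach surveyed in the paper.

```python
# Illustrative sketch of a clustering-based AML screen: cluster transactions, then
# flag those unusually far from their cluster centroid (features are invented).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical features per transaction: [amount, transactions_per_day, share_to_new_accounts]
normal = np.column_stack([rng.lognormal(6, 0.5, 1000),
                          rng.poisson(3, 1000).astype(float),
                          rng.beta(1, 9, 1000)])
suspicious = np.array([[95000.0, 40.0, 0.9], [120000.0, 55.0, 0.8]])
X = StandardScaler().fit_transform(np.vstack([normal, suspicious]))

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
threshold = np.percentile(dist, 99)          # top 1% most atypical transactions
print(np.where(dist > threshold)[0])         # candidate cases for analyst review
```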
Procedia PDF Downloads 535
24993 Effect of Carbon Nanotubes Functionalization with Nitrogen Groups on Pollutant Emissions in an Internal Combustion Engine
Authors: David Gamboa, Bernardo Herrera, Karen Cacua
Abstract:
Nanomaterials have been explored as alternatives for reducing particulate matter from diesel engines, which is one of the most common air pollutants in urban centers. However, the use of nanomaterials as diesel additives has to overcome the instability of the dispersions to be considered viable for commercial use. In this work, carbon nanotubes were functionalized with amide groups to improve their stability in a mix of 90% petroleum diesel and 10% palm oil biodiesel (B10) at concentrations of 50 and 100 ppm. The resulting nanofuel was used in a stationary internal combustion engine, where particulate matter, NOx, and CO were measured. The results showed that the amide groups significantly extend the time the carbon nanotubes remain suspended in the fuel, and at the same time these nanomaterials helped to reduce particulate matter and NOx emissions. However, CO emissions with the nanofuel were higher than those from the combustion of B10 alone. These results suggest that carbon nanotubes have thermal and catalytic effects on the combustion of B10.
Keywords: carbon nanotubes, diesel, internal combustion engine, particulate matter
Procedia PDF Downloads 126
24992 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data
Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee
Abstract:
Many global firms and corporations derive new technologies and opportunities by identifying vacant technology through patent analysis. However, previous studies failed to focus on technologies that promised continuous growth in industrial fields, and most studies that derive new technology opportunities do not test practical effectiveness. Because previous studies depended on expert judgment, evaluating new technologies based on patent analysis became costly and time-consuming. Therefore, this research suggests a quantitative and systematic approach to technology evaluation indicators using patent data and data from customer communities. The first step involves collecting these two types of data, which are used to construct evaluation indicators and to apply those indicators to the evaluation of new technologies. This type of data mining allows a new method of technology evaluation and a better prediction of how new technologies are adopted.
Keywords: data mining, evaluating new technology, technology opportunity, patent analysis
Procedia PDF Downloads 374
24991 A Systematic Approach to Mitigate the Impact of Increased Temperature and Air Pollution in Urban Settings
Authors: Samain Sabrin, Joshua Pratt, Joshua Bryk, Maryam Karimi
Abstract:
Globally, extreme heat events have led to a surge in the number of heat-related mortalities. These incidents are further exacerbated in high-density population centers due to the Urban Heat Island (UHI) effect. A variety of anthropogenic activities, such as unsupervised land surface modifications, expansion of impervious areas, and lack of vegetation, all contribute to an increase in the amount of heat flux trapped by the urban canopy, which intensifies the UHI effect. This project proposes a systematic approach to measuring the impact of air quality and increased temperature based on urban morphology in selected metropolitan cities. The impact of the built environment is measured for urban and regional planning using human-biometeorological evaluations (mean radiant temperature, Tmrt). We utilized the RayMan model (capable of calculating the short- and long-wave radiation fluxes affecting the human body) to estimate Tmrt in an urban environment, incorporating the location and height of buildings and trees as a supplemental tool in urban planning and street design. Our current results suggest a strong correlation between building height and increased surface temperature in megacities. This model will help to: 1. quantify the impacts of the built environment and surface properties on the surrounding temperature; 2. identify priority urban neighborhoods by analyzing Tmrt and air quality data at the pedestrian level; 3. characterize the need for urban green infrastructure or better urban planning, maximizing the cooling benefit from existing Urban Green Infrastructure (UGI); and 4. develop a hierarchy of streets for new UGI integration and propose new UGI based on site characteristics and cooling potential.
Keywords: air quality, heat mitigation, human-biometeorological indices, increased temperature, mean radiant temperature, radiation flux, sustainable development, thermal comfort, urban canopy, urban planning
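For readers unfamiliar with Tmrt, one common field approximation (not the RayMan model used above) derives it from globe temperature, air temperature, and air speed using the ISO 7726 forced-convection formula. The sketch below implements that approximation; the globe parameters and example readings are illustrative assumptions.

```python
# Globe-thermometer estimate of mean radiant temperature (Tmrt), ISO 7726 forced-
# convection form. A field approximation only, not the RayMan radiation model.
def mean_radiant_temperature(t_globe_c, t_air_c, air_speed_ms,
                             globe_diameter_m=0.15, emissivity=0.95):
    term = ((t_globe_c + 273.0) ** 4
            + (1.1e8 * air_speed_ms ** 0.6) / (emissivity * globe_diameter_m ** 0.4)
            * (t_globe_c - t_air_c))
    return term ** 0.25 - 273.0

# Example: a hot street-canyon reading (invented numbers).
print(round(mean_radiant_temperature(t_globe_c=45.0, t_air_c=33.0, air_speed_ms=1.0), 1))
```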
Procedia PDF Downloads 140
24990 Anomaly Detection Based on System Log Data
Authors: M. Kamel, A. Hoayek, M. Batton-Hubert
Abstract:
With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information for network performance. We introduce an algorithm used as a pipeline to help with the pretreatment of such data, group it into patterns, and dynamically label each pattern as an anomaly or not. Such tools provide users and experts with continuous real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
Keywords: logs, anomaly detection, ML, scoring, NLP
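A minimal way to see the pattern-grouping step is to mask variable tokens (numbers, IPs, hex addresses) so that log lines collapse into templates, then flag rare templates as candidate anomalies. The sketch below is illustrative only; the masking rules, rarity threshold, and log lines are assumptions, not the authors' pipeline or scoring model.

```python
# Minimal sketch of log-template grouping: mask variable tokens to form templates,
# count them, and flag rare templates as candidate anomalies (illustrative only).
import re
from collections import Counter

MASKS = [
    (re.compile(r"\b\d{1,3}(\.\d{1,3}){3}\b"), "<IP>"),
    (re.compile(r"0x[0-9a-fA-F]+"), "<HEX>"),
    (re.compile(r"\b\d+\b"), "<NUM>"),
]

def template(line: str) -> str:
    for pattern, token in MASKS:
        line = pattern.sub(token, line)
    return line.strip()

logs = [
    "connection from 10.0.0.12 port 443 established",
    "connection from 10.0.0.17 port 443 established",
    "connection from 10.0.0.19 port 8080 established",
    "kernel panic at 0x7f3a21 on node 4",          # the unusual event
]
counts = Counter(template(l) for l in logs)
total = sum(counts.values())

for line in logs:
    if counts[template(line)] / total < 0.3:        # rare pattern => flag for review
        print("ANOMALY?", line)
```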
Procedia PDF Downloads 93
24989 The Emotional Implication of the Phraseological Fund Applied in Cognitive Business Negotiation
Authors: Kristine Dzagnidze
Abstract:
The paper centers equally on structural and cognitive linguistics in light of phraseology and its emotional implication. Accordingly, the methods elaborated within the frameworks of both systematic-structural and linguo-cognitive theories are equally relevant to this research. In other words, in studying the negotiation process, our attention is drawn to defining the peculiarities of negotiations, emotion, style and specifics of cognition, motives, aims, contextual characteristics, and the quality of cultural context and integration. In addition, the paper refers to the totality of concepts and methods connected with the stage of development of emotional linguistic thinking, which contextually correlates with the dominance of the anthropocentric-communicative paradigm. The synthesis of structuralist and cognitive perspectives has proved relevant to our research, carried out in the form of intellectual action: on the one hand, the adequacy of the research purpose to the expected results, and on the other hand, the validity of the methodology for formulating the objective conclusions needed for emotional connotation beyond phraseology. The mechanism mentioned does not claim to discover a new truth; however, it offers the possibility of a novel interpretation of existing content.
Keywords: cognitivism, communication, implication, negotiation
Procedia PDF Downloads 263
24988 EnumTree: An Enumerative Biclustering Algorithm for DNA Microarray Data
Authors: Haifa Ben Saber, Mourad Elloumi
Abstract:
In a number of domains, such as DNA microarray data analysis, we need to cluster the rows (genes) and columns (conditions) of a data matrix simultaneously to identify groups of rows that behave consistently over a group of columns. This kind of clustering is called biclustering. Biclustering algorithms are extensively used in DNA microarray data analysis, and more effective biclustering algorithms are highly desirable. We introduce a new algorithm, Enumerative Tree (EnumTree), for biclustering of binary microarray data. EnumTree adopts the approach of enumerating biclusters and extracts all biclusters of consistently good quality. Its main idea is the construction of a new tree structure that adequately represents the different biclusters discovered during the enumeration process, following the strategy of discovering all biclusters at a time. The performance of the proposed algorithm is assessed using both synthetic and real DNA microarray data; our algorithm outperforms other biclustering algorithms for binary microarray data and finds biclusters with different numbers of rows. Moreover, we test the biological significance using a gene annotation web tool to show that our proposed method is able to produce biologically relevant biclusters.
Keywords: DNA microarray, biclustering, gene expression data, tree, data mining
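As an accessible point of comparison for readers who have not worked with biclustering, scikit-learn's SpectralCoclustering finds row-column blocks in a (here synthetic, binarized) gene-by-condition matrix. This is shown only to illustrate what a bicluster is; it is not the EnumTree algorithm described above, and the data and threshold are invented.

```python
# Biclustering a synthetic binary "gene x condition" matrix with scikit-learn's
# SpectralCoclustering -- an illustration of biclustering, not the EnumTree algorithm.
import numpy as np
from sklearn.datasets import make_biclusters
from sklearn.cluster import SpectralCoclustering

data, rows, cols = make_biclusters(shape=(60, 40), n_clusters=3,
                                   noise=0.1, random_state=0)
binary = (data > 1.0).astype(int)      # block values >= 10, background ~0 => clean 0/1

model = SpectralCoclustering(n_clusters=3, random_state=0).fit(binary)
for k in range(3):
    genes = np.where(model.rows_[k])[0]
    conditions = np.where(model.columns_[k])[0]
    print(f"bicluster {k}: {len(genes)} genes x {len(conditions)} conditions")
```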
Procedia PDF Downloads 369
24987 The Impact of Financial Reporting on Sustainability
Authors: Lynn Ruggieri
Abstract:
The worldwide pandemic has only increased sustainability awareness. The public is demanding that businesses be held accountable for their impact on the environment. While financial data enjoys uniformity in reporting requirements, there are no uniform reporting requirements for non-financial data. Europe is leading the way with some standards being implemented for reporting non-financial sustainability data; however, there is no uniformity globally. And without uniformity, there is not a clear understanding of what information to include and how to disclose it. Sustainability reporting will provide important information to stakeholders and will enable businesses to understand their impact on the environment. Therefore, there is a crucial need for this data. This paper looks at the history of sustainability reporting in the countries of the European Union and throughout the world and makes a case for worldwide reporting requirements for sustainability.
Keywords: financial reporting, non-financial data, sustainability, global financial reporting
Procedia PDF Downloads 177
24986 Effect of Ti+ Irradiation on the Photoluminescence of TiO2 Nanofibers
Authors: L. Chetibi, D. Hamana, T. O. Busko, M. P. Kulish, S. Achour
Abstract:
TiO2 nanostructures have attracted much attention due to their optical, dielectric, and photocatalytic properties, as well as applications including optical coatings, photocatalysis, and photoelectrochemical solar cells. This work prepares TiO2 nanofibers (NFs) on a titanium (Ti) substrate by in situ oxidation of Ti foils in a mixed solution of concentrated H2O2 and NaOH, followed by proton exchange and calcination. Scanning electron microscopy (SEM) revealed an obvious network of TiO2 nanofibers. The photoluminescence (PL) spectra of these nanostructures revealed a broad, intense band in the visible range with a reduced near-band-edge emission. The PL bands in the visible region result mainly from surface oxygen vacancies and other defects. After irradiation with Ti+ ions (irradiation energy E = 140 keV, dose 10^13 ions/cm2), the intensity of the PL spectrum decreased as a consequence of the radiation treatment. The irradiation with Ti+ leads to a reduction of defects and the generation of non-radiative defects near the conduction band, as evidenced by the PL results. On the other hand, reducing the surface defects of TiO2 nanostructures may improve the photocatalytic and optoelectronic properties of this nanostructure.
Keywords: TiO2, nanofibers, photoluminescence, irradiation
Procedia PDF Downloads 243
24985 Thai Student Ability on Speexx Language Training Program
Authors: Toby Gibbs, Glen Craigie, Suwaree Yordchim
Abstract:
Using the Speexx online language training program with Thai students has allowed us to evaluate their learning comprehension and track their progression through the English language program. Speexx sets the standard for excellence and innovation in web-based language training and online coaching services, and the program is designed to improve the business communication skills of Thai language learners. Speexx consists of English lessons, exercises, tests, web boards, and supplementary lessons to help students practice English. The sample group was 191 Thai sophomores studying Business English in the Department of Humanities and Social Science. The data were analyzed using standard deviation (S.D.) values from questionnaires and samples provided by the Speexx training program. The results found that most Thai sophomores fail the Speexx training program because their comprehension of the English language is below average. With persistent efforts on new training methods, the success of the Speexx language training program can break through cultural barriers and help future students adopt English as a second language. The Speexx results revealed four main factors affecting success: 1) future English training should be pursued in applied Speexx development; 2) Thai students did not see the benefit of having an online language training program; 3) there is a great need to educate the next generation of learners on the benefits of Speexx within the community; and 4) a great majority of Thai sophomores did not know what Speexx was. A guideline for self-reliance planning consisted of four aspects: 1) development planning: arranging groups to further improve English abilities with the Speexx language training program and encouraging the use of Speexx every day; local communities need to develop awareness of the usefulness of Speexx and share the value of using the program among family and friends; 2) Humanities and Social Science staff should develop skills in using this online language training program to expand on the benefits of Speexx within their departments; 3) further research should be pursued on Thai students' progression with Speexx and how it helps them improve their language skills in Business English; and 4) universities and language centers should focus on using Speexx to encourage learning of any language, not just English.
Keywords: ability, comprehension, sophomore, speexx
Procedia PDF Downloads 368
24984 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies
Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk
Abstract:
Recently, the application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has resulted in a significant focus on the risks to the security and privacy of healthcare data and on the corresponding protection measures, leading to escalated analysis and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve data privacy in AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)
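Of the privacy-preserving methods listed above, differential privacy is the easiest to show compactly: the Laplace mechanism adds calibrated noise to an aggregate query so that any single patient's presence changes the answer distribution only slightly. The sketch below uses invented records and is a textbook illustration, not a production healthcare implementation or the project's specific design.

```python
# Laplace-mechanism sketch of differential privacy on a count query (toy data).
import numpy as np

rng = np.random.default_rng(7)
diagnoses = ["diabetes", "asthma", "diabetes", "hypertension", "diabetes", "asthma"]

def dp_count(records, condition, epsilon=0.5, sensitivity=1.0):
    """Differentially private count of records matching `condition`.
    Sensitivity is 1 because adding/removing one patient changes the count by 1."""
    true_count = sum(r == condition for r in records)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

print(round(dp_count(diagnoses, "diabetes", epsilon=0.5), 2))   # noisier, more private
print(round(dp_count(diagnoses, "diabetes", epsilon=5.0), 2))   # less noise, less privacy
```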
Procedia PDF Downloads 92
24983 The Economic Geology of Ijero Ekiti, South Western Nigeria: A Need for Sustainable Mining for a Responsible Socio-Economic Growth and Development
Authors: Olagunju John Olusesan-Remi
Abstract:
The study area, Ijero-Ekiti, falls within the Ilesha-Ekiti schist belt, originating from the long years of Pan-African orogenic events and various cataclastic tectonic activities in history. Ijero-Ekiti is situated between latitudes 7°45′N and 7°55′N and lies between the Dahomean Basin and the southern Bida/Benue Basin on the geological map of Nigeria. This research work centers mainly on investigating the chemical composition and the mineralogical distribution of the various mineral-bearing rocks that compose the study area. The work is essentially carried out with a view to assessing and ascertaining the economic potential and industrial significance of the area for the Ekiti south-western region and for Nigeria as a whole. The mineralogical distribution pattern is of particular interest to us in this study. In this regard, the focus is mostly on the distribution of economic gemstones within the various mineral-bearing rocks in the zone, which include the tourmaline formation, cassiterite deposits, tin ore, tantalum-columbite, smoky quartz, amethyst, and polychrome and emerald varieties of beryl, among others, as they occur within the older granite of the Precambrian rocks. To this end, samples of the major rock types were taken from various locations within the study area for detailed scientific analysis, including the Igemo pegmatite of Ijero west, the epidiorite of Idaho, the biotite-hornblende gneiss of Ikoro-Ijero north, and the beryl crystalline rock types, to mention a few. Slides of each rock from the aforementioned zones were prepared and viewed under a crossed-Nicol petrographic microscope, with particular focus on the light reflection ability of the constituent minerals in each rock sample. The results of the physical analysis showed that the pegmatite samples range in colour from pure milky white to fairly pinkish. Other physical properties investigated include streak, luster, form, specific gravity, and cleavage/fracture pattern. The optical examination centers on the refractive indices and pleochroism of the minerals present, while the chemical analysis reveals differing correlation coefficients of the various oxides in each tourmaline sample collected, through which the presence of the minerals was established. In conclusion, it was inferred that the various minerals outlined above are present in reasonable quantity within the Ijero area. With the above discoveries, we strongly recommend a detailed scientific investigation that will lead to comprehensive mining of the area. Above all, it is our conclusion that comprehensive mineralogical exploitation of this area will not only boost its socio-economic potential but will also contribute immensely to the socio-economic growth and development of Nigeria at large.
Keywords: Ijero Ekiti, Southwestern Nigeria, economic minerals, pegmatite of the Pan-African origin, cataclastic tectonic activities, Ilesha schist belt, Precambrian formations
Procedia PDF Downloads 254
24982 Mapping Tunnelling Parameters for Global Optimization in Big Data via Dye Laser Simulation
Authors: Sahil Imtiyaz
Abstract:
One of the biggest challenges has emerged from the ever-expanding, dynamic, and instantaneously changing space of Big Data; finding a data point in this space and extracting wisdom from it is a hard task. In this paper, we reduce the space of big data to a Hamiltonian formalism that is in concordance with the Ising model. For this formulation, we simulate the system using a dye laser in FORTRAN and analyse the dynamics of the data point in the energy well of the rhodium atom. After mapping the photon intensity and pulse width onto energy and potential, we conclude that as we increase the energy, the probability of tunnelling also increases up to some point, then starts decreasing and finally shows randomizing behaviour. This is due to decoherence with the environment and hence a loss of 'quantumness'. This interprets the efficiency parameter and the extent of quantum evolution. The results strongly encourage the use of the 'topological property' as a source of information instead of the qubit.
Keywords: big data, optimization, quantum evolution, hamiltonian, dye laser, fermionic computations
Procedia PDF Downloads 192
24981 Trauma Scores and Outcome Prediction After Chest Trauma
Authors: Mohamed Abo El Nasr, Mohamed Shoeib, Abdelhamid Abdelkhalik, Amro Serag
Abstract:
Background: Early assessment of the severity of chest trauma, either blunt or penetrating, is of critical importance in predicting patient outcome. Different trauma scoring systems are widely available and are based on anatomical or physiological parameters to predict patient morbidity or mortality. Up till now, there is no ideal, universally accepted trauma score that can be applied in all trauma centers and is suitable for assessing the severity of chest trauma. Aim: Our aim was to compare various trauma scoring systems regarding their ability to predict morbidity and mortality in chest trauma patients. Patients and Methods: This was a prospective study including 400 patients with chest trauma who were managed at Tanta University Emergency Hospital, Egypt, over a period of 2 years (March 2014 until March 2016). The patients were divided into 2 groups according to the mode of trauma: blunt or penetrating. The collected data included age, sex, hemodynamic status on admission, intrathoracic injuries, and associated extra-thoracic injuries. Patient outcomes, including mortality, need for thoracotomy, need for ICU admission, need for mechanical ventilation, length of hospital stay, and the development of acute respiratory distress syndrome, were also recorded. The relevant data were used to calculate the following trauma scores: 1. anatomical scores, including the Abbreviated Injury Scale (AIS), Injury Severity Score (ISS), New Injury Severity Score (NISS), and Chest Wall Injury Scale (CWIS); 2. physiological scores, including the Revised Trauma Score (RTS) and Acute Physiology and Chronic Health Evaluation II (APACHE II) score; 3. the combined Trauma and Injury Severity Score (TRISS); and 4. the chest-specific Thoracic Trauma Severity Score (TTSS). All these scores were analyzed statistically to determine their sensitivity and specificity and were compared regarding their power to predict mortality and morbidity in blunt and penetrating chest trauma patients. Results: The incidence of mortality was 3.75% (15/400). Eleven patients (11/230) died in the blunt chest trauma group, while four patients (4/170) died in the penetrating trauma group. The mortality rate increased more than threefold, reaching 13% (13/100), in patients with severe chest trauma (ISS > 16). The physiological scores APACHE II and RTS had the highest predictive value for mortality in both blunt and penetrating chest injuries. The physiological score APACHE II, followed by the combined score TRISS, was more predictive for intensive care admission in penetrating injuries, while RTS was more predictive in blunt trauma. RTS also had a higher predictive value for the need for mechanical ventilation, followed by the combined score TRISS. The APACHE II score was more predictive for the need for thoracotomy in penetrating injuries, and the chest-specific TTSS was more predictive in blunt injuries. The anatomical score ISS and the TTSS were more predictive for prolonged hospital stay in penetrating and blunt injuries, respectively. Conclusion: Trauma scores that include physiological parameters have a higher predictive power for mortality in both blunt and penetrating chest trauma. They are more suitable for assessing injury severity and predicting patient outcomes.
Keywords: chest trauma, trauma scores, blunt injuries, penetrating injuries
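For readers unfamiliar with the physiological and combined scores compared above, the sketch below computes RTS from coded Glasgow Coma Scale, systolic blood pressure, and respiratory rate values, and a TRISS probability of survival from RTS, ISS, and an age index. The coefficients are the commonly cited MTOS values and are given for illustration only; they should be checked against the score revision used in any given registry and are not taken from this study.

```python
# Sketch of the RTS and TRISS calculations referred to above (commonly cited MTOS
# coefficients; verify against the registry revision in use before clinical application).
import math

def code_gcs(gcs):
    return 4 if gcs >= 13 else 3 if gcs >= 9 else 2 if gcs >= 6 else 1 if gcs >= 4 else 0

def code_sbp(sbp):
    return 4 if sbp > 89 else 3 if sbp >= 76 else 2 if sbp >= 50 else 1 if sbp >= 1 else 0

def code_rr(rr):
    return 4 if 10 <= rr <= 29 else 3 if rr > 29 else 2 if rr >= 6 else 1 if rr >= 1 else 0

def revised_trauma_score(gcs, sbp, rr):
    return 0.9368 * code_gcs(gcs) + 0.7326 * code_sbp(sbp) + 0.2908 * code_rr(rr)

TRISS_COEFF = {  # (b0, b1 for RTS, b2 for ISS, b3 for age index)
    "blunt":       (-0.4499, 0.8085, -0.0835, -1.7430),
    "penetrating": (-2.5355, 0.9934, -0.0651, -1.1360),
}

def triss_probability_of_survival(rts, iss, age, mechanism="blunt"):
    b0, b1, b2, b3 = TRISS_COEFF[mechanism]
    age_index = 1 if age >= 55 else 0
    b = b0 + b1 * rts + b2 * iss + b3 * age_index
    return 1.0 / (1.0 + math.exp(-b))

rts = revised_trauma_score(gcs=14, sbp=100, rr=22)     # example blunt chest trauma
print(round(rts, 3), round(triss_probability_of_survival(rts, iss=20, age=45), 3))
```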
Procedia PDF Downloads 420
24980 Applying Different Steganography Techniques in Cloud Computing Technology to Improve Cloud Data Privacy and Security Issues
Authors: Muhammad Muhammad Suleiman
Abstract:
Cloud computing is a versatile concept that refers to a service that allows users to outsource their data without having to worry about local storage issues. However, the most pressing issue to be addressed is maintaining a secure and reliable data repository rather than relying on untrustworthy service providers. In this study, we look at how steganography approaches, in combination with digital watermarking, can greatly improve the system's effectiveness and data security when used for cloud computing. The main requirement of such frameworks, where data is transferred or exchanged between servers and users, is safe data management in cloud environments. Steganography is among the most effective methods for safe communication in the cloud. It is a method of writing coded messages in such a way that only the sender and recipient can safely interpret and display the information hidden in the communication channel. This study presents a new text steganography method for hiding a hidden English text file inside a cover English text file to ensure data protection in cloud computing. Data protection, data hiding capability, and time were all improved using the proposed technique.
Keywords: cloud computing, steganography, information hiding, cloud storage, security
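To illustrate what text steganography means in practice, the toy sketch below hides the bits of a secret message as zero-width Unicode characters appended to a cover text, which displays identically to the original. This is a deliberately simple, well-known trick for illustration; it is not the method proposed in the paper and offers no cryptographic protection on its own.

```python
# Toy text-steganography sketch (not the proposed method): hide the bits of a secret
# message as zero-width Unicode characters appended to a cover text.
ZERO = "\u200b"   # zero-width space      -> bit 0
ONE  = "\u200c"   # zero-width non-joiner -> bit 1

def hide(cover: str, secret: str) -> str:
    bits = "".join(f"{b:08b}" for b in secret.encode("utf-8"))
    return cover + "".join(ONE if bit == "1" else ZERO for bit in bits)

def reveal(stego: str) -> str:
    bits = "".join("1" if ch == ONE else "0" for ch in stego if ch in (ZERO, ONE))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego_text = hide("Quarterly report attached as discussed.", "key:42")
print(stego_text == "Quarterly report attached as discussed.")  # False, yet looks identical
print(reveal(stego_text))                                        # "key:42"
```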
Procedia PDF Downloads 190
24979 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics
Authors: Farhad Asadi, Mohammad Javad Mollakazemi
Abstract:
In this paper, Bayesian online inference in models of data series is constructed with a change-point algorithm, which separates the observed time series into independent segments and studies the change and variation of the data regime together with the related statistical characteristics. Variation in the statistical characteristics of time series data often represents distinct phenomena in a dynamical system, such as a change in brain state reflected in EEG signal measurements or a change in an important data regime in many dynamical systems. In this paper, a prediction algorithm for studying change-point locations in time series data is simulated. It is verified that the pattern of the proposed data distribution is an important factor for simpler and smoother fluctuation of the hazard rate parameter and for better identification of change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases for several dynamical systems.
Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm
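For concreteness, a compact Bayesian online change-point recursion in the style of Adams and MacKay (Gaussian observations with a Normal-Inverse-Gamma conjugate prior and a constant hazard rate) can be sketched as follows. The hyperparameters and the synthetic example are illustrative assumptions; this is not the authors' exact model or hazard specification.

```python
# Bayesian online change-point detection sketch: constant hazard, Gaussian data with
# unknown mean/variance (Normal-Inverse-Gamma prior, Student-t predictive).
import numpy as np
from scipy import stats

def bocpd_gaussian(data, hazard=1 / 100, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    T = len(data)
    R = np.zeros((T + 1, T + 1))              # run-length posterior, row t = time t
    R[0, 0] = 1.0
    mu, kappa, alpha, beta = (np.array([mu0]), np.array([kappa0]),
                              np.array([alpha0]), np.array([beta0]))
    for t, x in enumerate(data, start=1):
        # Predictive probability of x under each current run length (Student-t).
        pred = stats.t.pdf(x, df=2 * alpha, loc=mu,
                           scale=np.sqrt(beta * (kappa + 1) / (alpha * kappa)))
        growth = R[t - 1, :t] * pred * (1 - hazard)   # run continues
        cp = (R[t - 1, :t] * pred * hazard).sum()     # change point occurs now
        R[t, 1:t + 1] = growth
        R[t, 0] = cp
        R[t, :t + 1] /= R[t, :t + 1].sum()
        # Conjugate parameter updates for every possible run length.
        mu_new = (kappa * mu + x) / (kappa + 1)
        beta_new = beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1))
        mu = np.concatenate(([mu0], mu_new))
        kappa = np.concatenate(([kappa0], kappa + 1))
        alpha = np.concatenate(([alpha0], alpha + 0.5))
        beta = np.concatenate(([beta0], beta_new))
    return R

# Illustration on synthetic data with a mean shift halfway through.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(4.0, 1.0, 100)])
run_length_posterior = bocpd_gaussian(series)
print(run_length_posterior[-1].argmax())   # most probable current run length (~100)
```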
Procedia PDF Downloads 424
24978 Mechanical Properties of Selective Laser Sintered 304L Stainless Steel Powders
Authors: Shijie Liu, Jehnming Lin
Abstract:
This study discusses the mechanical properties of selective laser sintered 304L stainless steel powder specimens. In single-layer specimen sintering, the microstructure and porosity were observed to find the proper sintering parameters. A multi-layer sintering experiment was then conducted, and based on the microstructure and the integration between layers, suitable parameters were identified. Finally, the sintered specimens were examined by metallographic inspection, hardness testing, tensile testing, and surface morphology measurement. A structure of molten powder coated with unmelted powder was found in the metallographic examination. The hardness of the sintered stainless steel powder is greater than that of the raw material. The tensile strength is lower than that of the raw material and depends on the scanning path, with specimens showing different cracking patterns. It was found that the helical-scanning-path specimen exhibits warpage deformation at its edge, while the surface of the S-scan-path specimen is relatively flat.
Keywords: laser sintering, sintering path, microstructure, mechanical properties
Procedia PDF Downloads 159
24977 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining
Authors: İbrahi̇m Kara, Seher Arslankaya
Abstract:
Frequently preferred in the field of engineering in particular, data mining has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. With data mining, the aim is to reveal models from large amounts of raw data in line with a given purpose and to search for the rules and relationships that will enable predictions about the future to be made from large data sets. It helps the decision-maker find the relationships among the data that emerge at the decision-making stage. This study aims to determine the risk of heart attack at the first stage, to control it, and to plan the related resources with the method of data mining. Through the early and correct diagnosis of heart attacks, the aim is to reveal the factors that affect the disease, to protect health and choose the right treatment methods, to reduce health expenditures, and to shorten the duration of patients' hospital stays. In this way, the diagnosis and treatment costs of a heart attack will be scrutinized, which will be useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.
Keywords: data mining, decision support systems, heart attack, health sector
Procedia PDF Downloads 355
24976 Bayesian Borrowing Methods for Count Data: Analysis of Incontinence Episodes in Patients with Overactive Bladder
Authors: Akalu Banbeta, Emmanuel Lesaffre, Reynaldo Martina, Joost Van Rosmalen
Abstract:
Including data from previous studies (historical data) in the analysis of the current study may reduce the sample size requirement and/or increase the power of the analysis. The most common example is incorporating historical control data in the analysis of a current clinical trial. However, this only applies when the historical control data are similar enough to the current control data. Recently, several Bayesian approaches for incorporating historical data have been proposed, such as the meta-analytic-predictive (MAP) prior and the modified power prior (MPP), both for a single control arm as well as for multiple historical control arms. Here, we examine the performance of the MAP and MPP approaches for the analysis of (over-dispersed) count data. To this end, we propose a computational method for the MPP approach for the Poisson and negative binomial models. We conducted an extensive simulation study to assess the performance of the Bayesian approaches. Additionally, we illustrate our approaches on an overactive bladder data set. For similar data across the control arms, the MPP approach outperformed the MAP approach with respect to statistical power. When the means across the control arms are different, the MPP yielded a slightly inflated type I error (TIE) rate, whereas the MAP did not. In contrast, when the dispersion parameters are different, the MAP gave an inflated TIE rate, whereas the MPP did not. We conclude that the MPP approach is more promising than the MAP approach for incorporating historical count data.
Keywords: count data, meta-analytic prior, negative binomial, poisson
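The power-prior idea for Poisson counts has a convenient conjugate form: with a Gamma initial prior, historical events enter the posterior down-weighted by a borrowing parameter a0 in [0, 1]. The sketch below shows this fixed-a0 conjugate update with invented episode counts; it is a simplification of the modified power prior studied in the paper, which treats a0 as random and also handles over-dispersion via the negative binomial model.

```python
# Conjugate fixed-a0 power-prior sketch for Poisson counts (invented data): historical
# counts enter the Gamma posterior down-weighted by a0; a0 = 0 ignores history entirely.
import numpy as np
from scipy import stats

historical = np.array([4, 2, 5, 3, 4, 6, 2, 3])   # weekly episode counts, past study
current    = np.array([3, 1, 4, 2, 2, 3])         # counts in the current control arm
a0 = 0.5                                          # borrowing weight in [0, 1]
a_prior, b_prior = 0.001, 0.001                   # vague Gamma(a, b) initial prior

a_post = a_prior + a0 * historical.sum() + current.sum()
b_post = b_prior + a0 * len(historical) + len(current)
posterior = stats.gamma(a=a_post, scale=1.0 / b_post)

print(f"posterior mean rate: {posterior.mean():.2f} episodes/week")
print("95% credible interval:", np.round(posterior.interval(0.95), 2))
```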
Procedia PDF Downloads 116
24975 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill-in the Gaps in Their Analysis
Authors: John Gaber
Abstract:
Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice for planners is their use of data from local experts and stakeholders (known as "etic" data or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at who planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the "community's view." The paper concludes with how planners can chart out a new empirical path in their execution of emic/etic citizen participation strategies in their applied planning research projects.
Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)
Procedia PDF Downloads 483