Search results for: violation data discovery
25202 Exposure to Social Media Shared Video-Clips on Irregularities from the 2023 Election in Nigeria and Audience Perception of the Outcome
Authors: Obiakor Casmir Uchenna, Ikegbunam Peter Chierike, Ezeja Perpetual Chisom
Abstract:
Irregularities have been a major feature of Nigerian political activity since 1999. The rate at which such impunities thrive in the country has made elections grossly unacceptable among the people, because the outcomes have never reflected the wish of the masses. Conscious of this, citizens have subscribed to the use of social media in exposing the ugly faces of the country's elections, which have always worked against the less privileged. This study explores the relationship between exposure to social media shared video-clips and the respondents' perception of the 2023 presidential election in Nigeria. The general objective of the study is to find out what the respondents make of the election as a result of the video-clips shared on different social media platforms showing electoral irregularities. The study adopted the survey research method in studying 378 university undergraduates from NAU, COOU and Paul University selected through the purposive sampling technique. The study was premised on the theoretical provisions of expectancy violation theory. Findings revealed that the respondents are well exposed to different video-clips showing irregularities in the election. It was also found that the respondents have a negative perception of the election. It was concluded that the electoral umpire, the government in power and the security apparatus violated the respondents' expectations of the election, based on the pre-election promises made to the citizens. It was recommended, among others, that Nigeria must strengthen the various institutions responsible for the conduct of elections if violence is not to become the only option left to the poor masses.
Keywords: social media shared video-clips, exposure, irregularities, elections, audience perception, outcome
Procedia PDF Downloads 62
25201 How to Talk about It without Talking about It: Cognitive Processing Therapy Offers Trauma Symptom Relief without Violating Cultural Norms
Authors: Anne Giles
Abstract:
Humans naturally wish they could forget traumatic experiences. To help prevent future harm, however, the human brain has evolved to retain data about experiences of threat, alarm, or violation. When given compassionate support and assistance with thinking helpfully and realistically about traumatic events, most people can adjust to experiencing hardships, albeit with residual sad, unfortunate memories. Persistent, recurrent, intrusive memories, difficulty sleeping, emotion dysregulation, and avoidance of reminders, however, may be symptoms of Post-traumatic Stress Disorder (PTSD). Brain scans show that PTSD affects brain functioning. We currently have no physical means of restoring the system of brain structures and functions involved with PTSD. Medications may ease some symptoms but not others. However, forms of "talk therapy" with cognitive components have been found by researchers to reduce, even resolve, a broad spectrum of trauma symptoms. Many cultures have taboos against talking about hardships. Individuals may present themselves to mental health care professionals with severe, disabling trauma symptoms but, because of cultural norms, be unable to speak about them. In China, for example, relationship expectations may include the belief, "Bad things happening in the family should stay in the family (jiāchǒu bùkě wàiyán 家丑不可外扬)." The concept of "family (jiā 家)" may include partnerships, close and extended families, communities, companies, and the nation itself. In contrast to many trauma therapies, Cognitive Processing Therapy (CPT) for Post-traumatic Stress Disorder asks its participants to focus not on "what" happened but on "why" they think the trauma(s) occurred. The question "why" activates and exercises cognitive functioning. Brain scans of individuals with PTSD reveal executive functioning portions of the brain inadequately active, with emotion centers overly active. CPT conceptualizes PTSD as a network of cognitive distortions that keep an individual "stuck" in this under-functioning and over-functioning dynamic. Through asking participants forms of the question "why," plus offering a protocol for examining answers and relinquishing unhelpful beliefs, CPT assists individuals in consciously reactivating the cognitive, executive functions of their brains, thus restoring normal functioning and reducing distressing trauma symptoms. The culturally sensitive components of CPT that allow people to "talk about it without talking about it" may offer the possibility for worldwide relief from symptoms of trauma.
Keywords: cognitive processing therapy (CPT), cultural norms, post-traumatic stress disorder (PTSD), trauma recovery
Procedia PDF Downloads 215
25200 The Perspective on Data Collection Instruments for Younger Learners
Authors: Hatice Kübra Koç
Abstract:
For academia, collecting reliable and valid data is one of the most significant issues for researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common data collection tools such as questionnaires, interviews, and semi-structured interviews; yet for young learners, and especially very young ones, such reliable and valid data collection tools cannot be easily designed or applied by researchers. In this study, firstly, common data collection tools are examined for the 'very young' and 'young learner' participant groups, since the quality and efficiency of an academic study rest mainly on a valid and correct data collection and data analysis procedure. Secondly, two different data collection instruments for very young and young learners are presented and their efficacy discussed. Finally, a suggested data collection tool, a performance-based questionnaire developed specifically for the 'very young' and 'young learner' participant groups in the field of teaching English to young learners as a foreign language, is presented. The design procedure and suggested items/factors for the proposed tool are laid out at the end of the study to help researchers who work with young and very young learners.
Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners
Procedia PDF Downloads 94
25199 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models tailored to time-series data, to generate synthetic time-series data based on Swarm satellite data, which will be used for detecting seismic anomalies. The LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and the challenges of applying deep learning techniques to synthetic data generation, underscoring the potential of deep learning for generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
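As a rough illustration of the LSTM half of this approach, the sketch below trains a small network on a univariate series and then rolls it forward on its own predictions to synthesize new values; the stand-in sine signal, window size, and layer width are assumptions for illustration, not the paper's configuration.

```python
# A minimal sketch of LSTM-based synthetic time-series generation,
# assuming a univariate signal; all hyperparameters are illustrative.
import numpy as np
from tensorflow import keras

def make_windows(series, w=24):
    X = np.stack([series[i:i + w] for i in range(len(series) - w)])
    return X[..., None], series[w:]          # (samples, timesteps, 1), targets

series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * np.random.randn(2000)
X, y = make_windows(series)

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(X.shape[1], 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Roll the trained model forward on its own output to synthesize data.
window = list(series[-24:])
synthetic = []
for _ in range(200):
    pred = model.predict(np.array(window)[None, :, None], verbose=0)
    synthetic.append(pred[0, 0])
    window = window[1:] + [pred[0, 0]]
```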
Procedia PDF Downloads 34
25198 Screening of New Antimicrobial Agents from Heterocyclic Derivatives
Authors: W. Mazari, K. Boucherit, Z. Boucherit-Otmani, M. N. Rahmoun, M. Benabdallah
Abstract:
The hospital, or any other care establishment, can be considered an ecosystem where the patient comes into contact with a frightening microbial universe and runs the risk of contracting an infection referred to as nosocomial or health care-associated. In recent years, the incidence of these infections has risen sharply. Several microorganisms cause these nosocomial infections, and the emergence of microbial strains resistant to antibiotics creates a danger to public health. The search for new antimicrobial agents to overcome this problem has produced interesting compounds through chemical synthesis, which plays a very important role in the research and discovery of new drugs. It is in this framework that our study was conducted at our laboratory: it evaluates the antibacterial activity of thirteen synthesized 2-pyridone derivatives using two methods, the disc diffusion method and the dilution method, against eight Gram-negative bacterial strains. The results seem interesting, especially for two products that showed the best activities against Escherichia coli ATCC 25922 and Enterobacter cloacae ATCC 13047, with a MIC of 512 µg/ml.
Keywords: heterocyclic derivatives, chemical synthesis, antimicrobial activity, biotechnology
Procedia PDF Downloads 369
25197 Generation of Quasi-Measurement Data for On-Line Process Data Analysis
Authors: Hyun-Woo Cho
Abstract:
To ensure the safety of a manufacturing process, one should quickly identify the assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been utilized. However, such methods suffer from a major problem of small sample size, mostly attributable to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is quite efficient in terms of computational speed and memory usage, so on-line implementation for monitoring and diagnosis purposes is straightforward.
Keywords: data analysis, diagnosis, monitoring, process data, quality control
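The abstract names similarity and importance indices without defining them, so the following sketch only illustrates the general idea of synthesizing quasi-measurements from existing samples; the Gaussian-kernel similarity and variance-based importance weight are assumptions made purely for illustration.

```python
# A hedged sketch of generating quasi-measurement samples from existing
# process data; the paper's actual indices are not specified here.
import numpy as np

def quasi_samples(X, n_new=50, k=5, sigma=1.0, seed=None):
    rng = np.random.default_rng(seed)
    importance = X.var(axis=0) + 1e-9          # assumed importance index
    new = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm((X - X[i]) / np.sqrt(importance), axis=1)
        sim = np.exp(-(d / sigma) ** 2)        # assumed similarity index
        sim[i] = 0.0
        nbrs = np.argsort(sim)[-k:]            # k most similar samples
        w = sim[nbrs] / sim[nbrs].sum()
        new.append(w @ X[nbrs])                # convex combination
    return np.array(new)

X = np.random.randn(40, 6)                      # 40 measurements, 6 variables
X_aug = np.vstack([X, quasi_samples(X)])        # enlarged reference set
```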
Procedia PDF Downloads 483
25196 Cotton Fiber Quality Improvement by Introducing Sucrose Synthase (SuS) Gene into Gossypium hirsutum L.
Authors: Ahmad Ali Shahid, Mukhtar Ahmed
Abstract:
The demand for long-staple fiber with better strength and length is increasing with the introduction of a modern spinning and weaving industry in Pakistan. Work on gene discovery from developing cotton fibers has helped to identify dozens of genes that take part in cotton fiber development, and several genes have been characterized for their role in fiber development. Sucrose synthase (SuS) is a key enzyme in the metabolism of sucrose in a plant cell; in cotton fiber it catalyzes a reversible reaction, but preferentially converts sucrose and UDP into fructose and UDP-glucose. UDP-glucose (UDPG) is a nucleotide sugar that acts as a donor of glucose residues in many glycosylation reactions, is essential for the cytosolic formation of sucrose, and is involved in the synthesis of cell wall cellulose. The study focused on the successful Agrobacterium-mediated stable transformation of the SuS gene, carried in pCAMBIA 1301, into cotton under the CaMV35S promoter. Integration and expression of the gene were confirmed by PCR, GUS assay, and real-time PCR. Young leaves of SuS-overexpressing lines showed increased total soluble sugars and plant biomass compared to non-transgenic control plants. Cellulose content of the fiber was significantly increased. SEM analysis revealed that fibers from transgenic cotton were highly spiral, and the number of fiber twists per unit length increased when compared with the control. Morphological data from field plants showed that transgenic plants performed better under field conditions. Incorporation of genes related to cotton fiber length and quality can provide new avenues for fiber improvement. The utilization of this technology would provide efficient import substitution and sustained production of long-staple fiber in Pakistan to fulfill industrial requirements.
Keywords: agrobacterium-mediated transformation, cotton fiber, sucrose synthase gene, staple length
Procedia PDF Downloads 235
25195 The Creative Unfolding of "Reduced Descriptive Structures" in Musical Cognition: Technical and Theoretical Insights Based on the OpenMusic and PWGL Long-Term Feedback
Authors: Jacopo Baboni Schilingi
Abstract:
We describe here the theoretical and philosophical understanding gained from the long-term use and development of algorithmic computer-based tools applied to music composition. The findings of our research lead us to interrogate some specific processes and systems of communication engaged in the discovery of specific cultural artworks: artistic creation in the sono-musical domain. Our hypothesis is that the patterns of auditory learning cannot be understood solely in terms of social transmission; they should also be questioned in terms of how they rely on various ranges of acoustic stimuli and modes of consciousness, and of how the different types of memory engaged in the percept-action expressive systems of our cultural communities also rely on the shadowy conscious entities we have named "Reduced Descriptive Structures".
Keywords: algorithmic sonic computation, corrected and self-correcting learning patterns in acoustic perception, morphological derivations in sensorial patterns, social unconscious modes of communication
Procedia PDF Downloads 157
25194 Emerging Technology for Business Intelligence Applications
Authors: Hsien-Tsen Wang
Abstract:
Business Intelligence (BI) has long helped organizations make informed decisions based on data-driven insights and gain competitive advantages in the marketplace. In the past two decades, businesses witnessed not only the dramatically increasing volume and heterogeneity of business data but also the emergence of new technologies, such as Artificial Intelligence (AI), Semantic Web (SW), Cloud Computing, and Big Data. It is plausible that the convergence of these technologies would bring more value out of business data by establishing linked data frameworks and connecting in ways that enable advanced analytics and improved data utilization. In this paper, we first review and summarize current BI applications and methodology. Emerging technologies that can be integrated into BI applications are then discussed. Finally, we conclude with a proposed synergy framework that aims at achieving a more flexible, scalable, and intelligent BI solution.
Keywords: business intelligence, artificial intelligence, semantic web, big data, cloud computing
Procedia PDF Downloads 98
25193 Using Equipment Telemetry Data for Condition-Based Maintenance Decisions
Authors: John Q. Todd
Abstract:
Given that modern equipment can provide comprehensive health, status, and error-condition data via built-in sensors, maintenance organizations have a new and valuable source of insight to take advantage of. This presentation will show what these data payloads might look like and how they can be filtered, visualized, calculated into metrics, used for machine learning, and turned into alerts for further action.
Keywords: condition based maintenance, equipment data, metrics, alerts
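A minimal sketch of the metrics-and-alerts step described above: a raw sensor stream is smoothed into a rolling metric and compared against a limit. The sensor name, window length, and threshold are illustrative assumptions, not values from the presentation.

```python
# Turn raw telemetry into a rolling metric and a threshold alert.
import pandas as pd

telemetry = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=8, freq="h"),
    "bearing_temp_c": [61, 62, 60, 63, 71, 78, 84, 90],
})

# Metric: 3-sample rolling mean smooths out single-reading spikes.
telemetry["temp_3h_mean"] = telemetry["bearing_temp_c"].rolling(3, min_periods=1).mean()

LIMIT_C = 75.0                                   # assumed alert threshold
alerts = telemetry[telemetry["temp_3h_mean"] > LIMIT_C]

for _, row in alerts.iterrows():
    print(f"ALERT {row.timestamp}: mean bearing temp "
          f"{row.temp_3h_mean:.1f} C exceeds {LIMIT_C} C")
```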
Procedia PDF Downloads 189
25192 Education and Development: An Overview of Islam
Authors: Rasheed Sanusi Adeleke
Abstract:
Several attempts have been made by scholars, both medieval and contemporary, to assess the impact of Islam on scientific discovery. Less attention, however, has been accorded to the historical antecedents of the early Muslim scholars who made frantic efforts towards those discoveries. Islam, as a divine religion, places a high premium on the acquisition of knowledge, especially that of the sciences. It considers knowledge a comprehensive whole, covering both the spiritual and material aspects of human life. Islam touches every aspect of human life for the growth, development, and advancement of society. The acquisition of knowledge in the humanities and social sciences, as well as in the pure and applied sciences, is comprehensively expressed in Islamic education. Not only this, history portrays the leading, indelible roles played by the early Muslims in these various fields of knowledge. That is why Islam has declared the acquisition of knowledge compulsory for all Muslims. This paper therefore analyses the contributions of Islam to civilization, with particular reference to the sciences. It also affirms that Islam is more than a religion of prayers and rituals. The work is historical, analytical, and explorative in nature. Recommendations are also put forward as suggestions for the present generation and posterity in general, and Muslims in particular.
Keywords: education, development, Islam, development and Islam
Procedia PDF Downloads 436
25191 Ethical Artificial Intelligence: An Exploratory Study of Guidelines
Authors: Ahmad Haidar
Abstract:
The rapid adoption of Artificial Intelligence (AI) technology holds unforeseen risks like privacy violation, unemployment, and algorithmic bias, prompting research institutions, governments, and companies to develop principles of AI ethics. The extensive and diverse literature on AI lacks an analysis of the evolution of the principles developed in recent years. This paper has two fundamental purposes. The first is to provide insights into how the principles of AI ethics have changed recently, including concepts like risk management and public participation; in doing so, a NOISE (Needs, Opportunities, Improvements, Strengths, & Exceptions) analysis is presented. The second is to offer a framework for building Ethical AI linked to sustainability. This research adopts an explorative approach, more specifically an inductive approach, to address the theoretical gap. Consequently, this paper tracks the different efforts to achieve "trustworthy AI" and "ethical AI", compiling a list of 12 documents released from 2017 to 2022. The analysis of this list unifies the different approaches toward trustworthy AI in two steps: first, splitting the principles into two categories, technical and net benefit, and second, testing the frequency of each principle, yielding the technical principles that may be useful for stakeholders considering the lifecycle of AI, or what is known as sustainable AI. Sustainable AI is the third wave of AI ethics and a movement to drive change throughout the entire lifecycle of AI products (i.e., idea generation, training, re-tuning, implementation, and governance) in the direction of greater ecological integrity and social fairness. In this vein, the results suggest transparency, privacy, fairness, safety, autonomy, and accountability as recommended technical principles to include in the lifecycle of AI. Another contribution is to capture the different bases that aid the process of AI for sustainability (e.g., towards the sustainable development goals). The results indicate data governance, do no harm, human well-being, and risk management as crucial AI-for-sustainability principles. This study's last contribution clarifies how the principles evolved. To illustrate, in 2018 the Montreal Declaration mentioned principles including well-being, autonomy, privacy, solidarity, democratic participation, equity, and diversity; in 2021, notions emerged from the European Commission proposal, including public trust, public participation, scientific integrity, risk assessment, flexibility, benefit and cost, and interagency coordination. The study design strengthens the validity of previous studies. Yet we advance knowledge in trustworthy AI by considering recent documents, linking principles with sustainable AI and AI for sustainability, and shedding light on the evolution of guidelines over time.
Keywords: artificial intelligence, AI for sustainability, declarations, framework, regulations, risks, sustainable AI
Procedia PDF Downloads 96
25190 Ethics Can Enable Open Source Data Research
Authors: Dragana Calic
Abstract:
The openness, availability, and sheer volume of big data have provided what some regard as an invaluable and rich dataset. Researchers, businesses, advertising agencies, and medical institutions, to name only a few, collect, share, and analyze this data to enable their processes and decision making. However, there are important ethical considerations associated with the use of big data. The rapidly evolving nature of online technologies has overtaken the many legislative, privacy, and ethical frameworks and principles that exist. For example, should we obtain consent to use people's online data, and under what circumstances can privacy considerations be overridden? Current guidance on how to appropriately and ethically handle big data is inconsistent. Consequently, this paper focuses on two quite distinct but related ethical considerations that are at the core of the use of big data for research purposes: empowering the producers of data and empowering researchers who want to study big data. The first consideration focuses on informed consent, which is at the core of empowering producers of data. In this paper, we discuss some of the complexities associated with informed consent and consider studies of producers' perceptions to inform research ethics guidelines and practice. The second consideration focuses on the researcher. Similarly, we explore studies that focus on researchers' perceptions and experiences.
Keywords: big data, ethics, producers' perceptions, researchers' perceptions
Procedia PDF Downloads 286
25189 The Emotional Implication of the Phraseological Fund Applied in Cognitive Business Negotiation
Authors: Kristine Dzagnidze
Abstract:
The paper centers equally on structural and cognitive linguistics in light of phraseologism and its emotional implication. Accordingly, the methods elaborated within the frameworks of both systematic-structural and linguo-cognitive theories are equally relevant to this research. In other words, in studying the negotiation process, our attention is drawn to defining the peculiarities of negotiations: emotion, style and specifics of cognition, motives, aims, contextual characterizations, and the quality of cultural context and integration. Reference is also made to the totality of concepts and methods connected with the stage of development of emotional linguistic thinking, which contextually correlates with the dominance of the anthropocentric-communicative paradigm. The synthesis of structuralist and cognitive perspectives has turned out to be relevant to our research, carried out in the form of intellectual action: on the one hand, the adequacy of the research purpose to the expected results; on the other hand, the validity of the methodology for formulating the objective conclusions needed for emotional connotation beyond phraseologism. The mechanism mentioned does not claim to discover a new truth; it does, however, open the possibility of a novel interpretation of the existing content.
Keywords: cognitivism, communication, implication, negotiation
Procedia PDF Downloads 264
25188 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation
Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das
Abstract:
Seismic data scaling affects the dynamic range of the data; with present-day lower storage costs and the higher reliability of hard-disk storage, scaling is generally not recommended. However, when dealing with data of different vintages, which may have been processed in 16 or even 8 bits and need to be worked alongside available 32-bit data, scaling is performed. Scaling can also amplify low-amplitude events in the deeper region that would otherwise disappear because high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure in workstations that does not rely on the default preset parameters available in most software suites. Differences in amplitude values and their distribution at different depths are probed in this exercise. Proper loading parameters are identified, and the associated steps that need to be taken care of while loading data are explained. Finally, the exercise interprets the uncertainties that might arise when correlating scaled and unscaled versions of seismic data with synthetics. As a seismic well tie correlates seismic reflection events with well markers, it is used in our study to identify regions that are enhanced and/or affected by the scaling parameter(s).
Keywords: clipping, compression, resolution, seismic scaling
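The dynamic-range trade-off described above can be made concrete with a toy sketch: loading a trace into 8 bits either clips strong shallow events or quantizes weak deep events to zero, depending on the scale factor chosen. The amplitudes below are synthetic stand-ins, not real trace values.

```python
# Illustrate clipping vs. quantization loss when forcing a wide
# dynamic-range trace into 8 bits; all numbers are synthetic.
import numpy as np

trace = np.concatenate([
    2000.0 * np.random.randn(100),   # strong shallow events
    5.0 * np.random.randn(100),      # weak deep events
]).astype(np.float32)

# Naive 8-bit load: clip to [-127, 127] with no scaling (shallow events saturate).
clipped = np.clip(trace, -127, 127).astype(np.int8)

# Scaled 8-bit load: map the full range onto 8 bits (deep events round to zero).
scale = 127.0 / np.abs(trace).max()
scaled = np.round(trace * scale).astype(np.int8)

print("deep-zone RMS, original:", np.sqrt(np.mean(trace[100:] ** 2)))
print("deep-zone RMS, clipped :", np.sqrt(np.mean(clipped[100:].astype(float) ** 2)))
print("deep-zone RMS, scaled  :", np.sqrt(np.mean((scaled[100:] / scale) ** 2)))
```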
Procedia PDF Downloads 471
25187 Matrix Method Posting
Authors: Varong Pongsai
Abstract:
The objective of this paper is to introduce a new method of accounting posting called Matrix Method Posting. The method is based on the matrix operations of pure mathematics. Although accounting is classified as a social science, many accounting operations are expressed with mathematical signs and operations, which suggests that the operations of mathematics can be applied to accounting. This paper therefore tries to map mathematical logic onto accounting logic smoothly. Following the context of discovery, a deductive approach is employed to prove a logical concept shared by mathematics and accounting. The result shows that the matrix can be used to carry out accounting operations, because matrix logic and accounting logic share the same underlying concept: balancing two sides during operations. Moreover, matrix posting has many benefits. It can help financial analysts calculate financial ratios comfortably. Furthermore, the matrix determinant, a characteristic operation in its own right, helps auditors check the correctness of clients' records: a non-zero determinant points to a problem in the clients' recording process. Finally, the matrix could readily express concepts of merger and consolidation far beyond present-day practice.
Keywords: matrix method posting, deductive approach, determinant, accounting application
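The paper does not publish its exact matrix layout, so the sketch below is one plausible reading: rows are accounts, columns are transactions, debits are positive and credits negative. Two-sided balancing then becomes a column-sum check, and row sums give the trial balance (the determinant check in the abstract applies to a square account matrix and is not reproduced here).

```python
# A hedged sketch of posting journal entries as a matrix.
import numpy as np

accounts = ["Cash", "Inventory", "Capital"]
# Txn 1: owner invests 1000 cash; Txn 2: buy 400 inventory with cash.
postings = np.array([
    [ 1000, -400],   # Cash
    [    0,  400],   # Inventory
    [-1000,    0],   # Capital
])

# Double-entry check: every transaction column must sum to zero.
assert np.all(postings.sum(axis=0) == 0), "unbalanced transaction found"

balances = postings.sum(axis=1)      # row sums give the trial balance
for name, bal in zip(accounts, balances):
    print(f"{name:10s} {bal:8d}")
print("trial balance total:", balances.sum())   # zero when the books balance
```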
Procedia PDF Downloads 367
25186 Association of Social Data as a Tool to Support Government Decision Making
Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias
Abstract:
Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which properties are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor were evaluated for the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km² comprising 139 counties). This work aims to detect factors that are determinant for the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database, thus enabling better monitoring and updating of policies for this purpose.
Keywords: social data, government decision making, association of social data, data mining
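A minimal sketch of the kind of association mining described above: support and confidence of simple pairwise rules over categorical records. The records and attribute names are invented for illustration, not the Tocantins data.

```python
# Pairwise association rules with support/confidence thresholds.
from itertools import combinations

records = [
    {"low_income", "rural", "child_labor"},
    {"low_income", "urban", "child_labor"},
    {"low_income", "rural", "child_labor"},
    {"high_income", "urban"},
    {"low_income", "rural"},
]
n = len(records)

def support(itemset):
    return sum(itemset <= r for r in records) / n

# Rules of the form {a} -> {b}, confidence = support(a, b) / support(a).
for a, b in combinations({"low_income", "rural", "child_labor"}, 2):
    both = support({a, b})
    for x, y in ((a, b), (b, a)):
        conf = both / support({x})
        if both >= 0.4 and conf >= 0.6:
            print(f"{{{x}}} -> {{{y}}}  support={both:.2f} confidence={conf:.2f}")
```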
Procedia PDF Downloads 371
25185 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of a discrete event system's non-linearity and non-Gaussianity, the traditional Kalman filter, based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become the technical approach for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results show that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
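A minimal bootstrap particle filter sketch showing the predict-weight-resample cycle on which such methods rest. A scalar hidden state observed with Gaussian noise is assumed here purely for illustration; the paper's maintenance-system model is more elaborate.

```python
# Bootstrap particle filter: predict, weight by likelihood, resample.
import numpy as np

rng = np.random.default_rng(0)
N = 1000                                  # number of particles
particles = rng.normal(0.0, 1.0, N)      # initial state belief
obs_std, proc_std = 0.5, 0.2

def step(particles, observation):
    # 1) Predict: propagate each particle through the (assumed) model.
    particles = particles + 0.1 + rng.normal(0, proc_std, N)
    # 2) Update: weight particles by the observation likelihood.
    w = np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    w /= w.sum()
    # 3) Resample: draw particles in proportion to their weights.
    idx = rng.choice(N, size=N, p=w)
    return particles[idx]

for z in [0.2, 0.4, 0.5, 0.8, 0.9]:       # incoming observations
    particles = step(particles, z)
    print(f"obs={z:.1f}  estimated state={particles.mean():.3f}")
```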
Procedia PDF Downloads 20
25184 Genetic Diversity and Discovery of Unique SNPs in Five Country Cultivars of Sesamum indicum by Next-Generation Sequencing
Authors: Nam-Kuk Kim, Jin Kim, Soomin Park, Changhee Lee, Mijin Chu, Seong-Hun Lee
Abstract:
In this study, we conducted whole-genome re-sequencing of 10 cultivars originating from five countries, namely Korea, China, India, Pakistan, and Ethiopia, with the Sesamum indicum (Zhongzho No. 13) genome as a reference. Almost 80% of the reference genome sequence was covered by the sequenced reads. Numerous SNPs and InDels were detected by bioinformatic analysis. Among these variants, 266,051 SNPs were identified as unique to individual countries. Pakistan and Ethiopia had high densities of SNPs compared to the other countries. Three main clusters (cluster 1: Korea; cluster 2: Pakistan and India; cluster 3: Ethiopia and China) were recovered by neighbor-joining analysis using all variants. Interestingly, some variants were detected in the DGAT1 (diacylglycerol O-acyltransferase 1) and FADS (fatty acid desaturase) genes, which are known to be related to fatty acid synthesis and metabolism. These results can provide useful information for understanding regional characteristics and for developing DNA markers for origin discrimination of sesame.
Keywords: Sesamum indicum, NGS, SNP, DNA marker
Procedia PDF Downloads 328
25183 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform
Authors: Sadam Alwadi
Abstract:
Outlier values are a problem that frequently occurs in the data observation or recording process; thus, data imputation has become an essential matter. This work uses the methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximal Overlap Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
Keywords: outlier values, imputation, stock market data, detecting, estimation
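A minimal sketch of the Tukey (boxplot fence) rule on closing prices, with one simple imputation; the fence multiplier 1.5 is the conventional value, the prices are invented rather than ASE data, and the MODWT-based variant is beyond this sketch.

```python
# Tukey fences: flag values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
import numpy as np

prices = np.array([10.2, 10.4, 10.3, 10.5, 10.4, 14.9, 10.6, 10.3, 6.1, 10.5])

q1, q3 = np.percentile(prices, [25, 75])
iqr = q3 - q1
lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = (prices < lo) | (prices > hi)

print("fences:", lo, hi)
print("outlier indices:", np.flatnonzero(outliers))

# Simple imputation: replace outliers with the median of clean values.
clean = prices.copy()
clean[outliers] = np.median(prices[~outliers])
```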
Procedia PDF Downloads 82
25182 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage
Authors: P. Jayashree, S. Rajkumar
Abstract:
With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios than lossless methods, many applications require data accuracy and miniature details to be preserved. A variety of data compression algorithms exists in the literature for different forms of data, such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is put forward as an enhancement over the irrational-number-storage coding technique, a cost-effective solution to the storage issues of increasing data volumes that also offers a degree of data security as a secondary outcome. The proposed work demonstrates cost-effectiveness in terms of a better compression ratio with no deterioration in compression time.
Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding
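The abstract does not disclose PEINS's internals, so the sketch below only illustrates the principle behind any probabilistic encoder: the more skewed the symbol probabilities, the fewer bits an entropy coder needs, which is what drives the achievable compression ratio.

```python
# Shannon entropy as the lower bound for probabilistic (entropy) coding.
import math
from collections import Counter

data = b"aaaaabbbc" * 100
counts = Counter(data)
n = len(data)

# Entropy in bits/symbol: the minimum any probabilistic coder can reach.
entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
coded_bits = entropy * n

print(f"entropy          : {entropy:.3f} bits/symbol")
print(f"raw size         : {n * 8} bits")
print(f"ideal coded size : {coded_bits:.0f} bits")
print(f"compression ratio: {n * 8 / coded_bits:.2f}x")
```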
Procedia PDF Downloads 296
25181 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework
Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe
Abstract:
This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway, and presents the design of a framework for the data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, we tend to have stronger privacy for our data. The fog storage gateway has several advantages over traditional cloud storage: our results show that fog has reduced latency/delay and lower bandwidth and energy usage when compared with cloud storage; therefore, fog storage will help to lessen excessive cost. The paper dwells on the system descriptions, with a focus on the research design and framework design for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications. Lastly, the overall system architecture is shown, together with its structure and interrelationships.
Keywords: IoT, fog, cloud, data analysis, data privacy
Procedia PDF Downloads 100
25180 Indigenous Patch Clamp Technique: Design of Highly Sensitive Amplifier Circuit for Measuring and Monitoring of Real Time Ultra Low Ionic Current through Cellular Gates
Authors: Moez ul Hassan, Bushra Noman, Sarmad Hameed, Shahab Mehmood, Asma Bashir
Abstract:
The importance of the Nobel Prize-winning patch clamp technique is well documented. However, the technique is very expensive, which hinders research in developing countries. In this paper, the detection, processing, and recording of ultra-low currents from induced cells using a transimpedance amplifier is described. The sensitivity of the proposed amplifier is in the range of femtoamperes (fA). Capacitive feedback is used with an active load to obtain a 20 MΩ transimpedance gain. The challenges in the design include achieving adequate performance in gain, noise immunity, and stability. The circuit designed by the authors was able to measure currents in the range of 300 fA to 100 pA. The amplifier showed adequate performance for different input currents, and the outcomes were found to be within the acceptable error range. Results were recorded using LabVIEW 8.5® for further research.
Keywords: drug discovery, ionic current, operational amplifier, patch clamp
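The scale of the measurement problem follows from the ideal transimpedance relation. A minimal worked example using the figures quoted above, ignoring noise and bandwidth effects:

```latex
% Output voltage of an ideal transimpedance amplifier with feedback
% impedance R_f, applied to the quoted 300 fA input and 20 MOhm gain.
\[
  V_{out} = -I_{in} R_f
\]
\[
  |V_{out}| = 300\,\mathrm{fA} \times 20\,\mathrm{M\Omega}
            = 3\times10^{-13}\,\mathrm{A} \times 2\times10^{7}\,\Omega
            = 6\,\mathrm{\mu V}
\]
```

At the lower end of the quoted range, the output is only about 6 µV, which is why noise immunity dominates the design.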
Procedia PDF Downloads 519
25179 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data
Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif
Abstract:
Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is presented here together with field data. A wide range of field data, consisting of both live-bed and clear-water scour, was used. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method. The best-performing scour equations were then analyzed statistically. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared against the large body of laboratory and field data.
Keywords: field data, local scour, scour equation, wide piers
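A small sketch of the two comparison statistics named above, assuming their common definitions: discrepancy ratio as computed over observed scour depth (ideal value 1) and RMSE on the depth residuals. The depths below are invented, not the study's data.

```python
# Discrepancy ratio and RMSE for one scour equation's predictions.
import numpy as np

observed = np.array([1.8, 2.4, 3.1, 2.0, 2.7])   # field scour depths (m)
computed = np.array([2.1, 2.2, 3.6, 1.7, 2.9])   # equation predictions (m)

discrepancy = computed / observed                 # ideal value: 1.0
rmse = np.sqrt(np.mean((computed - observed) ** 2))

print("mean discrepancy ratio:", discrepancy.mean().round(3))
print("RMSE (m):", rmse.round(3))
```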
Procedia PDF Downloads 415
25178 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol
Authors: Inkyu Kim, SangMan Moon
Abstract:
The IEEE 802.11b protocol provides a data rate of up to 11 Mbps, whereas the aerospace industry seeks higher-data-rate COTS data link systems for UAVs. The total maximum throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in a formation flight is bounded by the data link protocol's performance limitations.
Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application
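A hedged back-of-envelope TMT estimate in the spirit of the standard frame-timing analysis (DIFS plus average backoff, data frame, SIFS, ACK); the constants are the usual long-preamble 11 Mbps values, not figures from this paper, and contention between stations is ignored.

```python
# Theoretical maximum throughput of a single 802.11b frame exchange.
PAYLOAD = 1500 * 8          # MSDU bits
MAC_HDR = 28 * 8            # MAC header + FCS bits
PREAMBLE = 192e-6           # long PHY preamble + header (s)
SIFS, DIFS, SLOT = 10e-6, 50e-6, 20e-6
CW_MIN = 31
ACK = 14 * 8                # ACK frame bits, sent at 2 Mbps

t = (DIFS
     + (CW_MIN / 2) * SLOT                 # mean backoff, no contention
     + PREAMBLE + (PAYLOAD + MAC_HDR) / 11e6
     + SIFS
     + PREAMBLE + ACK / 2e6)

print(f"frame exchange time: {t * 1e6:.0f} us")
print(f"TMT ~ {PAYLOAD / t / 1e6:.2f} Mbps")   # well below the 11 Mbps PHY rate
```

Even without contention, per-frame overhead caps the useful rate at roughly 6 Mbps, which is why adding UAVs to the shared channel quickly runs into the protocol's limits.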
Procedia PDF Downloads 393
25177 Tenofovir-Amino Acid Conjugates Act as Polymerase Substrates: Implications for Avoiding Cellular Phosphorylation in the Discovery of Nucleotide Analogs
Authors: Weijie Gu, Sergio Martinez, Hoai Nguyen, Hongtao Xu, Piet Herdewijn, Steven De Jonghe, Kalyan Das
Abstract:
Nucleotide analogs are used for treating viral infections such as HIV, hepatitis B, hepatitis C, influenza, and SARS-CoV-2. To become polymerase substrates, a nucleotide analog must be phosphorylated by cellular kinases, which is rate-limiting. The goal of this study is to develop dNTP/NTP analogs directly from nucleotides. Tenofovir (TFV) analogs were synthesized by conjugation with natural or unnatural amino acids. The study demonstrates that some conjugates act as dNTP analogs and that HIV-1 reverse transcriptase (RT) catalytically incorporates the TFV part as the chain terminator. X-ray structures in complex with HIV-1 RT/dsDNA showed binding of the conjugates at the polymerase active site, however, in different modes in the presence of Mg²⁺ vs. Mn²⁺ ions. The adaptability of the compounds is seemingly essential for the catalytic incorporation of TFV by RT. Compound 4d, with a carboxyl side chain, demonstrated the highest incorporation. Compound 4e showed weak incorporation and behaved instead as a dNTP-competitive inhibitor. These results support the feasibility of designing NTP/dNTP analogs through chemical substitutions on nucleotide analogs.
Keywords: dNTP analogs, nucleotide analogs, polymerase, tenofovir, X-ray structure
Procedia PDF Downloads 153
25176 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning comprises a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns, which are used to classify individuals into groups. The result of the analysis is a pattern that can be used to identify a data set without needing the input data used to create that pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. Where pedigree information is missing, other methods can be used to trace an animal's origin. The genetic diversity recorded in genetic data holds relatively useful information for identifying animals that originated in individual countries. We conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning
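A minimal supervised-learning sketch of this population-assignment task; the random genotype matrix (0/1/2 allele counts) and labels are stand-ins for real SNP data, not the study's material.

```python
# Train/test a classifier to assign animals to populations from SNPs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(200, 50))        # 200 animals x 50 SNPs
y = rng.integers(0, 2, size=200)              # two populations

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)

# Random stand-in data will score near chance; real SNP data carries signal.
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```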
Procedia PDF Downloads 552
25175 Router 1X3 - RTL Design and Verification
Authors: Nidhi Gopal
Abstract:
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device and its top-level architecture, and shows how the various router sub-modules, i.e., the register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.
Keywords: data packets, networking, router, routing
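The RTL itself lives in an HDL; as a language-neutral companion, here is a tiny behavioral model of the 1x3 routing idea: a header address steers each packet into one of three output FIFOs. The header format (low two bits as address) is an assumption for illustration, not the paper's packet layout.

```python
# Behavioral model of a 1x3 router: one input, three output FIFOs.
from collections import deque

class Router1x3:
    def __init__(self):
        self.fifos = [deque(), deque(), deque()]   # one FIFO per output port

    def push(self, packet: bytes):
        port = packet[0] & 0x03        # assumed: low 2 header bits = address
        if port == 3:
            raise ValueError("invalid destination address")
        self.fifos[port].append(packet)

    def pop(self, port: int) -> bytes:
        return self.fifos[port].popleft()

r = Router1x3()
r.push(bytes([0x01, 0xAA, 0xBB]))      # header 0x01 -> output port 1
r.push(bytes([0x00, 0xCC]))            # header 0x00 -> output port 0
print(r.pop(1).hex(), r.pop(0).hex())
```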
Procedia PDF Downloads 815
25174 Transcriptome Analysis for Insights into Disease Progression in Dengue Patients
Authors: Abhaydeep Pandey, Shweta Shukla, Saptamita Goswami, Bhaswati Bandyopadhyay, Vishnampettai Ramachandran, Sudhanshu Vrati, Arup Banerjee
Abstract:
Dengue virus infection is now considered one of the most important mosquito-borne infections in humans. The virus is known to promote vascular permeability and cerebral edema, leading to dengue hemorrhagic fever (DHF) or dengue shock syndrome (DSS). Dengue infection has been known to be endemic in India for over two centuries as a benign and self-limited disease. In the last couple of years, the disease symptoms have changed, manifesting severe secondary complications. Delhi has experienced 12 outbreaks of dengue virus infection since 1997, the last reported in 2014-15. Without specific antivirals, the case management of high-risk dengue patients relies entirely on supportive care, involving constant monitoring and timely fluid support to prevent hypovolemic shock. Nonetheless, the diverse clinical spectrum of dengue disease, as well as its initial similarity to other viral febrile illnesses, presents a challenge in the early identification of this high-risk group. WHO recommends the use of warning signs to identify high-risk patients, but warning signs generally appear during, or just one day before, the development of severe illness, thus providing only a narrow window for clinical intervention. The ability to predict which patients may develop DHF or DSS may improve triage and treatment. The recent advent of high-throughput RNA sequencing allows us to understand disease progression at the genomic level. Here, we collate the results of RNA sequencing data obtained recently from the PBMCs of different categories of dengue patients from India and discuss the possible roles of deregulated genes and the long non-coding RNA NEAT1 in disease progression.
Keywords: long non-coding RNA (lncRNA), dengue, peripheral blood mononuclear cell (PBMC), nuclear enriched abundant transcript 1 (NEAT1), dengue hemorrhagic fever (DHF), dengue shock syndrome (DSS)
Procedia PDF Downloads 309
25173 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page and propose noise web data reduction tools that mainly focus on eliminating noise in relation to the content and layout of web data. This paper argues that not all data that form part of the main web page are of interest to a user, and that not all noise data are actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile but also a decrease in the loss of useful information, hence improving the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
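A toy sketch of the central claim that noise is relative to a dynamic user interest: the same page block is kept or dropped depending on the current interest profile. The scoring function, threshold, and block/term names are assumptions, not the NWDL algorithm itself.

```python
# Interest-relative filtering: the same block can be content or noise.
blocks = {
    "main_article":  {"football", "league", "score"},
    "sidebar_ads":   {"discount", "shoes"},
    "related_links": {"transfer", "football"},
}

def filter_blocks(blocks, interests, threshold=1):
    kept = {}
    for name, terms in blocks.items():
        overlap = len(terms & interests)   # interest-relative relevance
        if overlap >= threshold:
            kept[name] = overlap
    return kept

print(filter_blocks(blocks, {"football", "transfer"}))   # keeps article + links
print(filter_blocks(blocks, {"shoes", "discount"}))      # "ads" are not noise now
```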
Procedia PDF Downloads 265