Search results for: Truth Table S-Boxes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 782

722 Smartphones as a Tool of Mobile Journalism in Saudi Arabia

Authors: Ahmed Deen

Abstract:

The introduction of mobile devices equipped with internet access, a camera, and messaging services has become a major driver of the use of mobile devices in news reporting. Mobile journalism (MOJO) is a creation of modern technology, in particular the use of mobile technology for video journalism. MOJO is thus the process by which information is collected and disseminated to society through mobile technology, including tablets. This paper seeks to better understand the ethics of Saudi mobile journalists in news coverage. The study also aims to explore the relationship between minimizing harm and truth-seeking efforts among Saudi mobile journalists. Three main ethical principles were targeted in this study: seek truth and report it, minimize harm, and be accountable. Diffusion of innovation theory was applied to reach the study’s goals. The non-probability ‘snowball sampling’ approach was used to reach 124 survey participants through an online survey on SurveyMonkey distributed as a web link on social media platforms. The code of ethics of the Society of Professional Journalists was applied as a scale in this study. The study found that the relationship between minimizing harm and truth-seeking efforts among Saudi mobile journalists is significant and moderate. It also found that the level of journalistic experience and the use of smartphones to cover news are weakly and negatively related to perceptions of mobile journalism among Saudi journalists, while journalists who had used their smartphones to cover news for 1-3 years made up the majority of participants (55 participants, 51.4%).

Keywords: mobile journalism, Saudi journalism, smartphone, Saudi Arabia

Procedia PDF Downloads 138
721 Coronavirus Academic Paper Sorting Application

Authors: Christina A. van Hal, Xiaoqian Jiang, Luyao Chen, Yan Chu, Robert D. Jolly, Yaobin Lin, Jitian Zhao, Kang Lin Hsieh

Abstract:

The COVID-19 Literature Summary App was created primarily to enable academicians and clinicians to quickly sort through the vast array of recent coronavirus publications by topics of interest. Multiple methods of summarizing and sorting the manuscripts were created. A summary page introduces the application’s functions and capabilities, while an interactive map provides daily updates on infection, death, and recovery rates. A page with a pivot table allows publication sorting by topic, with an interactive data table that supports sorting topics by column as well as the capability to view abstracts. Additionally, publications may be sorted by the medical topics they cover. We used the CORD-19 database to compile lists of publications. The data table can sort binary variables, allowing the user to pick desired publication topics, such as papers that describe COVID-19 symptoms. The application is primarily designed for use by researchers but can be used by anybody who wants a faster and more efficient means of locating papers of interest.

Keywords: COVID-19, literature summary, information retrieval, Snorkel

Procedia PDF Downloads 122
720 DNA as an Instrument in Constructing Narratives and Justice in Criminal Investigations: A Socio-Epistemological Exploration

Authors: Aadita Chaudhury

Abstract:

Since at least the early 2000s, DNA profiling has achieved a preeminent status in forensic investigations of criminal acts. While the criminal justice system has a long history of using forensic evidence and testing it through established technoscientific means, the primacy of DNA in establishing 'truth' or reconstructing a series of events is unparalleled in the history of forensic science. This paper seeks to elucidate the ways in which DNA profiling has become the most authoritative instrument of 'truth' in criminal investigations, and how it is used in the legal process to ascertain culpability, create the notion of infallible evidence, and advance the search for justice. It is argued that DNA profiling has created a paradigm shift in how the legal system and the general public understand crime and culpability, but not without limitations. There are indications that even trace amounts of DNA evidence can point to causal links in a criminal investigation; however, there still remains much room for confusion and doubt to arise from empirical evidence within the narrative of a crime. Many of the shortcomings of DNA-based forensic investigations are explored and evaluated with regard to claims about the authority of biological evidence and the implications for the public understanding of the elusive concepts of truth and justice in the present era. Public misinformation about forensic analysis processes can produce either doubt about or faith in the judgements rooted in them, depending on the other variables presented at trial. The positivist understanding of forensic science shared by the majority of the population does not take into account that DNA evidence is far from definitive and can be used to support competing theories of culpability, to create doubt, and to deflect blame.

Keywords: DNA profiling, epistemology of forensic science, philosophy of forensic science, sociology of scientific knowledge

Procedia PDF Downloads 176
719 The Influence of Strengthening on the Fundamental Frequency and Stiffness of a Confined Masonry Wall with an Opening for a Window

Authors: Emin Z. Mahmud

Abstract:

Shaking table tests were planned in order to deepen the understanding of the behavior of confined masonry structures with and without openings. The tests were carried out in the laboratory of the Institute of Earthquake Engineering and Engineering Seismology (IZIIS) – Skopje. The specimens were examined separately on the shaking table, with uniaxial, in-plane excitation. After testing, the samples were strengthened with GFRP (Glass Fiber Reinforced Plastic) and re-tested. This paper presents observations from a series of shaking-table tests on a 1:1 scale confined masonry wall model with an opening for a window – specimens CMWuS (before strengthening) and CMWS (after strengthening). Frequency and stiffness changes before and after GFRP strengthening of the wall are analyzed. Defining the dynamic properties of the models was the first step of the experimental testing, which provided important information about the achieved stiffness (natural frequencies) of the model. The natural frequency was determined in the Y direction of the model by applying resonant frequency search tests. It is important to mention that both specimens, CMWuS and CMWS, were subjected to the same effects. The initial frequency of the undamaged model CMWuS is 18.79 Hz, while at the end of testing the frequency had decreased to 12.96 Hz. This reflects the reduction of the initial stiffness of the model due to damage, especially in the masonry and the tie-beam to tie-column connection. After strengthening of the damaged wall, the natural frequency increased to 14.67 Hz, highlighting the beneficial effect of strengthening. After completion of the dynamic testing of CMWS, the natural frequency had reduced to 10.75 Hz.
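
If the wall is idealized as a single-degree-of-freedom oscillator with essentially constant mass (an assumption not stated in the abstract), the reported frequencies translate directly into relative stiffness, since stiffness scales with the square of the natural frequency:

\[ \frac{k_{damaged}}{k_{initial}} \approx \Big(\frac{12.96}{18.79}\Big)^2 \approx 0.48, \qquad \frac{k_{strengthened}}{k_{initial}} \approx \Big(\frac{14.67}{18.79}\Big)^2 \approx 0.61, \qquad \frac{k_{final}}{k_{initial}} \approx \Big(\frac{10.75}{18.79}\Big)^2 \approx 0.33. \]

On this reading, the GFRP strengthening recovered roughly a quarter of the stiffness lost to the initial damage.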

Keywords: behaviour of masonry structures, Eurocode, frequency, masonry, shaking table test, strengthening

Procedia PDF Downloads 93
718 A Study of Life Expectancy in an Urban Set up of North-Eastern India under Dynamic Consideration Incorporating Cause Specific Mortality

Authors: Mompi Sharma, Labananda Choudhury, Anjana M. Saikia

Abstract:

Background: The period life table is based entirely on the assumption that the mortality pattern of the population in a given period will persist throughout the lives of its members. However, mortality rates continue to decline. If the rates of change of the probabilities of death are incorporated into a life table, we obtain a dynamic life table. Although mortality has been declining in all parts of India, one may be interested in whether these declines have appeared more strongly in an urban area of an underdeveloped region such as North-Eastern India. An attempt has therefore been made to study the mortality pattern and the life expectancy under a dynamic scenario in Guwahati, the biggest city of North-Eastern India. Further, if the probabilities of death change, there is a possibility that their different constituent probabilities will also change. Since cardiovascular disease (CVD) is the leading cause of death in Guwahati, an attempt has also been made to formulate dynamic cause-specific death ratios and probabilities of death due to CVD. Objectives: To construct a dynamic life table for Guwahati for the year 2011 based on the rates of change of the probabilities of death over the previous 10 and 25 years (i.e., since 2001 and 1986), and to compute the corresponding dynamic cause-specific death ratio and probabilities of death due to CVD. Methodology and Data: The study uses the method proposed by Denton and Spencer (2011) to construct the dynamic life table for Guwahati. Data were taken from the Office of the Birth and Death, Guwahati Municipal Corporation, for the years 1986, 2001 and 2011. Population data were taken from the 2001 and 2011 censuses (India), while the population for 1986 was estimated. The cause-of-death ratio and the probabilities of death due to CVD were also computed for the aforementioned years and then extended to the dynamic setting for 2011 by considering the rates of change of those probabilities over the previous 10 and 25 years. Findings: The dynamic life expectancy at birth (LEB) for Guwahati is found to be higher than the corresponding value in the period table by 3.28 (5.65) years for males and 8.30 (6.37) years for females over the 10-year (25-year) horizon. Life expectancies under the dynamic framework at all other ages are also higher than the usual life expectancies, which may be attributed to the gradual decline in the probabilities of death between 1986 and 2011. Further, a continuous decline has also been observed in the death ratio due to CVD, along with the cause-specific probabilities of death, for both sexes. As a consequence, the dynamic cause-of-death probability due to CVD is found to be lower than under the usual procedure. Conclusion: Since incorporating changing mortality rates into the period life table for Guwahati resulted in higher life expectancies and lower probabilities of death due to CVD, this would more faithfully reflect the real mortality conditions prevailing in the city.
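
As a rough illustration of the dynamic idea (not the exact Denton–Spencer (2011) formulation used in the study), the sketch below computes a period life expectancy from a vector of one-year death probabilities, and a dynamic variant in which each cohort keeps benefiting from an annual rate of mortality decline; the probabilities and decline rates are hypothetical, not the Guwahati data.

```python
import numpy as np

# Hypothetical one-year death probabilities q_x for ages 0..100 and an
# assumed average annual rate of decline r_x at each age (both illustrative).
ages = np.arange(0, 101)
qx = 0.0005 + 0.00005 * np.exp(0.1 * ages)   # toy Gompertz-like schedule
rx = np.full_like(qx, 0.01)                  # 1% decline per calendar year

def life_expectancy(qx):
    """Period life expectancy at birth from a vector of q_x (closed at the last age)."""
    px = 1.0 - np.clip(qx, 0.0, 1.0)
    lx = np.concatenate(([1.0], np.cumprod(px)))   # survivors at each exact age
    Lx = 0.5 * (lx[:-1] + lx[1:])                  # person-years lived in each year of age
    return Lx.sum()

def dynamic_life_expectancy(qx, rx):
    """Dynamic variant: by the time the cohort reaches age x, q_x has kept
    declining for x further calendar years at rate r_x."""
    qx_dyn = qx * (1.0 - rx) ** ages
    return life_expectancy(qx_dyn)

print(f"period  e0 = {life_expectancy(qx):.2f} years")
print(f"dynamic e0 = {dynamic_life_expectancy(qx, rx):.2f} years")
```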

Keywords: cause specific death ratio, cause specific probabilities of death, dynamic, life expectancy

Procedia PDF Downloads 213
717 The Analysis of Deceptive and Truthful Speech: A Computational Linguistic Based Method

Authors: Seham El Kareh, Miramar Etman

Abstract:

Recently, detecting liars and extracting the features which distinguish them from truth-tellers have been the focus of a wide range of disciplines. To the authors’ best knowledge, most of the work has been done on facial expressions and body gestures, and only a few studies have addressed the language used by liars and truth-tellers. This paper sheds light on four axes. The first axis concerns building an audio corpus of deceptive and truthful speech from Egyptian Arabic speakers. The second axis focuses on examining human perception of lies and demonstrating the need for computational linguistic methods to extract the features that characterize truthful and deceptive speech. The third axis is concerned with building a linguistic analysis program that can extract from the corpus the inter- and intra-linguistic cues of deceptive and truthful speech. The program built here is based on selected categories from the Linguistic Inquiry and Word Count (LIWC) program. Our results demonstrate that, when lying, Egyptian Arabic speakers preferred first-person pronouns and the present tense over the past tense, and their lies lacked second-person pronouns; when telling the truth, they preferred verbs related to motion and nouns related to time. The results also show that a larger dataset is needed to establish the significance of words related to emotions and numbers.
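
A minimal sketch of the category-counting idea behind such a LIWC-style analysis is given below; the category word lists are hypothetical English placeholders, not the Egyptian Arabic categories used in the study.

```python
from collections import Counter
import re

# Hypothetical category lexicon (placeholders; the study used Egyptian Arabic
# categories selected from LIWC, not these English words).
CATEGORIES = {
    "first_person": {"i", "me", "my", "we", "our"},
    "second_person": {"you", "your"},
    "motion_verbs": {"go", "went", "move", "run", "walk"},
    "time_nouns": {"day", "hour", "morning", "yesterday", "week"},
}

def category_profile(text):
    """Count hits per linguistic category, normalised per 100 tokens."""
    tokens = re.findall(r"\w+", text.lower())
    counts = Counter()
    for token in tokens:
        for name, words in CATEGORIES.items():
            if token in words:
                counts[name] += 1
    total = max(len(tokens), 1)
    return {name: 100.0 * counts[name] / total for name in CATEGORIES}

print(category_profile("I went to the market yesterday and we walked for an hour"))
```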

Keywords: Egyptian Arabic corpus, computational analysis, deceptive features, forensic linguistics, human perception, truthful features

Procedia PDF Downloads 180
716 Wavelets Contribution on Textual Data Analysis

Authors: Habiba Ben Abdessalem

Abstract:

The emergence of giant sets of textual data has encouraged researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to this type of data by providing various graphical visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table, keeping only those that carry information. The step may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
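
A minimal sketch of wavelet denoising applied to a contingency table, using PyWavelets, is given below; the wavelet family, decomposition level, and threshold are illustrative choices, not those of the paper.

```python
import numpy as np
import pywt

def denoise_contingency_table(table, wavelet="haar", level=2, threshold=1.0):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition
    of a (forms x documents) contingency table and reconstruct it."""
    coeffs = pywt.wavedec2(table, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    shrunk = [
        tuple(pywt.threshold(band, threshold, mode="soft") for band in level_bands)
        for level_bands in details
    ]
    denoised = pywt.waverec2([approx] + shrunk, wavelet)
    # waverec2 may pad by one row/column for odd sizes; crop back to the original shape.
    return denoised[: table.shape[0], : table.shape[1]]

rng = np.random.default_rng(0)
table = rng.poisson(3, size=(40, 12)).astype(float)   # toy forms-by-period counts
print(denoise_contingency_table(table).round(1))
```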

Keywords: textual data, wavelet, denoising, contingency table

Procedia PDF Downloads 255
715 The Feasibility of a Protected Launch Site near Melkbosstrand for a Public Transport Ferry across Table Bay, Cape Town

Authors: Mardi Falck, André Theron

Abstract:

Traffic congestion on the northern side of Table Bay is a major problem. In Gauteng, the implementation of the Gautrain between Pretoria and Johannesburg solved their traffic congestion problem. In 2002, two entrepreneurs endeavoured to implement a hovercraft ferry service across the bay from Table View to the Port of Cape Town. However, the EIA process showed that disgruntled residents of the area did not agree with their chosen location for a launch site. Seventeen years later, the traffic problem has not gone away; instead, the congestion has increased. As property prices in the City Bowl of Cape Town keep rising, people tend to live on the outskirts of the CBD and commute to work. This means more vehicles on the road every day, and public transport services cannot keep up with demand. For this reason, the study area of the earlier hovercraft plans is being extended further north. The study’s aim is thus to determine the feasibility of a launch site north of Bloubergstrand for launching and receiving a public transport ferry across Table Bay. Feasibility is established by researching ferry services across the world and what makes them successful. Different types of ferries and their operational capacities in terms of weather and waves are reviewed, and by establishing the offshore and nearshore wind and wave climate of the area, an appropriate protected launch site is determined. It was concluded that travel time could potentially be halved. A hovercraft proved to be the most feasible ferry type because it does not require a conventional harbour; other types of vessels require a protected launch site because of the wave climate, which means large breakwaters that increase the cost substantially. The Melkbos Cultural Centre proved to be the most viable location for the launch site because it already has buildings and infrastructure. It is recommended that, if a harbour is chosen for the proposed ferry service, it could also be used for other activities such as fishing, eco-tourism and leisure. Further studies are recommended to optimise the feasibility of such a harbour.

Keywords: Cape Town, ferry, public, Table Bay

Procedia PDF Downloads 126
714 Economics of Precision Mechanization in Wine and Table Grape Production

Authors: Dean A. McCorkle, Ed W. Hellman, Rebekka M. Dudensing, Dan D. Hanselka

Abstract:

The motivation for this study centers on the labor- and cost-intensive nature of wine and table grape production in the U.S., and the potential opportunities for precision mechanization using robotics to augment those production tasks that are labor-intensive. The objectives of this study are to evaluate the economic viability of grape production in five U.S. states under current operating conditions, identify common production challenges and tasks that could be augmented with new technology, and quantify a maximum price for new technology that growers would be able to pay. Wine and table grape production is primed for precision mechanization technology as it faces a variety of production and labor issues. Methodology: Using a grower panel process, this project includes the development of a representative wine grape vineyard in five states and a representative table grape vineyard in California. The panels provided production, budget, and financial-related information that are typical for vineyards in their area. Labor costs for various production tasks are of particular interest. Using the data from the representative budget, 10-year projected financial statements have been developed for the representative vineyard and evaluated using a stochastic simulation model approach. Labor costs for selected vineyard production tasks were evaluated for the potential of new precision mechanization technology being developed. These tasks were selected based on a variety of factors, including input from the panel members, and the extent to which the development of new technology was deemed to be feasible. The net present value (NPV) of the labor cost over seven years for each production task was derived. This allowed for the calculation of a maximum price for new technology whereby the NPV of labor costs would equal the NPV of purchasing, owning, and operating new technology. Expected Results: The results from the stochastic model will show the projected financial health of each representative vineyard over the 2015-2024 timeframe. Investigators have developed a preliminary list of production tasks that have the potential for precision mechanization. For each task, the labor requirements, labor costs, and the maximum price for new technology will be presented and discussed. Together, these results will allow technology developers to focus and prioritize their research and development efforts for wine and table grape vineyards, and suggest opportunities to strengthen vineyard profitability and long-term viability using precision mechanization.
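
A minimal sketch of the break-even logic described above (the maximum technology price at which the NPV of avoided labor equals the NPV of buying and operating the machine) is given below; all dollar figures and the discount rate are hypothetical.

```python
def npv(cash_flows, rate):
    """Net present value of a list of end-of-year cash flows."""
    return sum(cf / (1.0 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

# Hypothetical figures for one labor-intensive production task:
years = 7
annual_labor_cost = 1200.0       # $/acre/year spent on hand labor
annual_operating_cost = 150.0    # $/acre/year to run and maintain the machine
discount_rate = 0.06

labor_npv = npv([annual_labor_cost] * years, discount_rate)
operating_npv = npv([annual_operating_cost] * years, discount_rate)

# Maximum up-front price a grower could pay and still break even over 7 years.
max_technology_price = labor_npv - operating_npv
print(f"NPV of labor avoided: ${labor_npv:,.0f}/acre")
print(f"Maximum break-even technology price: ${max_technology_price:,.0f}/acre")
```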

Keywords: net present value, robotic technology, stochastic simulation, wine and table grapes

Procedia PDF Downloads 233
713 A Strategy of Direct Power Control for PWM Rectifier Reducing Ripple in Instantaneous Power

Authors: T. Mohammed Chikouche, K. Hartani

Abstract:

In order to reduce the instantaneous power ripple and achieve better performance of direct power control (DPC) for a three-phase PWM rectifier, a control method is proposed in this paper. The method is applied to overcome the instantaneous power ripple, to eliminate line current harmonics, and therefore to reduce the total harmonic distortion and improve the power factor. A switching table, based on an analysis of the changes in instantaneous active and reactive power, is used to select the optimum switching state of the three-phase PWM rectifier. Simulation results show the feasibility of this control method.
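
The general structure of such a table-based DPC loop is sketched below: hysteresis comparators on the active and reactive power errors, together with the grid-voltage sector, index a lookup table of converter switching states. The table contents here are a generic illustration, not the optimized table proposed in the paper.

```python
import math

# Converter switching states (Sa, Sb, Sc) for voltage vectors V0..V7.
VECTORS = {
    0: (0, 0, 0), 1: (1, 0, 0), 2: (1, 1, 0), 3: (0, 1, 0),
    4: (0, 1, 1), 5: (0, 0, 1), 6: (1, 0, 1), 7: (1, 1, 1),
}

# Illustrative switching table indexed by (dp, dq): dp/dq are the hysteresis
# comparator outputs for active/reactive power error (1 = increase), and each
# list holds one vector index per 30-degree voltage sector. This is a generic
# DPC-style table, NOT the optimized table of the paper.
TABLE = {
    (1, 0): [6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5],
    (1, 1): [1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6],
    (0, 0): [5, 6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4],
    (0, 1): [3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 1, 2],
}

def select_switching_state(p_err, q_err, grid_voltage_angle, hysteresis=0.0):
    """Pick the converter switching state from power errors and voltage sector."""
    dp = 1 if p_err > hysteresis else 0
    dq = 1 if q_err > hysteresis else 0
    sector = int((grid_voltage_angle % (2 * math.pi)) // (math.pi / 6))  # 12 sectors
    return VECTORS[TABLE[(dp, dq)][sector]]

print(select_switching_state(p_err=50.0, q_err=-10.0, grid_voltage_angle=0.4))
```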

Keywords: power quality, direct power control, power ripple, switching table, unity power factor

Procedia PDF Downloads 289
712 Experimental Analysis of Tuned Liquid Damper (TLD) with Embossments Subject to Random Excitation

Authors: Mohamad Saberi, Arash Sohrabi

Abstract:

The tuned liquid damper is one of the passive structural control methods that has been used since the mid-1980s for seismic control in civil engineering. The system consists of one or more tanks filled with fluid, usually water, installed on top of a high-rise structure to suppress structural vibration. In this article, we show how a shaking-table setup containing a TLD system was built and analyse the results of using this system on our structure. The results imply that when the frequency ratio approaches 1, the system performs at its best in both dissipating energy and increasing structural damping. The results of this series of experiments also prove compatible with Hunzer’s linear theory of behaviour.

Keywords: TLD, seismic table, structural system, Hunzer linear behaviour

Procedia PDF Downloads 351
711 Evaluation of UI for 3D Visualization-Based Building Information Applications

Authors: Monisha Pattanaik

Abstract:

In scenarios where users have to work with large hierarchical data structures combined with visualizations (for example, construction 3D models, manufacturing equipment models, Gantt charts, building plans), the data structures are dense, with multiple parent nodes up to 50 levels deep together with their siblings and descendants, and therefore convey an immediate feeling of complexity. With customers moving to consumer-grade enterprise software, it is crucial to make sophisticated features available on touch devices and smaller screens. This paper evaluates a UI component that allows users to scroll through all density levels using a slider overlaid on top of the hierarchy table, performing several actions to focus on one set of objects at any point in time. The overlay component also solves the problem of excessive horizontal scrolling of the entire table in a fixed pane for a hierarchical table. The component can be customized to navigate through parents only, siblings only, or a specific part of the hierarchy. The UI component was evaluated by end users of the application and human-computer interaction (HCI) experts to test its usability, with statistical results and recommendations for handling complex hierarchical data visualizations.

Keywords: building information modeling, digital twin, navigation, UI component, user interface, usability, visualization

Procedia PDF Downloads 111
710 Cognitive Methods for Detecting Deception During the Criminal Investigation Process

Authors: Laid Fekih

Abstract:

Background: It is difficult to detect lying, deception, and misrepresentation just by looking at verbal or non-verbal expression during the criminal investigation process, despite the common belief that it is possible to tell whether a person is lying or telling the truth simply by the way they act or behave. The process of detecting lies and deception during criminal investigation needs more study and research to overcome the difficulties facing investigators. Method: The present study aimed to identify the effectiveness of cognitive methods and techniques in detecting deception during criminal investigation. It adopted a quasi-experimental method and covered a sample of 20 defendants distributed randomly into two homogeneous groups: an experimental group of 10 defendants subjected to criminal investigation with cognitive deception-detection techniques, and a second group of 10 defendants subjected to the direct investigation method. The tool used was a guided interview based on models of investigative questions following the cognitive deception-detection approach, which consists of three of Vrij’s techniques: imposing cognitive load, encouraging the interviewee to provide more information, and asking unexpected questions; the comparison condition was the direct investigation method. Results: The results revealed a significant difference between the two groups in lie-detection accuracy in favour of the defendants investigated with cognitive techniques; the cognitive approach produced superior total accuracy rates both with human observers and through an analysis of objective criteria. The cognitive deception-detection approach produced superior accuracy in truth detection (71%) and deception detection (70%) compared with the direct investigation method (truth detection: 52%; deception detection: 49%). Conclusion: The study recommends the cognitive deception-detection technique: practitioners who use it will correctly classify more individuals than when they use the direct investigation method.

Keywords: the cognitive lie detection approach, deception, criminal investigation, mental health

Procedia PDF Downloads 42
709 Experimental Analysis of Tuned Liquid Damper (TLD) for High Raised Structures

Authors: Mohamad Saberi, Arash Sohrabi

Abstract:

The tuned liquid damper is one of the passive structural control methods that has been used since the mid-1980s for seismic control in civil engineering. The system consists of one or more tanks filled with fluid, usually water, installed on top of a high-rise structure to suppress structural vibration. In this article, we show how a shaking-table setup containing a TLD system was built and analyse the results of using this system on our structure. The results imply that when the frequency ratio approaches 1, the system performs at its best in both dissipating energy and increasing structural damping. The results of this series of experiments also prove compatible with Hunzer’s linear theory of behaviour.

Keywords: TLD, seismic table, structural system, Hunzer linear behaviour

Procedia PDF Downloads 299
708 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance and by comparison with conventional methods. However, scope for development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets whose attribute values are contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with rule length. The second problem is then examined in a simulation experiment utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values and hence is applicable to real-world data.
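
As an illustration of the statistical-test idea behind inducing rules from a decision table (though not STRIM’s exact test statistic), the sketch below checks whether the class frequency under a candidate if-then rule significantly exceeds the base rate, using a one-sided binomial test on a toy decision table.

```python
import pandas as pd
from scipy.stats import binomtest

# Toy decision table: condition attributes c1, c2 and decision attribute d.
table = pd.DataFrame({
    "c1": [1, 1, 1, 1, 2, 2, 2, 3, 3, 3, 3, 3],
    "c2": [1, 1, 2, 2, 1, 2, 2, 1, 1, 2, 2, 2],
    "d":  [1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 1],
})

def rule_significance(table, condition, decision_value, alpha=0.05):
    """Test a candidate rule 'if condition then d = decision_value' by asking
    whether the class frequency under the condition exceeds the base rate."""
    covered = table.query(condition)
    n = len(covered)
    k = int((covered["d"] == decision_value).sum())
    base_rate = (table["d"] == decision_value).mean()
    result = binomtest(k, n, base_rate, alternative="greater")
    return {"support": n, "confidence": k / n, "p_value": result.pvalue,
            "significant": result.pvalue < alpha}

print(rule_significance(table, "c1 == 1", decision_value=1))
```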

Keywords: rule induction, decision table, missing data, noise

Procedia PDF Downloads 365
707 Variations in Water Supply and Quality in Selected Groundwater Sources in a Part of Southwest Nigeria

Authors: Samuel Olajide Babawale, O. O. Ogunkoya

Abstract:

The study mapped selected wells in Inisa town, Osun State, in the guinea savanna region of southwest Nigeria, and determined the water quality with respect to certain elements. It also assessed the variation in the elevation of the water table surface relative to the depth of the wells in the months of August and November. This was done with a view to determining the level of contamination of the water with respect to land use and anthropogenic activities, and the variation in the quantity of well water between the rainy season and the start of the dry season. The results show a random spatial pattern of the mapped wells and a shallow water table in the study area. The temporal changes in elevation show no significant variation in the depth of the water table surface over the study period, implying that a sufficient amount of water is available to the town all year round. The analysed samples also show a high concentration of sodium compared to the other elements considered, which include iron, copper, calcium, and lead. This is attributed mainly to anthropogenic activities, through the disposal of waste at landfill sites. The low concentration of lead is a good indication of a reduced level of pollution.

Keywords: anthropogenic activities, land use, temporal changes, water quality

Procedia PDF Downloads 113
706 Developing an Intelligent Table Tennis Ball Machine with Human Play Simulation for Technical Training

Authors: Chen-Chi An, Jun-Yi He, Cheng-Han Hsieh, Chen-Ching Ting

Abstract:

This research has successfully developed an intelligent table tennis ball machine with human play simulation that can reproduce all the serving situations of human play. It is well known that an excellent ball machine can help the table tennis coach teach more efficiently, and it also gives players good technical training and entertainment. An excellent ball machine should be able to serve all balls based on human play simulation, since conventional competitions are played between people. In this work, two counter-rotating wheels are used to serve the balls, where changing the absolute rotating speeds of the two wheels and the difference in rotating speed between them adjusts the striking force and the rotating speed (spin) of the ball. The relationships between the absolute rotating speeds of the two wheels and the striking force of the ball, and between the speed difference of the two wheels and the spin of the ball, were determined experimentally. The outlet speed, the ejected distance, and the spin of the ball were measured while changing the absolute rotating speeds of the two wheels over a series of speed differences in order to calibrate the ball machine; the outlet speed and ejected distance were then converted into the striking force of the ball. In operation, the balls served by the intelligent ball machine are based on the acquired calibration curves with the help of a computer. The experiments used photosensitive devices to detect the outlet speed and spin of the ball. Finally, this research developed teaching programs for technical training using three ball machines and achieved more efficient training.
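
Under an idealized no-slip contact assumption (not stated in the abstract), the two wheel surface speeds v₁ and v₂ at the contact points determine the launch speed and spin of a ball of radius r as

\[ v_{ball} \approx \frac{v_1 + v_2}{2}, \qquad \omega \approx \frac{v_1 - v_2}{2r}, \]

so equal wheel speeds give a fast ball with little spin, while a speed difference adds topspin or backspin; the experimentally determined calibration curves account for the slip and deformation that this ideal ignores.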

Keywords: table tennis, ball machine, human play simulation, counter-rotating wheels

Procedia PDF Downloads 399
705 Design Analysis of Tilting System for Spacecraft Transportation

Authors: P. Naresh, Amir Iqbal

Abstract:

Satellite transportation is an inevitable step in the course of integration, testing, and launch. Large satellites are transported in horizontal mode due to constraints on commercially available cargo bay dimensions and on-road obstacles. To facilitate the transportation of larger spacecraft in horizontal mode, a tilting system has been realized. The tilting system consists of a tilt table, columns, hinge pins, angular contact bearings, a slewing bearing, and linear actuators. The tilting system is very compact and easy to use; however, since it also serves the purpose of a fixture, it is of immense interest to know the stress and fundamental frequency of the system in the transportation configuration. This paper discusses the design aspects and finite element analysis of the tilting system-cum-fixture using Hypermesh/Nastran.

Keywords: tilt table, column, slewing bearing, stress, modal analysis

Procedia PDF Downloads 552
704 Symmetric Arabic Language Encryption Technique Based on Modified Playfair Algorithm

Authors: Fairouz Beggas

Abstract:

Due to the large number of exchanges over networks, the security of communications is essential, and most ways of keeping communication secure rely on encryption. In this work, a symmetric encryption technique is offered to encrypt and decrypt simple Arabic scripts based on multi-level security. The proposed technique uses the idea of Playfair encryption with a larger table size and an additional layer of encryption to ensure more security. The algorithm generates a dynamic table that depends on a secret key; the same secret key is also used to create further secret keys to over-encrypt the plaintext in three steps. The results obtained show that the proposed algorithm is faster in terms of encryption/decryption speed and can resist many types of attacks.
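
A minimal sketch of the key-dependent (dynamic) table idea on a classic 5x5 Latin-alphabet Playfair square is given below; the paper’s larger Arabic table, key-derived sub-keys, and additional encryption layers are not reproduced, and digraph preparation (padding, splitting doubles) is omitted.

```python
import string

def build_dynamic_table(key, size=5):
    """Key-dependent Playfair square: key letters first, then the rest of the
    alphabet (I/J merged for the classic 5x5 case)."""
    alphabet = string.ascii_uppercase.replace("J", "")
    seen, order = set(), []
    for ch in key.upper().replace("J", "I") + alphabet:
        if ch in alphabet and ch not in seen:
            seen.add(ch)
            order.append(ch)
    return [order[i * size:(i + 1) * size] for i in range(size)]

def encrypt_pair(table, a, b):
    """Classic Playfair digraph rules applied on the dynamic table."""
    pos = {ch: (r, c) for r, row in enumerate(table) for c, ch in enumerate(row)}
    (ra, ca), (rb, cb) = pos[a], pos[b]
    n = len(table)
    if ra == rb:                                   # same row: shift right
        return table[ra][(ca + 1) % n] + table[rb][(cb + 1) % n]
    if ca == cb:                                   # same column: shift down
        return table[(ra + 1) % n][ca] + table[(rb + 1) % n][cb]
    return table[ra][cb] + table[rb][ca]           # rectangle: swap columns

table = build_dynamic_table("SECRETKEY")
print(encrypt_pair(table, "H", "I"))
```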

Keywords: arabic data, encryption, playfair, symmetric algorithm

Procedia PDF Downloads 54
703 A Novel Approach to Design and Implement Context Aware Mobile Phone

Authors: G. S. Thyagaraju, U. P. Kulkarni

Abstract:

Context-aware computing refers to a general class of computing systems that can sense their physical environment and adapt their behaviour accordingly. Context-aware computing makes systems aware of situations of interest, enhances services to users, automates systems, and personalizes applications. Context-aware services have been introduced into mobile devices such as PDAs and mobile phones. In this paper, we present a novel approach used to realize a context-aware mobile phone. The context-aware mobile phone (CAMP) proposed in this paper senses the user's situation automatically and provides the services required by that context. The proposed system is developed using artificial intelligence techniques such as Bayesian networks, fuzzy logic, and a rough-set-theory-based decision table: a Bayesian network classifies incoming calls (high priority, low priority, and unknown), fuzzy linguistic variables and membership degrees define the context situations, and decision-table-based rules drive service recommendation. To exemplify and demonstrate the effectiveness of the proposed methods, the context-aware mobile phone is tested in a college campus scenario covering different locations, including the library, classroom, meeting room, administrative building, and college canteen.
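
A minimal sketch of the decision-table layer alone is given below; the rules and contexts are invented for illustration, and the Bayesian call classifier and fuzzy membership functions described above are not reproduced.

```python
# Illustrative decision table: (location, call_priority) -> recommended action.
# The rules below are invented for the sketch, not taken from the paper.
DECISION_TABLE = {
    ("library", "high"): "vibrate and show caller on screen",
    ("library", "low"): "silence and send to voicemail",
    ("classroom", "high"): "vibrate and auto-reply 'in class'",
    ("classroom", "low"): "silence and log missed call",
    ("meeting_room", "high"): "vibrate and auto-reply 'in a meeting'",
    ("canteen", "high"): "ring normally",
    ("canteen", "low"): "ring at low volume",
}

def recommend_service(location, call_priority):
    """Look up the context pair; fall back to a default rule for unknown contexts."""
    return DECISION_TABLE.get((location, call_priority), "ring normally")

print(recommend_service("library", "high"))
print(recommend_service("administrative_building", "low"))  # falls back to the default
```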

Keywords: context aware mobile, fuzzy logic, decision table, Bayesian probability

Procedia PDF Downloads 338
702 A Conglomerate of Multiple Optical Character Recognition Table Detection and Extraction

Authors: Smita Pallavi, Raj Ratn Pranesh, Sumit Kumar

Abstract:

Representing information as tables is a compact and concise method that eases searching, indexing, and storage requirements. Extracting and cloning tables from parsable documents is easier and widely used; however, industry still faces challenges in detecting and extracting tables from OCR (Optical Character Recognition) documents or images. This paper proposes an algorithm that detects and extracts multiple tables from an OCR document. The algorithm uses a combination of image processing techniques, text recognition, and procedural coding to identify distinct tables in the same image and map the text to the corresponding cell in a dataframe, which can then be stored as comma-separated values, a database, an Excel file, and multiple other usable formats.
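
A minimal sketch of one common pipeline for ruled tables is given below (morphological extraction of the ruling lines with OpenCV, then cell-by-cell OCR with pytesseract); the kernel sizes, thresholds, and file name are illustrative, and this is not the paper’s exact algorithm.

```python
import cv2
import numpy as np
import pandas as pd
import pytesseract

def extract_ruled_table(image_path):
    """Detect a ruled table via morphological line extraction, then OCR each cell."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    binary = cv2.adaptiveThreshold(cv2.bitwise_not(gray), 255,
                                   cv2.ADAPTIVE_THRESH_MEAN_C, cv2.THRESH_BINARY, 15, -2)
    # Opening with long thin kernels keeps only horizontal / vertical ruling lines.
    h_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (max(gray.shape[1] // 30, 1), 1))
    v_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (1, max(gray.shape[0] // 30, 1)))
    grid = cv2.add(cv2.morphologyEx(binary, cv2.MORPH_OPEN, h_kernel),
                   cv2.morphologyEx(binary, cv2.MORPH_OPEN, v_kernel))
    # Each enclosed region between ruling lines is a candidate cell.
    contours, _ = cv2.findContours(grid, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours
             if 20 < cv2.boundingRect(c)[2] < gray.shape[1] - 10]
    # Group cells into rows by their y coordinate, then OCR left-to-right.
    rows, current, last_y = [], [], None
    for x, y, w, h in sorted(boxes, key=lambda b: (b[1], b[0])):
        if last_y is not None and abs(y - last_y) > 10:
            rows.append(current)
            current = []
        current.append(pytesseract.image_to_string(gray[y:y + h, x:x + w]).strip())
        last_y = y
    if current:
        rows.append(current)
    return pd.DataFrame(rows)

# df = extract_ruled_table("scanned_page.png")   # hypothetical input file
# df.to_csv("table.csv", index=False)
```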

Keywords: table extraction, optical character recognition, image processing, text extraction, morphological transformation

Procedia PDF Downloads 117
701 Semi-Supervised Learning Using Pseudo F Measure

Authors: Mahesh Balan U, Rohith Srinivaas Mohanakrishnan, Venkat Subramanian

Abstract:

Positive and unlabeled (PU) learning has recently gained attention in both academic and industry research because of its relevance to existing business problems. Yet there remain challenges in validating the performance of PU learning, as the actual label of unlabeled data points is unknown, in contrast to binary classification, where the truth is known. In this study, we propose a novel PU learning technique based on the pseudo-F measure to address this research gap. In this approach, we train the PU model to discriminate the probability distributions of the positive and unlabeled examples in the validation and spy data. The predicted probabilities of the PU model undergo a two-fold validation: (a) the predicted probabilities of reliable positives and predicted positives should come from the same distribution; (b) the predicted probabilities of predicted positives and predicted unlabeled examples should come from different distributions. We experimented with this approach on a credit marketing case study at one of the world’s biggest fintech platforms, benchmarked its performance, and backtested it using historical data. This study contributes to the existing literature on semi-supervised learning.
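
A minimal sketch of the two-fold distribution check (not the pseudo-F measure itself) is given below, using a two-sample Kolmogorov–Smirnov test on the predicted probabilities; the score distributions are simulated.

```python
import numpy as np
from scipy.stats import ks_2samp

def two_fold_validation(p_reliable_pos, p_pred_pos, p_pred_unlabeled, alpha=0.05):
    """Check (a) reliable positives vs. predicted positives look alike and
    (b) predicted positives vs. predicted unlabeled look different."""
    same = ks_2samp(p_reliable_pos, p_pred_pos)
    diff = ks_2samp(p_pred_pos, p_pred_unlabeled)
    return {
        "positives_match": same.pvalue > alpha,      # (a) fail to reject sameness
        "unlabeled_separated": diff.pvalue < alpha,  # (b) reject sameness
        "ks_same": same.statistic, "ks_diff": diff.statistic,
    }

rng = np.random.default_rng(1)
scores_reliable = rng.beta(8, 2, 500)    # toy PU-model scores for reliable positives
scores_pred_pos = rng.beta(7.5, 2, 500)  # toy scores for predicted positives
scores_unlabeled = rng.beta(2, 5, 500)   # toy scores for predicted unlabeled
print(two_fold_validation(scores_reliable, scores_pred_pos, scores_unlabeled))
```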

Keywords: PU learning, semi-supervised learning, pseudo f measure, classification

Procedia PDF Downloads 203
700 Dietary Exposure to Pesticide Residues by Various Physiological Groups of Population in Andhra Pradesh, South India

Authors: Padmaja R. Jonnalagadda

Abstract:

A dietary exposure assessment of fifteen pesticide residues was carried out in Andhra Pradesh. Twelve commonly consumed foods, including water, which were representative of the diet, were collected, processed as table-ready, and analysed for various organochlorines, organophosphates, and synthetic pyrethroids. All the samples were contaminated with one or more of the 15 pesticide residues, and all concentrations were within the MRLs. DDT and its isomers, chlorpyriphos, and cypermethrin were frequently detected in many of the food samples. The mean concentrations of the pesticide residues ranged from 0.02 μg kg⁻¹ to 5.1 μg kg⁻¹ (fresh weight) in the analysed foods. When exposure assessments were carried out for different age, sex, and physiological groups, the estimated daily dietary intakes of the analysed pesticide residues were found to be much lower than violative levels in all of the age groups computed.
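
The abstract does not spell out its intake formula, but the standard estimated daily intake (EDI) computation it implies, for a physiological group g, is

\[ EDI_g = \frac{\sum_i C_i \, F_{i,g}}{BW_g}, \]

where C_i is the mean residue concentration in food i, F_{i,g} the daily consumption of that food by group g, and BW_g the group’s body weight; the EDI is then compared against the acceptable daily intake for each pesticide.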

Keywords: table ready foods, pesticide residues, dietary intake, physiological groups, risk

Procedia PDF Downloads 487
699 Tailoring Polycrystalline Diamond for Increasing Earth-Drilling Challenges

Authors: Jie Chen, Chris Cheng, Kai Zhang

Abstract:

Polycrystalline diamond compact (PDC) cutters, with a polycrystalline diamond (PCD) table supported by a cemented tungsten carbide substrate, have been widely used in earth-drilling tools in the oil and gas industry. Both wear and impact resistance are key figures of merit for PDC cutters, and they are closely related to the microstructure of the PCD table. As oil and gas exploration enters deeper, harder, and more complex formations, and with the increasing requirements for faster downhole drilling and lower drilling cost, current PDC cutters face unprecedented challenges in maintaining a longer drilling life than ever. Excessive wear on uneven hard formations, spalling, chipping, and premature fracture due to impact loads are common failure modes of PDC cutters in the field. Tailoring the microstructure of the PCD table is one of the effective approaches to improving the wear and impact resistance of PDC cutters, along with other factors such as cutter geometry and bit design. In this research, the cross-sectional microstructure, fracture surface, wear surface, and elemental composition of PDC cutters were analyzed using scanning electron microscopy (SEM) with both backscattered electron and secondary electron detectors, and energy dispersive X-ray spectroscopy (EDS). The microstructure and elemental composition were further correlated with the wear and impact resistance of the corresponding PDC cutters. Wear modes and impact toughening mechanisms of state-of-the-art PDCs were identified, and directions to further improve the wear and impact resistance of PDC cutters were proposed.

Keywords: fracture surface, microstructure, polycrystalline diamond, PDC, wear surface

Procedia PDF Downloads 25
698 The Implementation of the Javanese Lettered-Manuscript Image Preprocessing Stage Model on the Batak Lettered-Manuscript Image

Authors: Anastasia Rita Widiarti, Agus Harjoko, Marsono, Sri Hartati

Abstract:

This paper presents the results of a study testing whether the Javanese character manuscript image preprocessing model, which has been more widely applied, can also be applied to segment Batak character manuscripts. The treatment process begins by converting the input image into a binary image. After the binary image is cleaned of noise, line segmentation using the projection profile is conducted. If the projection histogram is unclear, a smoothing process is applied before the line segment indexes are produced. For each line image produced, character segmentation within the line is then applied, taking into account the connectivity between the pixels making up the letters so that no characters are truncated. Testing of the manuscript preprocessing system prototype showed that the system accuracy on pieces of the Pustaka Batak Podani Ma AjiMamisinon manuscript ranged from 65% to 87.68%, with a confidence level of 95%. This accuracy indicates that the preprocessing model for Javanese character manuscript images can also be applied to images of Batak character manuscripts.
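
A minimal sketch of the projection-profile line segmentation step is given below; the smoothing window and minimum line height are illustrative, and the subsequent connected-component character segmentation is not shown.

```python
import numpy as np

def segment_lines(binary_image, smooth_window=5, min_rows=3):
    """Split a cleaned binary manuscript image (text = 1) into line images
    using the horizontal projection profile."""
    profile = binary_image.sum(axis=1).astype(float)
    if smooth_window > 1:                                   # smooth unclear histograms
        kernel = np.ones(smooth_window) / smooth_window
        profile = np.convolve(profile, kernel, mode="same")
    is_text = profile > 0
    lines, start = [], None
    for row, flag in enumerate(is_text):
        if flag and start is None:
            start = row
        elif not flag and start is not None:
            if row - start >= min_rows:
                lines.append(binary_image[start:row])
            start = None
    if start is not None:
        lines.append(binary_image[start:])
    return lines

# Toy image: two "text lines" of foreground pixels separated by blank rows.
img = np.zeros((30, 50), dtype=np.uint8)
img[4:9, 5:45] = 1
img[18:24, 5:45] = 1
print([line.shape for line in segment_lines(img)])
```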

Keywords: connected component, preprocessing, manuscript image, projection profiles

Procedia PDF Downloads 372
697 Developing a Relational Database Management System (RDBMS) Supporting Product Life Cycle Applications

Authors: Yusri Yusof, Chen Wong Keong

Abstract:

This paper presents the implementation details of a relational database management system for a STEP-technology product model repository. It is able to support the implementation of any EXPRESS language schema, although it has been implemented primarily to support mechanical product life cycle applications. The database supports the input of the STEP Part 21 file format from CAD, in geometrical and topological data form, and supports a range of queries for mechanical product life cycle applications. The proposed relational database management system uses the entity-to-table mapping method (R1) rather than the type-to-table method (R4); the two mapping methods have their own strengths and drawbacks.
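
A minimal sketch of the entity-to-table idea, assuming a greatly simplified schema (one relational table per EXPRESS entity, with references between instances kept as foreign keys on the instance identifiers), is given below using SQLite; the entities and query are illustrative, not the repository’s actual DDL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Entity-to-table mapping: one table per EXPRESS entity (simplified illustration).
cur.executescript("""
CREATE TABLE product              (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE shape_representation (id INTEGER PRIMARY KEY,
                                   product_id INTEGER REFERENCES product(id),
                                   context TEXT);
CREATE TABLE cartesian_point      (id INTEGER PRIMARY KEY,
                                   rep_id INTEGER REFERENCES shape_representation(id),
                                   x REAL, y REAL, z REAL);
""")

# Load a few instances, as a STEP Part 21 importer might.
cur.execute("INSERT INTO product VALUES (1, 'bracket')")
cur.execute("INSERT INTO shape_representation VALUES (10, 1, 'design')")
cur.executemany("INSERT INTO cartesian_point VALUES (?, ?, ?, ?, ?)",
                [(100, 10, 0.0, 0.0, 0.0), (101, 10, 25.0, 0.0, 5.0)])

# A life-cycle query: all geometry points belonging to a given product.
cur.execute("""SELECT p.name, c.x, c.y, c.z
               FROM product p
               JOIN shape_representation s ON s.product_id = p.id
               JOIN cartesian_point c ON c.rep_id = s.id
               WHERE p.name = 'bracket'""")
print(cur.fetchall())
```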

Keywords: RDBMS, CAD, ISO 10303, part-21 file

Procedia PDF Downloads 508
696 Experimental Evaluation of Foundation Settlement Mitigations in Liquefiable Soils using Press-in Sheet Piling Technique: 1-g Shake Table Tests

Authors: Md. Kausar Alam, Ramin Motamed

Abstract:

The damaging effects of liquefaction-induced ground movements have been frequently observed in past earthquakes, such as the 2010-2011 Canterbury Earthquake Sequence (CES) in New Zealand and the 2011 Tohoku earthquake in Japan. To reduce the consequences of soil liquefaction at shallow depths, various ground improvement techniques have been utilized in engineering practice, among which this research is focused on experimentally evaluating the press-in sheet piling technique. The press-in sheet pile technique eliminates the vibration, hammering, and noise pollution associated with dynamic sheet pile installation methods. Unfortunately, there are limited experimental studies on the press-in sheet piling technique for liquefaction mitigation using 1g shake table tests in which all the controlling mechanisms of liquefaction-induced foundation settlement, including sand ejecta, can be realistically reproduced. In this study, a series of moderate scale 1g shake table experiments were conducted at the University of Nevada, Reno, to evaluate the performance of this technique in liquefiable soil layers. First, a 1/5 size model was developed based on a recent UC San Diego shaking table experiment. The scaled model has a density of 50% for the top crust, 40% for the intermediate liquefiable layer, and 85% for the bottom dense layer. Second, a shallow foundation is seated atop an unsaturated sandy soil crust. Third, in a series of tests, a sheet pile with variable embedment depth is inserted into the liquefiable soil using the press-in technique surrounding the shallow foundations. The scaled models are subjected to harmonic input motions with amplitude and dominant frequency properly scaled based on the large-scale shake table test. This study assesses the performance of the press-in sheet piling technique in terms of reductions in the foundation movements (settlement and tilt) and generated excess pore water pressures. In addition, this paper discusses the cost-effectiveness and carbon footprint features of the studied mitigation measures.

Keywords: excess pore water pressure, foundation settlement, press-in sheet pile, soil liquefaction

Procedia PDF Downloads 72
695 Tomato Peels Prevented Margarine and Soya/Sunflower Oils Oxidation

Authors: S. Zidani, A. Benakmoum, A. Mansouri, A. Ammouche

Abstract:

In this research paper, we studied the oxidative stability of table margarine and soya/sunflower oils enriched in lycopene with tomato peel powder (TPP). For this, 1%, 2%, and 3% (w/w) TPP was added to the oil used in margarine manufacture. The chromatic characteristics of the margarine and the soya/sunflower oil were studied using the tristimulus colorimetry method. The main point of the research was to determine the antioxidant activity and oxidative resistance of the soya/sunflower oils and the margarine with TPP (peroxide index, TBA index, and Rancimat test). The sensory and textural properties and overall acceptability of the margarine and oil were good, indicating that TPP can be added to oil to produce a margarine enriched in lycopene with excellent oxidative stability.

Keywords: tomato peel powder, lycopene, table margarine, soya/sunflower oils, antioxidant activity, stability oxidative

Procedia PDF Downloads 270
694 Application of Stochastic Models on the Portuguese Population and Distortion to Workers Compensation Pensioners Experience

Authors: Nkwenti Mbelli Njah

Abstract:

This research was motivated by a project requested by AXA on the topic of pensions payable under the workers' compensation (WC) line of business. There are two types of pensions: the compulsorily recoverable and the not compulsorily recoverable. A pension is compulsorily recoverable for a victim when there is less than 30% disability and the pension amount per year is less than six times the minimum national salary. The law defines that the mathematical provisions for compulsorily recoverable pensions must be calculated by applying the following bases: mortality table TD88/90 and an interest rate of 5.25% (possibly with a management loading). Managing pensions which are not compulsorily recoverable is a more complex task because the technical bases are not defined by law and much more complex computations are required. In particular, companies have to predict the discounted amount of payments, reflecting the mortality effect, for all pensioners (a task monitored monthly at AXA). The purpose of this research was thus to develop a stochastic model for the future mortality of the workers' compensation pensioners of both the Portuguese market and the AXA portfolio. Not only is past mortality modelled; projections of future mortality are also made for the general population of Portugal as well as for the two portfolios mentioned earlier. The global model was split into two parts: a stochastic model for population mortality which allows for forecasts, combined with a point estimate from a portfolio mortality model obtained through three different relational models (Cox proportional, Brass linear and Workgroup PLT). The one-year death probabilities for ages 0-110 for the period 2013-2113 are obtained for the general population and the portfolios. These probabilities are used to compute different life table functions as well as the not compulsorily recoverable reserves, under each of the models, for the pensioners, their spouses, and children under 21. The results obtained are compared with the not compulsorily recoverable reserves computed using the static mortality table (TD 73/77) currently being used by AXA, to see the impact on this reserve if AXA adopted the dynamic tables.
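
A minimal sketch of the core life-table function behind such reserves (the expected present value of a life annuity built from one-year death probabilities) is given below; the mortality rates are hypothetical, not TD88/90 or the projected tables of the study, and the 5.25% rate is simply the one quoted above for compulsorily recoverable pensions.

```python
import numpy as np

def annuity_epv(qx, age, annual_pension, rate=0.0525, max_age=110):
    """Expected present value of an annual pension paid at the end of each year
    while the pensioner (currently `age`) survives."""
    future_q = np.clip(qx[age:max_age], 0.0, 1.0)
    survival = np.cumprod(1.0 - future_q)            # t-year survival probabilities
    discount = (1.0 + rate) ** -np.arange(1, len(survival) + 1)
    return annual_pension * float(np.sum(survival * discount))

# Hypothetical Gompertz-like one-year death probabilities for ages 0..110.
ages = np.arange(0, 111)
qx = np.minimum(0.0003 * np.exp(0.09 * ages), 1.0)

print(f"EPV of a 5,000/year pension for a 45-year-old: {annuity_epv(qx, 45, 5000):,.0f}")
```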

Keywords: compulsorily recoverable, life table functions, relational models, worker’s compensation pensioners

Procedia PDF Downloads 141
693 Prevalence of Dengue in Sickle Cell Disease in Pre-school Children

Authors: Nikhil A. Gavhane, Sachin Shah, Ishant S. Mahajan, Pawan D. Bahekar

Abstract:

Introduction: Millions of people are affected by dengue fever every year, which drives up healthcare expenses in many low-income countries, and organ failure and other serious symptoms may result. Sickle cell anaemia is another worldwide public health problem, most prevalent in Africa, the Caribbean, and Europe. Dengue epidemics have reportedly occurred in locations with a high frequency of sickle cell disease, compounding the health problems in these areas. Aims and Objectives: This study examines dengue infection in pre-school children with sickle cell disease. Method: This retrospective cohort study examined paediatric patients: young people with sickle cell disease (SCD) and dengue infection, and a control group without SCD or dengue. Data on demographics, SCD consequences, medical treatments, and laboratory findings were gathered to analyse the influence of SCD on dengue severity and clinical outcomes, classified as severe or non-severe by the 2009 WHO classification. The duration of acute illness was estimated from the onset of fever or the symptoms at admission. Results: Table 1 compares the features of dengue episodes by haemoglobin genotype (SS, SC, and controls). Table 2 shows that severe dengue cases are older, have longer delays to admission, and present particular symptoms. The multivariate analysis in Table 3 indicates a strong association of the SS genotype with severe dengue, multiorgan failure, and acute pulmonary complications. Table 4 relates severe dengue to higher white blood cell counts, anaemia, elevated liver enzymes, and reduced lactate dehydrogenase. Conclusion: This study is valuable but confined to hospitalised dengue patients with sickle cell disease, and the small cohorts limit comparisons. Further study is needed, since some findings contradict expectations.

Keywords: dengue, chills, headache, severe myalgia, vomiting, nausea, prostration

Procedia PDF Downloads 37