Search results for: heading time
15067 Comparison of Susceptibility to Measles in Preterm Infants versus Term Infants
Authors: Joseph L. Mathew, Shourjendra N. Banerjee, R. K. Ratho, Sourabh Dutta, Vanita Suri
Abstract:
Background: In India and many other developing countries, a single dose of measles vaccine is administered to infants at 9 months of age. This is based on the assumption that maternal transplacentally transferred antibodies will protect infants until that age. However, our previous data showed that most infants lose maternal anti-measles antibodies before 6 months of age, making them susceptible to measles before vaccination at 9 months. Objective: This prospective study was designed to compare susceptibility in pre-term vs. term infants at different time points. Material and Methods: Following Institutional Ethics Committee approval and a formal informed consent process, venous blood was drawn from a cohort of 45 consecutive term infants and 45 consecutive pre-term infants (both groups delivered by the vaginal route) at birth, 3 months, 6 months and 9 months (prior to measles vaccination). Serum was separated and anti-measles IgG antibody levels were measured by quantitative ELISA kits (with sensitivity and specificity > 95%). Susceptibility to measles was defined as an antibody titre < 200 mIU/ml. The mean antibody levels were compared between the two groups at the four time points. Results: The mean gestation of term babies was 38.5±1.2 weeks, and of pre-term babies 34.7±2.8 weeks. The respective mean birth weights were 2655±215g and 1985±175g. A reliable maternal vaccination record was available in only 7 of the 90 mothers. Mean anti-measles IgG antibody (±SD) in term babies was 3165±533 IU/ml at birth, 1074±272 IU/ml at 3 months, 314±153 IU/ml at 6 months, and 68±21 IU/ml at 9 months. The corresponding levels in pre-term babies were 2875±612 IU/ml, 948±377 IU/ml, 265±98 IU/ml, and 72±33 IU/ml (p > 0.05 for all inter-group comparisons). The proportion of susceptible term infants at birth, 3 months, 6 months and 9 months was 0%, 16%, 67% and 96%.
The corresponding proportions in the pre-term infants were 0%, 29%, 82%, and 100% (p > 0.05 for all inter-group comparisons). Conclusion: The majority of infants are susceptible to measles before 9 months of age, suggesting the need to bring measles vaccination forward; however, there was no statistically significant difference between the proportions of susceptible term and pre-term infants at any of the four time points. A larger study is required to confirm these findings and to compare sero-protection if vaccination is brought forward to between 6 and 9 months.
Keywords: measles, preterm, susceptibility, term infant
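The inter-group comparisons reported above (p > 0.05 at each time point) can be illustrated with a standard two-proportion z-test. This is a sketch, not the authors' analysis: the counts 30/45 and 37/45 are reconstructed from the quoted 67% and 82% at the 6-month time point, and the pooled-variance normal approximation is only one of several valid test choices.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal tail area
    return z, p_value

# 6-month time point: about 67% of 45 term and 82% of 45 pre-term infants
# susceptible; counts of 30 and 37 are reconstructed from those percentages.
z, p = two_proportion_z(30, 45, 37, 45)
```

With these reconstructed counts the difference does not reach the conventional 0.05 threshold, consistent with the abstract's "p > 0.05" at every time point.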
Procedia PDF Downloads 273
15066 Specification and Unification of All Fundamental Forces Exist in Universe in the Theoretical Perspective – The Universal Mechanics
Authors: Surendra Mund
Abstract:
At the beginning, force was defined mathematically by Sir Isaac Newton in his Principia Mathematica as F = dp/dt, in the form of his second law of motion. Newton also stated his universal law of gravitation in the same outstanding book. At the end of the 20th century and the beginning of the 21st, many attempts were made to specify and unify the four or five fundamental forces or interactions of the universe, but all of them failed, with gravity creating problems for unification every single time. In my previous papers and presentations, I defined and derived field and force equations for gravitation-like interactions for every kind of central system. This force, which I have named the Variational Force, is generated by variation in the scalar field density around the body. In this particular paper, I first specify which types of interactions are fundamental in the universal sense (that is, in all types of central systems or bodies predicted by my N-time Inflationary Model of the Universe) and then unify them within the universal framework (defined and derived by me as Universal Mechanics in a separate paper). This will also be valid in the universal dynamical sense, which includes inflations and deflations of the universe, central system relativity, universal relativity, ϕ-ψ transformation and transformation of spin, the physical perception principle, the Generalized Fundamental Dynamical Law, and many other important generalized principles of Generalized Quantum Mechanics (GQM) and Central System Theory (CST).
So, in this article, I first generalize some fundamental principles, then unify Variational Forces (the general form of gravitation-like interactions) and Flow Generated Forces (the general form of EM-like interactions), and then unify all fundamental forces by specifying the weak and strong interactions in terms of more basic interactions: variational, flow generated, and transformational.
Keywords: Central System Force, Disturbance Force, Flow Generated Forces, Generalized Nuclear Force, Generalized Weak Interactions, Generalized EM-Like Interactions, Imbalance Force, Spin Generated Forces, Transformation Generated Force, Unified Force, Universal Mechanics, Uniform and Non-Uniform Variational Interactions, Variational Interactions
Procedia PDF Downloads 50
15065 A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes
Authors: Tim Giesen, Raphael Adamietz, Pablo Mayer, Philipp Stiefel, Patrick Alle, Dirk Schlenker
Abstract:
The competitiveness and success of new electrical energy storage devices such as battery cells depend significantly on a short time-to-market. Producers who decide to supply new battery cells to the market need manufacturing that can easily adapt to early customers' needs in terms of cell size, materials, delivery time and quantity. In this initial state, the required output rates justify neither a fully automated manufacturing line nor handmade battery cells, and until now there was no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Thus, a concept for the flexible assembly of battery cells, in terms of cell format and output quantity, was developed by the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged or retrofitted in a short time frame according to the ordered product. The paper shows the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found by using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were then clustered by functional coherence into automation solutions, generating the individual process clusters. The result presented in this paper makes it possible to manufacture different cell products on the same production system using seven process clusters. The paper shows the solution for batch-wise flexible battery cell production using advanced process control. Further, the tests performed and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs to launch innovative cell products on the global market.
Keywords: automation, battery production, carrier, advanced process control, cyber-physical system
Procedia PDF Downloads 338
15064 Microstructure of Iranian Processed Cheese
Authors: R. Ezzati, M. Dezyani, H. Mirzaei
Abstract:
The effects of the concentration of trisodium citrate (TSC) emulsifying salt (0.25 to 2.75%) and holding time (0 to 20 min) on the textural, rheological, and microstructural properties of Iranian processed Cheddar cheese were studied using a central composite rotatable design. The loss tangent parameter (from small-amplitude oscillatory rheology), extent of flow, and melt area (from the Schreiber test) all indicated that the meltability of processed cheese decreased with increased concentration of TSC, and that longer holding time led to a slight reduction in meltability. Hardness increased as the concentration of TSC increased. Fluorescence micrographs indicated that the size of fat droplets decreased with an increase in the concentration of TSC and with longer holding times. Acid-base titration curves indicated that the buffering peak at pH 4.8, which is due to residual colloidal calcium phosphate, decreased as the concentration of TSC increased. The soluble phosphate content increased as the concentration of TSC increased; however, the insoluble Ca decreased with increasing concentration of TSC. The results of this study suggest that TSC chelated Ca from colloidal calcium phosphate and dispersed casein, while the citrate-Ca complex remained trapped within the processed cheese matrix. Increasing the concentration of TSC helped to improve fat emulsification and casein dispersion during cooking, both of which probably helped to reinforce the structure of the processed cheese.
Keywords: Iranian processed cheese, Cheddar cheese, emulsifying salt, rheology
Procedia PDF Downloads 444
15063 Functionality Based Composition of Web Services to Attain Maximum Quality of Service
Authors: M. Mohemmed Sha Mohamed Kunju, Abdalla A. Al-Ameen Abdurahman, T. Manesh Thankappan, A. Mohamed Mustaq Ahmed Hameed
Abstract:
Web service composition is an effective approach to completing web-based tasks with the desired quality. A single web service with limited functionality is inadequate to execute a specific task involving a series of actions, so multiple web services with different functionalities must be combined to reach the target. Composition becomes even more challenging when the candidate services come from different providers with identical functionalities but varying QoS; the overall QoS is therefore the major factor to consider while composing the web services. Moreover, the expected QoS is not always attained when the task is completed: a single web service in the composed chain may degrade the overall performance of the task, so aspects such as the functionality of each service deserve care during composition. Dynamic and automatic service composition is one of the main options available, but to achieve the intended functionality of the task, the quality of the individual web services is also important. Normally, the QoS of an individual service can be evaluated using non-functional parameters such as response time, throughput, reliability, and availability. At the same time, the QoS need not be at the same level for all of the composed services. This paper therefore proposes a framework for composing services in terms of QoS by assigning an appropriate weight to the non-functional parameters of each individual web service involved in the task. Experimental results show that weighting the non-functional parameters during composition definitely improves the performance of the composed web services.
Keywords: composition, non-functional parameters, quality of service, web service
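The weighted treatment of non-functional parameters described above can be sketched as follows. This is an illustrative sketch, not the paper's framework: the parameter names, weights and candidate services are invented, and response time is inverted so that every parameter contributes as a benefit.

```python
def qos_score(service, weights):
    """Weighted QoS score; 'response_time' is a cost (lower is better),
    the other parameters are benefits (higher is better)."""
    score = 0.0
    for name, weight in weights.items():
        value = service[name]
        if name == "response_time":
            value = 1.0 / (1.0 + value)  # invert the cost into a benefit
        score += weight * value
    return score

def compose(candidates_per_step, weights):
    """Pick the best-scoring candidate service for each step of the task."""
    return [max(step, key=lambda s: qos_score(s, weights))
            for step in candidates_per_step]

# Hypothetical weights and candidates, for illustration only.
weights = {"response_time": 0.4, "reliability": 0.3, "availability": 0.3}
step1 = [{"name": "A", "response_time": 0.2, "reliability": 0.9, "availability": 0.95},
         {"name": "B", "response_time": 0.1, "reliability": 0.7, "availability": 0.90}]
step2 = [{"name": "C", "response_time": 0.5, "reliability": 0.99, "availability": 0.99}]
chain = compose([step1, step2], weights)
```

Changing the weight vector changes which candidate wins each step, which is the mechanism the abstract describes for tailoring the composed chain to the task.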
Procedia PDF Downloads 333
15062 Identifying the Structural Components of Old Buildings from Floor Plans
Authors: Shi-Yu Xu
Abstract:
The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.
Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence
Procedia PDF Downloads 89
15061 Improved Multi–Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation
Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta
Abstract:
Nature-inspired algorithms have recently found widespread use in tough and time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One major application of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, where they minimize the adverse crosstalk effect of four-wave mixing (FWM). The simulation results show that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms for finding OGRs, in terms of ruler length, total optical channel bandwidth and computation time.
Keywords: channel allocation, conventional computing, four-wave mixing, nature-inspired algorithm, optimal Golomb ruler, Lévy flight distribution, optimization, improved multi-objective firefly algorithms, Pareto optimal
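The defining property that makes OGRs useful for channel allocation (every pairwise difference between marks is distinct, so every channel spacing is unique and FWM products fall outside the allocated channels) can be checked in a few lines. This sketch is illustrative and not part of the paper's firefly algorithm; the example rulers are well-known small cases.

```python
from itertools import combinations

def is_golomb(marks):
    """A ruler is a Golomb ruler if all pairwise differences between its
    marks are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

# The optimal order-4 ruler of length 6 has marks at 0, 1, 4, 6.
ok = is_golomb([0, 1, 4, 6])
# [0, 1, 2, 4] is not Golomb: the difference 1 occurs twice (1-0 and 2-1).
bad = is_golomb([0, 1, 2, 4])
```

A search algorithm such as the firefly variants in the paper would use a check like this as the feasibility test while minimizing ruler length and total bandwidth.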
Procedia PDF Downloads 322
15060 Building Safety Through Real-time Design Fire Protection Systems
Authors: Mohsin Ali Shaikh, Song Weiguo, Muhammad Kashan Surahio, Usman Shahid, Rehmat Karim
Abstract:
When a disaster threatens an area of a structure and endangers personal safety, the effectiveness of disaster prevention, evacuation, and rescue operations can be summarized by three assessment indicators: personal safety, property preservation, and attribution of responsibility. These indicators apply regardless of the disaster that affects the building. People need to get out of the hazardous area and reach a safe place as soon as possible, so the outcome of a tragedy is closely related to how quickly people are advised to evacuate and how quickly they are rescued. This study considers present fire prevention systems to address catastrophes and improve building safety. It proposes the methods of Prevention Level for Deployment in Advance and Spatial Transformation by Human-Machine Collaboration. We present and prototype a real-time fire protection system architecture for building disaster prevention, evacuation, and rescue operations. The design encourages the use of simulations to check the efficacy of evacuation, rescue, and disaster prevention procedures throughout the planning and design phase of the structure.
Keywords: prevention level, building information modeling, quality management system, simulated reality
Procedia PDF Downloads 69
15059 From Text to Data: Sentiment Analysis of Presidential Election Political Forums
Authors: Sergio V Davalos, Alison L. Watkins
Abstract:
User generated content (UGC) such as a website post has data associated with it: time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words). In addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences across all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis, yielding a sentiment score for each post. Based on the sentiment scores of the posts, there are significant differences between the content and sentiment of the 2012 and 2016 presidential election forums. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment. In the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time: for both elections, as the election drew closer, the cumulative sentiment score became negative. The winning candidate appeared in more posts than the losing candidate; in Trump's case these were predominantly negative posts, exceeding Clinton's highest number of posts, which were positive. KNIME topic modeling was used to derive topics from the posts. There were also changes in topics and keyword emphasis over time: initially the political parties were the most referenced, and as the election drew closer the emphasis shifted to the candidates.
The SASA method predicted sentiment better than four other methods in SentiBench. The research succeeded in deriving sentiment data from text; in combination with other data, the sentiment data provided insight and discovery about user sentiment in the 2012 and 2016 US presidential elections.
Keywords: sentiment analysis, text mining, user generated content, US presidential elections
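The cumulative sentiment trend described above (per-post scores summed in time order, turning negative as the election approaches) can be sketched as follows. The posts, scores and timestamps are invented for illustration; the SASA scoring model itself is not reproduced here.

```python
def cumulative_sentiment(scored_posts):
    """Running sum of per-post sentiment scores, taken in time order.
    Each element of scored_posts is a (timestamp, score) pair."""
    total, trend = 0.0, []
    for _, score in sorted(scored_posts):  # sort by timestamp
        total += score
        trend.append(total)
    return trend

# Hypothetical scored posts: early positivity giving way to negativity.
posts = [(1, 0.5), (2, 0.2), (3, -0.4), (4, -0.6)]
trend = cumulative_sentiment(posts)
```

Plotting such a running total against time is one simple way to see the sign flip the abstract reports for both election cycles.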
Procedia PDF Downloads 192
15058 System of Quality Automation for Documents (SQAD)
Authors: R. Babi Saraswathi, K. Divya, A. Habeebur Rahman, D. B. Hari Prakash, S. Jayanth, T. Kumar, N. Vijayarangan
Abstract:
Document automation is the design of systems and workflows that assemble repetitive documents to meet specific business needs. In any organization or institution, documenting employees' information is very important for both employees and management, as it shows an individual's progress to the management. Many employee documents exist only on paper, so they are difficult to organize, and retrieving the exact document for future reference takes considerable time. It is also tedious to generate reports on demand, and the approval process is even more difficult and lacks security. This project overcomes the above-stated issues. By storing the details in a database and maintaining e-documents, the automation system reduces manual work to a large extent. The approval process for important documents can then be carried out in a much more secure manner using digital signatures and encryption techniques. Details are maintained in the database, e-documents are stored in specific folders, and various kinds of reports can be generated. Moreover, an efficient search method is implemented over the database. Automation that supports document maintenance is useful in many respects: it minimizes data entry, reduces the time spent on proofreading, avoids duplication, and reduces the risks associated with manual error.
Keywords: e-documents, automation, digital signature, encryption
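The tamper-evidence idea behind the secured approval workflow can be sketched in a few lines. Note this is a stand-in, not the project's implementation: real digital signatures use asymmetric key pairs (e.g. RSA or ECDSA), whereas this stdlib-only sketch uses an HMAC over the document hash, and the key and document contents are invented.

```python
import hashlib
import hmac

def sign_document(doc_bytes, key):
    """Tag a document so later tampering is detectable. A true digital
    signature would use a private key; this sketch substitutes a keyed
    HMAC over the SHA-256 hash of the document."""
    digest = hashlib.sha256(doc_bytes).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_document(doc_bytes, key, tag):
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_document(doc_bytes, key), tag)

key = b"department-approval-key"                 # hypothetical shared key
doc = b"Leave application: 3 days, October"      # hypothetical e-document
tag = sign_document(doc, key)
```

Any edit to the stored e-document after approval changes the hash, so verification fails, which is the security property the abstract relies on.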
Procedia PDF Downloads 391
15057 Entrepreneurship as a Strategy for National Development and Attainment of Millennium Development Goals (MDGs)
Authors: Udokporo Emeka Leonard
Abstract:
The thrust of this paper is to examine how entrepreneurship can assist in the attainment of the first of the MDGs – the eradication of extreme poverty and hunger in Nigeria. The paper discusses how national development can be driven through employment creation and wealth generation, leading to a reduction in widespread poverty so as to attain one crucial target in fewer years. The task before Nigeria is certainly a herculean one; it is, in fact, a race against time. However, in view of the clear and present danger that the increasing rate of poverty portends for our democracy and our nation, it is a race we must run, for it is a time bomb on our hands. The paper has been structured into sections, with the introduction as section one. Section two discusses the concept of entrepreneurship; section three examines the link between entrepreneurship and economic development, while section four examines the challenges facing entrepreneurship in Nigeria. In section five, measures and recommendations are suggested to boost entrepreneurship that can drive economic development and translate into poverty reduction and employment creation in Nigeria. This work is a literature review informed by current trends and situations. It outlines some of the difficulties facing entrepreneurship in Nigeria, such as the operating environment, inadequate understanding and skewed incentives. It also makes recommendations on possible ways to significantly reduce poverty by 2015.
Keywords: development, entrepreneur, Nigeria, poverty
Procedia PDF Downloads 291
15056 Effect of Aqueous Enzymatic Extraction Parameters on the Moringa oleifera Oil Yield and Formation of Emulsion
Authors: Masni Mat Yusoff, Michael H. Gordon, Keshavan Niranjan
Abstract:
The study reports on the effect of aqueous enzymatic extraction (AEE) parameters on the Moringa oleifera (MO) oil yield and the formation of emulsion at the end of the process. A mixture of protease and cellulase enzymes was used at a 3:1 (w/w) ratio. The highest oil yield of 19% (g oil/g sample) was recovered with pH 6, a 1:4 material/moisture ratio, and an incubation temperature, time, and shaking speed of 50 °C, 12.5 hr, and 300 strokes/min, respectively. The use of pH 6 and 8 resulted in grain emulsions, while a solid-intact emulsion was observed at pH 4. With the other parameters fixed, higher oil yield was extracted at a lower material/moisture ratio and a higher shaking speed. A longer incubation time of 24 hr resulted in an oil yield statistically similar (p < 0.05) to that of 12.5 hr, and an incubation temperature of 50 °C resulted in a significantly (p < 0.05) higher oil yield than 60 °C. Overall, each AEE parameter showed significant effects on both the MO oil yields and the emulsions formed. One of the major disadvantages of an AEE process is the formation of emulsions, which require a further de-emulsification step for higher oil recovery. Therefore, critical studies on the effect of each AEE parameter may assist in minimizing the amount of emulsion formed while extracting the highest possible total MO oil yield.
Keywords: enzyme, emulsion, Moringa oleifera, oil yield
Procedia PDF Downloads 431
15055 Synthesis of Polyvinyl Alcohol Encapsulated Ag Nanoparticle Film by Microwave Irradiation for Reduction of P-Nitrophenol
Authors: Supriya, J. K. Basu, S. Sengupta
Abstract:
Silver nanoparticles have attracted a lot of attention because of their unique physical and chemical properties. Free-standing films of silver nanoparticles embedded in polyvinyl alcohol (PVA/Ag) have been prepared by microwave irradiation in a few minutes, with PVA serving as reducing agent, stabilizing agent and support for the silver nanoparticles. UV-Vis spectrometry, scanning electron microscopy (SEM) and transmission electron microscopy (TEM) confirmed the reduction of silver ions to silver nanoparticles in the polymer matrix. The effects of irradiation time, concentration of PVA and concentration of silver precursor on the synthesis of the silver nanoparticles were studied. Particle size decreases with increasing irradiation time, and the concentration of silver nanoparticles increases with increasing concentration of silver precursor. Good dispersion of the silver nanoparticles in the film was confirmed by TEM analysis, with particle sizes found to be in the range of 2-10 nm. The catalytic activity of the prepared silver nanoparticles as a heterogeneous catalyst was studied in the reduction of p-nitrophenol (a water pollutant), with >98% conversion. From the experimental results, it can be concluded that the PVA-encapsulated Ag nanoparticle film shows good efficiency and reusability as a catalyst for the reduction of p-nitrophenol.
Keywords: biopolymer, microwave irradiation, silver nanoparticles, water pollutant
Procedia PDF Downloads 289
15054 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing
Authors: Carolina Gouveia, José Vieira, Pedro Pinho
Abstract:
Cardiopulmonary signal monitoring without contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and the continuous monitoring of vital signs in bedridden patients. It also has applications in the vehicular environment, monitoring the driver in order to avoid accidents in case of cardiac failure. The bio-radar system proposed in this paper can measure vital signs accurately using the Doppler effect, which relates the received signal properties to the change in distance between the radar antennas and the person's chest wall. Since the bio-radar is meant to monitor subjects in real time and over long periods, the patient cannot be kept immobile, and their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, together with its simulation in MATLAB. The algorithm used for breathing-rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to show that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR
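The breathing-rate extraction step, with the DC offset removed before spectral estimation, can be sketched as follows. This is an illustrative sketch, not the paper's algorithm: it applies a naive DFT to a synthetic complex baseband signal, and the sampling rate, breathing frequency and DC offset are invented.

```python
import cmath
import math

def breath_rate_hz(iq, fs):
    """Estimate the dominant breathing frequency from a complex baseband
    Doppler signal: subtract the mean (DC offset removal), then pick the
    strongest positive-frequency DFT bin."""
    n = len(iq)
    mean = sum(iq) / n
    x = [s - mean for s in iq]                   # DC offset removal
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):                   # naive DFT over positive bins
        acc = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        if abs(acc) > best_mag:
            best_k, best_mag = k, abs(acc)
    return best_k * fs / n

fs = 10.0        # sampling rate in Hz, illustrative
f_breath = 0.3   # about 18 breaths per minute, illustrative
# Synthetic baseband: a fixed DC offset plus a small chest-wall tone.
iq = [0.8 + 0.1j + 0.2 * cmath.exp(2j * math.pi * f_breath * t / fs)
      for t in range(200)]
```

Without the mean subtraction, the large DC term would dominate the spectrum; removing it first is the role the motion-detection-based offset estimator plays in the paper.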
Procedia PDF Downloads 141
15053 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment
Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg
Abstract:
Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter choices made early in the construction process. Hence, optimising the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects, and it improves collaboration between the architectural and structural stages. It will be shown that the SDO framework achieves this by generating the structural model from data extracted from the architectural model. At present, the proposed SDO framework is being validated through an online questionnaire distributed among structural engineers in the UK.
Keywords: building information modelling, BIM, genetic algorithm, GA, architecture-engineering-construction, AEC, optimisation, structure, design, population, generation, selection, mutation, crossover, offspring
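A genetic algorithm of the kind the framework employs (population, selection, crossover, mutation, offspring, as in the keywords) can be sketched minimally as below. This is an illustrative sketch under invented assumptions, not the paper's SDO implementation: the genome is a plain bit string and the objective is a toy stand-in for a structural design score derived from BIM data.

```python
import random

def evolve(fitness, genome_len, pop_size=30, generations=60, pmut=0.05, seed=1):
    """Minimal generational GA: tournament selection of size two, one-point
    crossover, per-bit mutation. A real SDO framework would encode design
    variables (spans, sections, column layout) in the genome instead."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)            # tournament of two parents
            return a if fitness(a) >= fitness(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ (rng.random() < pmut) for g in child]  # bit-flip mutation
            offspring.append(child)
        pop = offspring
    return max(pop, key=fitness)

# Toy objective: maximise the number of 1-bits, standing in for a design score.
best = evolve(sum, genome_len=20)
```

Swapping the toy objective for a structural evaluation (e.g. a cost or code-compliance score computed from the generated structural model) is all that separates this skeleton from the semi-automated search the abstract describes.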
Procedia PDF Downloads 242
15052 Liquid-Liquid Extraction of Uranium(VI) from Aqueous Solution Using 1-Hydroxyalkylidene-1,1-Diphosphonic Acids
Authors: M. Bouhoun Ali, A. Y. Badjah Hadj Ahmed, M. Attou, A. Elias, M. A. Didi
Abstract:
The extraction of uranium(VI) from aqueous solutions has been investigated using 1-hydroxyhexadecylidene-1,1-diphosphonic acid (HHDPA) and 1-hydroxydodecylidene-1,1-diphosphonic acid (HDDPA), which were synthesized and characterized by elemental analysis and by FT-IR, ¹H NMR and ³¹P NMR spectroscopy. In this paper, we propose a tentative assignment of the shifts of these two ligands and their specific complexes with uranium(VI). We carried out the extraction of uranium(VI) by HHDPA and HDDPA from [carbon tetrachloride + 2-octanol (v/v: 90%/10%)] solutions. Various factors such as contact time, pH, organic/aqueous phase ratio and extractant concentration were considered. The optimum conditions obtained were: contact time = 20 min, organic/aqueous phase ratio = 1, pH = 3.0 and extractant concentration = 0.3 M. The extraction yields are higher for HHDPA, whose hydrocarbon chain is longer than that of HDDPA. Logarithmic plots of the uranium(VI) distribution ratio vs. pHeq and extractant concentration showed that the ratio of extractant to extracted uranium(VI) (ligand/metal) is 2:1. The formula of the complex of uranium(VI) with HHDPA and HDDPA is UO2(H3L)2 (HHDPA and HDDPA are denoted as H4L). Spectroscopic analysis showed that coordination of uranium(VI) takes place via oxygen atoms.
Keywords: liquid-liquid extraction, uranium(VI), 1-hydroxyalkylidene-1,1-diphosphonic acids, HHDPA, HDDPA, aqueous solution
Procedia PDF Downloads 269
15051 Political Discourse Used in the TV Talk Shows of Pakistani Media
Authors: Hafiz Sajjad Hussain, Asad Razzaq
Abstract:
The study aims to explore the relationship between the speech and discourse used by political workers and their leaders to maintain an authoritative approach and dialogue power. The representation of the relationship between ideology and language is examined through the analysis of discourse and spoken text, following Van Dijk's socio-cognitive model. Media and political leaders are two pillars of a state, and their role is important for development and for their effects on society. Media has become an industry in recent years across the globe; in Pakistan, the private sector in particular has developed greatly in the last decade. Media is the easiest way to communicate with a large community in a short time, and it uses discourse independently. The prime time of the news channels in Pakistan presents political programs on the most prominent story or incident of the day. The program examined here, broadcast by the private channel ARY News on July 6, 2014, covered the top story of the day: Arslan Iftikhar, the son of the former Chief Justice, moved an application to the Election Commission of Pakistan concerning the daughter of the popular political leader and PTI chairman Imran Khan. This move transformed the scenario of the political parties, and the media seized on the issue for discussion. The study also shows that the ideology and meanings presented by the TV channels are not always obvious to viewers.
Keywords: electronic media, political discourse, ideology of media, power, authoritative approach
Procedia PDF Downloads 529
15050 Comparative Assessment of the Thermal Tolerance of Spotted Stemborer, Chilo partellus Swinhoe (Lepidoptera: Crambidae) and Its Larval Parasitoid, Cotesia sesamiae Cameron (Hymenoptera: Braconidae)
Authors: Reyard Mutamiswa, Frank Chidawanyika, Casper Nyamukondiwa
Abstract:
Under stressful thermal environments, insects adjust their behaviour and physiology to maintain key life-history activities and improve survival. For interacting species, whether mutualistic or antagonistic, thermal stress may affect the participants in differing ways, which may in turn affect the outcome of the ecological relationship. In agroecosystems, this may be the fate of relationships between insect pests and their antagonistic parasitoids under acute and chronic thermal variability. Against this background, we investigated the thermal tolerance of different developmental stages of Chilo partellus Swinhoe (Lepidoptera: Crambidae) and its larval parasitoid Cotesia sesamiae Cameron (Hymenoptera: Braconidae) using both dynamic and static protocols. In laboratory experiments, we determined lethal temperatures (upper and lower) using direct plunge protocols in programmable water baths (Systronix Scientific, South Africa); the effects of ramping rate on critical thermal limits following standardized protocols, using insulated double-jacketed chambers ('organ pipes') connected to a programmable water bath (Lauda Eco Gold, Lauda Dr. R. Wobser GmbH and Co. KG, Germany); supercooling points (SCPs) following dynamic protocols, using a Pico logger connected to a programmable water bath; and heat knock-down time (HKDT) and chill-coma recovery time (CCRT) following static protocols in climate chambers (HPP 260, Memmert GmbH + Co. KG, Germany) connected to a camera (HD Covert Network Camera, DS-2CD6412FWD-20, Hikvision Digital Technology Co., Ltd, China). When exposed for two hours to a static temperature, lower lethal temperatures ranged from -9 to 6, -14 to -2, and -1 to 4 °C, while upper lethal temperatures ranged from 37 to 48, 41 to 49, and 36 to 39 °C for C. partellus eggs, larvae and adult C. sesamiae, respectively. Faster heating rates improved critical thermal maxima (CTmax) in C. partellus larvae and in adult C. partellus and C. sesamiae.
Lower cooling rates improved critical thermal minima (CTmin) in C. partellus and C. sesamiae adults while compromising CTmin in C. partellus larvae. The mean SCPs for C. partellus larvae, pupae and adults were -11.82±1.78, -10.43±1.73 and -15.75±2.47 °C respectively, with adults having the lowest SCPs. Heat knock-down time and chill-coma recovery time varied significantly between C. partellus larvae and adults. Larvae had higher HKDT than adults, while the latter recovered significantly faster following chill-coma. Current results suggest developmental-stage differences in C. partellus thermal tolerance (with respect to lethal temperatures and critical thermal limits) and a compromised temperature tolerance of the parasitoid C. sesamiae relative to its host, suggesting potential asynchrony between host and parasitoid population phenology and consequently reduced biocontrol efficacy under global change. These results have broad implications for biological pest management and insect-natural enemy interactions under rapidly changing thermal environments.
Keywords: chill-coma recovery time, climate change, heat knock-down time, lethal temperatures, supercooling point
Procedia PDF Downloads 238
15049 Valuation of Entrepreneurship Education (EE) Curriculum and Self-Employment Generation among Graduates of Tertiary Institutions in Edo State, Nigeria
Authors: Angela Obose Oriazowanlan
Abstract:
Despite the introduction of Entrepreneurship education into the Nigerian university curriculum to prepare graduates for self-employment roles and thereby abate employment challenges, graduate unemployment remains high. The study therefore examined the relevance of the curriculum contents and their delivery mechanism in equipping graduates with appropriate entrepreneurial skills prior to graduation. Four research questions and two hypotheses guided the study. The survey research design was adopted. From an effectively infinite population of graduates spanning a five-year period, a sample of 200 was drawn using the simple random sampling technique. A 45-item structured questionnaire was used for data gathering. The gathered data were analysed using the descriptive statistics of mean and standard deviation, while the formulated hypotheses were tested with the Z-score at the 0.05 level of significance. The findings revealed, among others, that graduates' acquisition of appropriate entrepreneurial skills for self-employment generation is low due to curriculum deficiencies, insufficient time allotment, and the delivery mechanism. It was recommended, among others, that the curriculum be reviewed to improve its relevance and that sufficient time be allotted to enable an adequate teaching and learning process.
Keywords: evaluation of entrepreneurship education (EE) curriculum, self-employment generation, graduates of tertiary institutions, Edo state, Nigeria
Procedia PDF Downloads 99
15048 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EVs), low-energy-consumption systems have become more and more important. One of the key technologies to realize low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several notable features, such as high temporal resolution (up to 1 Mframe/s) and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity; to be more specific, this sensor captures only the pixels that undergo an intensity change. In other words, there is no signal in areas without intensity change, so the sensor is more energy efficient than conventional sensors such as RGB cameras because redundant data are removed. The downside is that the data are difficult to handle because the format is completely different from an RGB image: acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it does not include intensity such as RGB values. Therefore, since existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. To resolve the difficulties caused by these format differences, most prior art builds frame data and feeds it to deep learning models such as Convolutional Neural Networks (CNNs) for object detection and recognition. However, even so, it is difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of the RGB pixel value, polarity information is clearly not rich enough. In this context, we proposed to use the timestamp information as the data representation fed to deep learning.
Concretely, we first build frame data divided by a certain time period, then assign an intensity value according to the timestamp within each frame; for example, a high value is given to a recent signal. We expected this data representation to capture features, especially of moving objects, because the timestamps represent movement direction and speed. Using this proposed method, we built our own dataset with a DVS fixed on a parked car, to develop an application for a surveillance system that can detect persons around the car. We consider the DVS one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in largely static scenes. For comparison, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and of our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
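As an illustrative sketch of such a timestamp-based frame representation (not the authors' implementation; the function name, windowing, and normalization are assumptions), event recency can be encoded as pixel intensity:

```python
def events_to_timestamp_frame(events, width, height, t_start, t_end):
    """Build one frame from DVS events, encoding recency as intensity.

    events: iterable of (x, y, timestamp, polarity) tuples, polarity +1/-1.
    Pixels with no event stay 0.0; pixels with events get a value in [0, 1],
    where 1.0 means the event arrived at the very end of the window.
    """
    frame = [[0.0] * width for _ in range(height)]
    span = float(t_end - t_start)
    for x, y, t, _polarity in events:  # polarity is ignored in this encoding
        if t_start <= t <= t_end:
            value = (t - t_start) / span  # newer events -> higher intensity
            if value > frame[y][x]:       # keep the most recent event per pixel
                frame[y][x] = value
    return frame
```

Such a frame can then be fed to a CNN in place of a polarity image; the gradient of values along a moving object's trail is what carries the direction and speed cue.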
Procedia PDF Downloads 97
15047 3D Codes for Unsteady Interaction Problems of Continuous Mechanics in Euler Variables
Authors: M. Abuziarov
Abstract:
The designed code complex is intended for the numerical simulation of fast dynamic processes of interaction between heterogeneous media capable of significant deformation. The main challenges in solving such problems are associated with the construction of the numerical meshes. Currently, there are two basic approaches. One uses a Lagrangian or Lagrangian-Eulerian grid tied to the boundaries of the media; the second uses a fixed Eulerian mesh whose boundary cells cut the boundaries of the medium, requiring the calculation of these cut volumes. Both approaches require complex grid generators and significant time for preparing the code's data for simulation. In these codes, the problems are solved using two grids: a regular fixed grid and a mobile local Lagrangian-Eulerian grid (the ALE approach) accompanying the contact and free boundaries, the surfaces of shock waves and phase transitions, and other possible features of the solutions, with mutual interpolation of the integrated parameters. For modeling liquids, gases, and deformable solids alike, a Godunov scheme of increased accuracy is used in Lagrangian-Eulerian variables, the same for the Euler equations and for the Euler-Cauchy equations describing the deformation of the solid. The increased accuracy of the scheme is achieved by using a 3D space-time-dependent solution of the discontinuity problem (a 3D space-time-dependent Riemann problem solver). The same solution is used to calculate the interaction at the liquid-solid surface (the fluid-structure interaction problem). The codes do not require complex 3D mesh generators; the user supplies only the surfaces of the objects to be computed, as STL files created by means of engineering graphics, which greatly simplifies task preparation and makes the codes convenient for the designer to use directly at the design stage.
Results of test solutions and of applications related to the generation and propagation of detonation and shock waves and the loading of structures are presented.
Keywords: fluid structure interaction, Riemann's solver, Euler variables, 3D codes
Procedia PDF Downloads 439
15046 Water Ingress into Underground Mine Voids in the Central Rand Goldfields Area, South Africa-Fluid Induced Seismicity
Authors: Artur Cichowicz
Abstract:
The last active mine in the Central Rand Goldfields area (50 km x 15 km) ceased operations in 2008. This resulted in the closure of the pumping stations, which previously maintained the underground water level in the mining voids. As a direct consequence of the water being allowed to flood the mine voids, seismic activity has increased directly beneath the populated area of Johannesburg. Monitoring of seismicity in the area has been ongoing for over five years using a network of 17 strong ground motion sensors. The objective of the project is to improve strategies for mine closure. The evolution of the seismicity pattern was investigated in detail. Special attention was given to seismic source parameters such as magnitude, scalar seismic moment and static stress drop. Most events are located within historical mine boundaries. The seismicity pattern shows a strong relationship between the presence of the mining void and high levels of seismicity; no seismicity migration patterns were observed outside the areas of old mining. Seven years after the pumping stopped, the evolution of the seismicity indicates that the area is not yet in equilibrium. The level of seismicity does not appear to be decreasing over time, since the number of strong events with Mw magnitudes above 2 is still as high as it was when monitoring began over five years ago. The average rate of seismic deformation is 1.6x10^13 Nm/year. Constant seismic deformation was not observed over the last 5 years: the deviation from the average is on the order of 6x10^13 Nm/year, which is significant. This variation of cumulative seismic moment indicates that a constant-deformation-rate model is not suitable. Over the most recent five-year period, the total cumulative seismic moment released in the Central Rand Basin was 9.0x10^14 Nm. This is equivalent to one earthquake of magnitude 3.9, and significantly less than what was experienced during the mining operation.
Characterization of seismicity triggered by a rising water level in the area can be achieved through the estimation of source parameters. Static stress drop heavily influences ground motion amplitude, which plays an important role in risk assessments of potential seismic hazards in inhabited areas. The observed static stress drop in this study varied from 0.05 MPa to 10 MPa. It was found that large static stress drops could be associated with both small and large events. The temporal evolution of the inter-event time provides an understanding of the physical mechanisms of earthquake interaction. Changes in the characteristics of the inter-event time are produced when a stress change is applied to a group of faults in the region. Results from this study indicate that the fluid-induced source has a shorter inter-event time than a random distribution would predict. This behaviour corresponds to clustering, in which events with short recurrence times occur close together in time.
Keywords: inter-event time, fluid induced seismicity, mine closure, spectral parameters of seismic source
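The stated equivalence between the cumulative moment of 9.0x10^14 Nm and a single magnitude-3.9 earthquake can be checked with the standard Hanks-Kanamori moment-magnitude relation; the sketch below is illustrative (the function name is an assumption, the relation itself is standard):

```python
import math

def moment_magnitude(seismic_moment_nm):
    """Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1), M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(seismic_moment_nm) - 9.1)

# The cumulative moment quoted in the abstract maps to Mw ~ 3.9.
print(round(moment_magnitude(9.0e14), 1))
```

Because Mw grows with the logarithm of moment, a tenfold increase in released moment raises the equivalent magnitude by only about 0.67 units.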
Procedia PDF Downloads 285
15045 Impact of Job Crafting on Work Engagement and Well-Being among Indian Working Professionals
Authors: Arjita Jhingran
Abstract:
The pandemic was a turning point for flexible employment. In today's market, employees prefer companies that are flexible and provide the autonomy to change the work environment. Post-pandemic, after working from home for a long time, employees have become accustomed to modifying, re-designing, and re-aligning their work environment, their tasks, and the way they interact with co-workers based on their preferences. In this scenario, the concept of job crafting has come to the forefront, and research on the subject has expanded, particularly during COVID-19. Managers who provide opportunities to craft the job are driving enhanced engagement and well-being. The current study will examine the impact of job crafting on work engagement and psychological well-being among 385 working professionals aged 21-39 years (mean age = 30 years). The study will also draw comparisons between freelancers and full-time employees, as freelancers are considered to have more autonomy over their jobs. A comparison between employees of MNCs and startups will also be studied, as autonomy is a primary motivator in the majority of startups. Moreover, differences based on level of experience will be observed, which will add to the body of knowledge. The data will be collected through the Job Crafting Questionnaire, the Utrecht Work Engagement Scale, and the Psychological Well-Being Scale. To infer the findings, correlation analysis will be used to study the relationships among variables, and a three-way ANOVA will be used to draw comparisons.
Keywords: job crafting, work engagement, well-being, freelancers, start-ups
Procedia PDF Downloads 105
15044 Enhancement of Transaction's Authentication for the Europay, MasterCard, and Visa Contactless Card Payments
Authors: Ossama Al-Maliki
Abstract:
Europay, MasterCard, and Visa (EMV) is one of the most popular payment protocols in the world. The EMV protocol supports Chip and PIN transactions, Chip and Signature transactions, and Contactless transactions. The protocol suffers losses of tens of millions of pounds per year due to fraudulent payments, caused by several reported vulnerable points in the protocols used for such payments that allow skimming, replay, cloning, Mole Point of Sale (POS), relay, and other attacks to be conducted. In this paper, we focus on the EMV contactless specification and propose two solutions that add a localization factor to enhance the payment authentication of such transactions, designed to prevent relay, cloning, and Mole-POS attacks. Our proposed solution is a back-end localization scheme to help the Issuer-Bank compare the location of the genuine cardholder with that of the POS used. Our scheme uses 'something you have', namely the Cardholder Smartphone (CSP), to provide the location of the cardholder at the time of the transaction without impacting the contactless payment time or protocol. The Issuer-Bank obtains the CSP location using tried and tested localization techniques, independently of the cardholder. Neither of our proposed solutions requires infrastructure changes, and both use existing EMV/SP protocol messages to communicate our scheme information.
Keywords: NFC, RFID, contactless card, authentication, location, EMV
Procedia PDF Downloads 242
15043 Analysis of Inventory Control, Lot Size and Reorder Point for Engro Polymers and Chemicals
Authors: Ali Akber Jaffri, Asad Naseem, Javeria Khan
Abstract:
The purpose of this study is to determine the safety stock, maximum inventory level, reorder point, and reorder quantity by rearranging lot sizes for supplier and customer in the MRO (maintenance, repair, and operations) warehouse of Engro Polymers & Chemicals. To achieve this aim, physical analysis and Excel-based calculations were carried out on the customer and supplier data provided by the company. Initially, we rearranged the current lot sizes and units of measure in the SAP software. Because of the change in lot sizes, the new quantities for safety stock, maximum inventory, reorder point, and reorder quantity had to be determined as per the company's demand. The proposed system saves cost by reducing the time of receiving from the vendor and of issuance to the customer, eases material handling in the MRO warehouse, and reduces manual effort. The information requirements identified in this study can be utilized in calculating the Economic Order Quantity.
Keywords: carrying cost, economic order quantity, fast moving, lead time, lot size, MRO, maximum inventory, ordering cost, physical inspection, reorder point
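The inventory quantities named above follow standard textbook formulas; the sketch below shows the classical Wilson EOQ formula and a simple reorder-point rule with hypothetical parameter values, not the company's actual calculation:

```python
import math

def economic_order_quantity(annual_demand, ordering_cost, holding_cost):
    """Classic EOQ: the lot size that minimizes ordering plus carrying cost.

    EOQ = sqrt(2 * D * S / H), with D annual demand (units/year),
    S cost per order, H holding (carrying) cost per unit per year.
    """
    return math.sqrt(2.0 * annual_demand * ordering_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock):
    """Stock level at which a replenishment order should be placed."""
    return daily_demand * lead_time_days + safety_stock

# Hypothetical example: 1000 units/year, $50 per order, $4/unit/year holding.
lot = economic_order_quantity(1000, 50, 4)   # ~158 units per order
rop = reorder_point(20, 5, 30)               # order when stock hits 130 units
```

The maximum inventory level in such a model is typically the reorder point plus the order quantity, less the demand consumed during the lead time.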
Procedia PDF Downloads 239
15042 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality
Authors: Peregrine James Dalziel, Philip Vu Tran
Abstract:
Objective / Introduction: Demand for radiology services from Emergency Departments (EDs) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queueing theory indicates that wide variability in process time, combined with the random nature of request arrival, increases the probability of significant queues. This can lead to delays in the time-to-availability of radiology reports (TTA-RR) and potentially impaired ED patient flow. In addition, the greater "cognitive workload" of greater volume may lead to reduced productivity and increased errors. We sought to quantify the potential ED flow improvements obtainable from increased radiology staffing serving three public hospitals in Melbourne, Australia, and to assess the potential productivity gains, quality improvement, and cost-effectiveness of the increased labor inputs. Methods & Materials: The Western Health Medical Imaging Department moved from single-resident coverage on weekend days (8:30 am-10:30 pm) to include a limited period of two-resident coverage (1 pm-6 pm) on both weekend days. The TTA-RR for weekend CT scans was calculated from the PACS database for the 8-month period symmetrically around the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two 4-month periods. Daily and hourly scan volumes at the time of each CT scan were calculated to assess the impact of varying department workload. To assess any improvement in report quality and errors, a random sample of 200 studies was reviewed to compare the average number of clinically significant over-read addendums to reports between the two periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy.
Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R2=0.29). Increasing daily and hourly volume was associated with increased TTA-RR (1.5 min (p<0.01) and 4.8 min (p<0.01), respectively, per additional scan ordered within each time frame). Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p<0.01), with an additional 23.6-minute improvement when 2 residents were on-site concomitantly (p<0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents the increased decision-making time available to ED physicians and a potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs 7% in the single-resident period, but this difference was not statistically significant (p=0.7). The marginal cost was less than the anticipated economic benefit, conservatively assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. The increased labor utilisation is cost-effective when compared with the potential improved productivity for ED cases requiring CT imaging.
Keywords: workflow, quality, administration, CT, staffing
Procedia PDF Downloads 112
15041 Non-Parametric Changepoint Approximation for Road Devices
Authors: Loïc Warscotte, Jehan Boreux
Abstract:
The scientific literature on changepoint detection is vast. Today, many methods are available to detect abrupt changes or slight drift in a signal, based on CUSUM or EWMA charts, for example. However, these methods rely on strong assumptions, such as stationarity of the underlying stochastic process, or even independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals or on the nature of the changepoint. Despite the accurate description of the mathematical aspects, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection if the characteristics of the process are completely unknown. In this paper, we therefore address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes; both abrupt changes and drift are tested. Finally, the relaxed methodology is tested on real signals coming from an HS-WIM device in Belgium, collected over several months.
Keywords: changepoint, weigh-in-motion, process, non-parametric
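For contrast with the relaxed non-parametric methodology above, the classical CUSUM chart that the abstract cites as a baseline can be sketched in a few lines; this is a generic textbook illustration (the function name, threshold, and drift parameters are assumptions), not the authors' method:

```python
def cusum_detect(signal, target_mean, threshold, drift=0.0):
    """Two one-sided CUSUM charts for upward and downward mean shifts.

    Accumulates deviations from `target_mean` (less an optional `drift`
    allowance) and returns the index of the first sample at which either
    cumulative sum exceeds `threshold`, or None if no change is flagged.
    """
    s_hi = s_lo = 0.0
    for i, x in enumerate(signal):
        s_hi = max(0.0, s_hi + (x - target_mean - drift))  # upward shift
        s_lo = max(0.0, s_lo + (target_mean - x - drift))  # downward shift
        if s_hi > threshold or s_lo > threshold:
            return i
    return None
```

Note the strong assumptions baked in: a known in-control mean and a fixed threshold, exactly the kind of prerequisites the locally-stationary framework discussed in the abstract aims to relax.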
Procedia PDF Downloads 78
15040 Anaerobic Digestion of Coffee Wastewater from a Fast Inoculum Adaptation Stage: Replacement of Complex Substrate
Authors: D. Lepe-Cervantes, E. Leon-Becerril, J. Gomez-Romero, O. Garcia-Depraect, A. Lopez-Lopez
Abstract:
In this study, raw coffee wastewater (CWW) was used as a complex substrate for anaerobic digestion. An inoculum adaptation stage, microbial diversity analysis, and biomethane potential (BMP) tests were performed. A fast inoculum adaptation stage was achieved by gradually replacing vinasse with CWW in an anaerobic sequential batch reactor (AnSBR) operated at mesophilic conditions. Illumina MiSeq sequencing was used to analyze the microbial diversity, while BMP tests using the inoculum adapted to CWW were carried out at different inoculum-to-substrate (I/S) ratios (2:1, 3:1 and 4:1, on a VS basis). Results show that the adaptability percentage increased gradually until it reached the highest theoretical value within a short time of 10 d, with a methane yield of 359.10 NmL CH4/g COD-removed; Methanobacterium beijingense was the most abundant microorganism (75%), and the greatest specific methane production was achieved at an I/S ratio of 4:1, whereas the lowest was obtained at 2:1, with BMP values of 320 NmL CH4/g VS and 151 NmL CH4/g VS, respectively. In conclusion, gradual replacement of the substrate was a feasible method to adapt the inoculum in a short time, even using complex raw substrates, whereas in the BMP tests the specific methane production was proportional to the initial amount of inoculum.
Keywords: anaerobic digestion, biomethane potential test, coffee wastewater, fast inoculum adaptation
Procedia PDF Downloads 381
15039 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-regressive Integrated Moving Average (ARIMA) model was used to simulate these series and forecast future values. For the analysis, annual extreme streamflow data from Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) prior to the development of the ARIMA model. Interestingly, the ARIMA(4,1,1) model developed was found to be most suitable for simulating annual extreme streamflow for the Karkheh River. The model was found to be appropriate to forecast ten years of annual extreme streamflow and to assist decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) packages were used to determine the best model for this series.
Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river
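The first-order differencing used above to achieve stationarity (the "I" in ARIMA), and the sample autocorrelation that guides the choice of model orders, can be illustrated in a small self-contained sketch; the function names and the toy series are assumptions, not the study's data:

```python
def difference(series, d=1):
    """Apply first-order differencing d times to remove trend (the 'I' in ARIMA)."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

def acf(series, lag):
    """Sample autocorrelation of a series at a given lag, in [-1, 1]."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag))
    return cov / var

# A trending toy series becomes a (nearly) stationary one after differencing.
trended = [1, 3, 6, 10]
print(difference(trended))        # successive increments: [2, 3, 4]
```

In practice the differenced series' ACF and PACF plots are inspected for significant lags to choose the AR and MA orders, here (p, q) = (4, 1); dedicated packages (e.g. the study's SAS/SPSS) then estimate the coefficients.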
Procedia PDF Downloads 148
15038 Geovisualization of Human Mobility Patterns in Los Angeles Using Twitter Data
Authors: Linna Li
Abstract:
The capability to move between places is doubtless very important for individuals to maintain good health and social functions. People's activities in space and time have long been a research topic in behavioral and socio-economic studies, particularly those focusing on the highly dynamic urban environment. By analyzing groups of people who share similar activity patterns, many socio-economic and socio-demographic problems and their relationships with individual behavior preferences can be revealed. Los Angeles, known for its large population, ethnic diversity, cultural mixing, and entertainment industry, faces great transportation challenges such as traffic congestion, parking difficulties, and long commutes. Understanding people's travel behavior and movement patterns in this metropolis sheds light on potential solutions to complex problems regarding urban mobility. This project visualizes people's trajectories in the Greater Los Angeles (L.A.) Area over a period of two months using Twitter data. A Python script was used to collect georeferenced tweets within the Greater L.A. Area, including Ventura, San Bernardino, Riverside, Los Angeles, and Orange counties. Information associated with tweets includes text, time, location, and user ID. Information associated with users includes name, number of followers, etc. Both aggregated and individual activity patterns are demonstrated using various geovisualization techniques. Locations of individual Twitter users were aggregated to create a surface of activity hot spots at different time instants using kernel density estimation, which shows the dynamic flow of people's movement throughout the metropolis in a twenty-four-hour cycle. In the 3D geovisualization interface, the z-axis indicates time, covering 24 hours, and the x-y plane shows the geographic space of the city. Any two points on the z-axis can be selected for displaying the activity density surface within a particular time period.
In addition, daily trajectories of Twitter users were created using space-time paths that show the continuous movement of individuals throughout the day. When a personal trajectory is overlaid on ancillary layers, including land use and road networks, in the 3D visualization, the vivid, realistic view of the urban environment boosts the map reader's situational awareness. A comparison of the same individual's paths on different days shows regular weekday patterns for some Twitter users, while for others the daily trajectories are more irregular and sporadic. This research makes contributions in two major areas: geovisualization of spatial footprints to understand travel behavior using a big data approach, and dynamic representation of activity space in the Greater Los Angeles Area. Unlike traditional travel surveys, social media (e.g., Twitter) provides an inexpensive way of collecting data on spatio-temporal footprints. The visualization techniques used in this project are also valuable for analyzing other spatio-temporal data in the exploratory stage, leading to informed decisions about generating and testing hypotheses for further investigation. The next step of this research is to separate users into different groups based on gender/ethnic origin and compare their daily trajectory patterns.
Keywords: geovisualization, human mobility pattern, Los Angeles, social media
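The kernel density estimation used for the activity hot-spot surfaces can be sketched as follows; this is a minimal Gaussian-kernel illustration with hypothetical coordinates, not the project's actual pipeline (which would use projected coordinates and a tuned bandwidth):

```python
import math

def kde_density(points, query, bandwidth):
    """Gaussian kernel density estimate at a query location.

    points: list of (x, y) event locations (e.g. geotagged tweet positions).
    Returns a density value that is high near clusters of points and
    decays smoothly with distance, controlled by `bandwidth`.
    """
    h2 = bandwidth ** 2
    norm = 1.0 / (2.0 * math.pi * h2 * len(points))
    total = 0.0
    for px, py in points:
        d2 = (query[0] - px) ** 2 + (query[1] - py) ** 2
        total += math.exp(-d2 / (2.0 * h2))  # 2D Gaussian kernel
    return norm * total
```

Evaluating this function over a regular grid of query locations, one grid per time slice, yields exactly the kind of time-varying hot-spot surface stacked along the z-axis in the 3D interface described above.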
Procedia PDF Downloads 119