Search results for: streaming analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 418

148 Twitter Sentiment Analysis during the Lockdown in New Zealand

Authors: Smah Almotiri

Abstract:

Sentiment analysis is one of the most common applications of natural language processing (NLP). Using sentiment analysis, the feeling expressed in text can be successfully mined for various events. Twitter is viewed as a reliable data source for sentiment analysis studies, since people used social media on a broad scale to receive and exchange different types of information during the COVID-19 epidemic. Processing such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to examine how sentiment differed in a single geographic region at two different times during the lockdown. 1162 tweets related to the COVID-19 pandemic lockdown were analyzed using the keyword hashtags (lockdown, COVID-19); the first sample of tweets was from March 23, 2020, until April 23, 2020, and the second sample, for the following year, was from March 1, 2021, until April 4, 2021. NLP, a form of artificial intelligence, was used to calculate the sentiment value of each tweet with the AFINN lexicon sentiment analysis method. The findings revealed that sentiment during both periods of the region's lockdown was positive in the samples of this study, which are specific to the geographical area of New Zealand. This research suggests applying machine learning sentiment methods such as Crystal Feel and extending the sample size by using tweets collected over a longer period of time.
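The lexicon-based scoring the study describes can be sketched minimally as follows. The word list and scores below are purely illustrative stand-ins, not entries from the real ~2,500-word AFINN-111 lexicon:

```python
import re

# Illustrative subset of an AFINN-style valence lexicon; the real AFINN-111
# assigns each word an integer score from -5 (negative) to +5 (positive).
LEXICON = {"good": 3, "great": 3, "happy": 3, "safe": 1,
           "bad": -3, "sad": -2, "fear": -2, "lockdown": 0}

def afinn_score(text: str) -> int:
    """Sum the valence scores of all lexicon words found in the tweet."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return sum(LEXICON.get(tok, 0) for tok in tokens)

tweets = ["Feeling safe and happy despite the lockdown",
          "Sad and bad news again today"]
scores = [afinn_score(t) for t in tweets]
print(scores)  # → [4, -5]
```

A tweet's score is simply the sum of the valences of the words it contains, with unknown words scoring zero; in practice the full AFINN word list (or the `afinn` Python package) would supply the lexicon.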

Keywords: sentiment analysis, Twitter analysis, lockdown, COVID-19, AFINN, NodeJS

Procedia PDF Downloads 158
147 Design and Evaluation of Production Performance Dashboard for Achieving Oil and Gas Production Target

Authors: Ivan Ramos Sampe Immanuel, Linung Kresno Adikusumo, Liston Sitanggang

Abstract:

Achieving oil and gas production targets in an upstream oil and gas company is a complex undertaking that requires collaborative engagement from a multidisciplinary team. In addition to conducting exploration activities and executing well intervention programs, an upstream oil and gas enterprise must assess the feasibility of attaining its predetermined production goals. Monitoring production performance is therefore a critical activity for ensuring progress towards the established oil and gas targets, and decisions by the upstream management team are informed by the production performance information they receive. To support this decision-making process, a production performance dashboard offers an integrated and centralized tool. Such a dashboard provides a user-friendly interface for monitoring production performance while preserving access to the underlying granular data. Integrating diverse data sources into a unified production performance dashboard establishes a single source of truth, strengthening the organization's ability to maintain a consolidated and authoritative basis for its business needs. The improved accessibility of the dashboard to business users further demonstrates its impact in facilitating the monitoring of organizational targets.

Keywords: production, performance, dashboard, data analytics

Procedia PDF Downloads 36
146 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. Efficient NoSQL databases are needed to analyze such huge datasets effectively. This research integrates several datasets, which cuts down query processing time and produces predictive visual artifacts, making it possible to analyze post-COVID-19 health and well-being outcomes and to evaluate the effectiveness of government efforts during the pandemic. We recommend applying sharding and indexing technologies to improve query efficiency and scalability as the dataset expands. Distributing the datasets across a sharded database and indexing the individual shards enables effective data retrieval and analysis. The key goal is to analyze the connections between governmental activities, poverty levels, and post-pandemic well-being. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address global health problems.
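The sharding-plus-indexing setup described above might look like the following in mongosh. This is a configuration sketch only; the database, collection, and field names (`covid`, `mobility`, `region`, `date`) are hypothetical, not taken from the paper:

```javascript
// Enable sharding for a hypothetical "covid" database
sh.enableSharding("covid")

// Index the fields queried most often within each shard's data
db.mobility.createIndex({ region: 1, date: 1 })

// Distribute documents across shards by a hashed shard key, so reads
// and writes spread evenly as the dataset grows
sh.shardCollection("covid.mobility", { region: "hashed" })
```

Queries that filter on the shard key are then routed to a single shard, while the compound index accelerates date-range scans within each shard.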

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 43
145 Powering Connections: Synergizing Sales and Marketing for Electronics Engineering with Web Development

Authors: Muhammad Awais Kiani, Abdul Basit Kiani, Maryam Kiani

Abstract:

"Synergizing Sales and Marketing for Electronics Engineering with Web Development" explores the dynamic relationship between sales, marketing, and web development within the electronics engineering industry. This study matters because digital platforms have the power to connect businesses with customers, which increases brand visibility and drives sales. It highlights the need for collaboration between sales and marketing teams, as well as the integration of web development strategies to create seamless user experiences and effective lead generation. It also emphasizes the role of data analytics and customer insights in optimizing sales and marketing efforts in the ever-evolving landscape of electronics engineering. Sales and marketing play a crucial role in driving business growth, and in today's digital landscape, web development has become an integral part of these strategies. Web development enables businesses to create visually appealing and user-friendly websites that effectively showcase their products or services. It allows for the integration of e-commerce functionalities, enabling seamless online transactions. Furthermore, web development helps businesses optimize their online presence through search engine optimization (SEO) techniques, social media integration, and content management systems. This abstract highlights the symbiotic relationship between sales and marketing in the electronics industry and web development, emphasizing the importance of a strong online presence in achieving business success.

Keywords: electronics industry, web development, sales, marketing

Procedia PDF Downloads 84
144 Enhancing Audience Engagement: Informal Music Learning During Classical Concerts

Authors: Linda Dusman, Linda Baker

Abstract:

The Bearman Study of Audience Engagement examined the potential for real-time music education during online symphony orchestra concerts. It builds on the promising results of a preliminary study of STEAM (Science, Technology, Engineering, Arts, and Mathematics) education during live concerts, funded by the National Science Foundation with the Baltimore Symphony Orchestra. For the Bearman Study, audience groups were recruited to attend two previously recorded concerts of the National Orchestral Institute (NOI) in 2020 or the Utah Symphony in 2021. The study used a smartphone app called EnCue to present real-time program notes about the music being performed. Short notes along with visual information (photos and score fragments) were designed to provide historical, cultural, biographical, and theoretical information at specific moments in the music where that information would be most pertinent, generally spaced 2-3 minutes apart to avoid distraction. The music performed included Dvorak's Symphony No. 8 and Mahler's Symphony No. 5 at NOI, and Mendelssohn's Scottish Symphony and Richard Strauss's Metamorphosen with the Utah Symphony, all standard repertoire for symphony orchestras. During each phase of the study (2020 and 2021), participants were randomly assigned to use the app to view program notes during either the first or the second concert. A total of 139 participants (67 in 2020 and 72 in 2021) completed three online questionnaires: one before attending the first concert, one immediately after the first concert, and the third immediately after the second concert. Questionnaires assessed demographic background, expertise in music, engagement during the concert, learning of content about the composers and the symphonies, and interest in future use of the app.
In both phases of the study, participants demonstrated that they learned content presented on the app, evidenced by the fact that their multiple-choice test scores were significantly higher when they used the app than when they did not. In addition, most participants indicated that using the app enriched their experience of the concert. Overall, they were very positive about their experience using the app for real-time learning and they expressed interest in using it in the future at both live and streaming concerts. Results confirmed that informal real-time learning during concerts is possible and can generate enhanced engagement and interest in classical music.

Keywords: audience engagement, informal education, music technology, real-time learning

Procedia PDF Downloads 179
143 Leveraging Artificial Intelligence to Analyze the Interplay between Social Vulnerability Index and Mobility Dynamics in Pandemics

Authors: Joshua Harrell, Gideon Osei Bonsu, Susan Garza, Clarence Conner, Da’Neisha Harris, Emma Bukoswki, Zohreh Safari

Abstract:

The Social Vulnerability Index (SVI) stands as a pivotal tool for gauging community resilience amidst diverse stressors, including pandemics like COVID-19. This paper synthesizes recent research and underscores the significance of SVI in elucidating the differential impacts of crises on communities. Drawing on studies by Fox et al. (2023) and Mah et al. (2023), we delve into the application of SVI alongside emerging data sources to uncover nuanced insights into community vulnerability. Specifically, we explore the utilization of SVI in conjunction with mobility data from platforms like SafeGraph to probe the intricate relationship between social vulnerability and mobility dynamics during the COVID-19 pandemic. By leveraging 16 community variables derived from the American Community Survey, including socioeconomic status and demographic characteristics, SVI offers actionable intelligence for guiding targeted interventions and resource allocation. Building upon recent advancements, this paper contributes to the discourse on harnessing AI techniques to mitigate health disparities and fortify public health resilience in the face of pandemics and other crises.
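The percentile-ranking idea behind the SVI can be sketched as follows. This is a simplified illustration with two variables and hypothetical tract values; the real CDC SVI percentile-ranks each of the 16 ACS variables across tracts, sums the ranks (within four themes), and re-ranks the sums:

```python
def percentile_ranks(values):
    """Rank values onto [0, 1]: 0 for the lowest, 1 for the highest."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for pos, i in enumerate(order):
        ranks[i] = pos / (len(values) - 1)
    return ranks

# rows = census tracts, columns = vulnerability variables
# (e.g. share below poverty line, share with no vehicle) -- hypothetical data
tracts = {"tract_A": [0.30, 0.45], "tract_B": [0.10, 0.05], "tract_C": [0.22, 0.40]}
names = list(tracts)
n_vars = 2

# Percentile-rank each variable across tracts, sum the ranks per tract,
# then percentile-rank the sums to obtain the overall index
per_var = [percentile_ranks([tracts[t][j] for t in names]) for j in range(n_vars)]
summed = [sum(per_var[j][i] for j in range(n_vars)) for i in range(len(names))]
svi = dict(zip(names, percentile_ranks(summed)))
print(svi)  # → {'tract_A': 1.0, 'tract_B': 0.0, 'tract_C': 0.5}
```

Tracts with the highest overall rank are the most vulnerable, which is what makes the index usable for targeting interventions.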

Keywords: social vulnerability index, mobility dynamics, data analytics, health equity, pandemic preparedness, targeted interventions, data integration

Procedia PDF Downloads 38
142 Digital Innovation and Business Transformation

Authors: Bisola Stella Sonde

Abstract:

Digital innovation has emerged as a pivotal driver of business transformation in the contemporary landscape. This case study research explores the dynamic interplay between digital innovation and the profound metamorphosis of businesses across industries. It delves into the multifaceted dimensions of digital innovation, elucidating its impact on organizational structures, customer experiences, and operational paradigms. The study investigates real-world instances of businesses harnessing digital technologies to enhance their competitiveness, agility, and sustainability. It examines the strategic adoption of digital platforms, data analytics, artificial intelligence, and emerging technologies as catalysts for transformative change. The cases encompass a diverse spectrum of industries, spanning from traditional enterprises to disruptive startups, offering insights into the universal relevance of digital innovation. Moreover, the research scrutinizes the challenges and opportunities posed by the digital era, shedding light on the intricacies of managing cultural shifts, data privacy, and cybersecurity concerns in the pursuit of innovation. It unveils the strategies that organizations employ to adapt, thrive, and lead in the era of digital disruption. In summary, this case study research underscores the imperative of embracing digital innovation as a cornerstone of business transformation, offering a comprehensive exploration of the contemporary digital landscape and valuable lessons for organizations striving to navigate the ever-evolving terrain of the digital age.

Keywords: business transformation, digital innovation, emerging technologies, organizational structures

Procedia PDF Downloads 33
141 Particle Deflection in a PDMS Microchannel Caused by a Plane Travelling Surface Acoustic Wave

Authors: Florian Keipert, Hagen Schmitd

Abstract:

The size-selective separation of different species in a microfluidic system is a topical task in biological and medical research. Previous works dealt with the utilisation of the acoustic radiation force (ARF) caused by a plane travelling surface acoustic wave (tSAW). In the literature, the ARF is described by a dimensionless parameter κ, which depends on the wavelength and the particle diameter. To our knowledge, research has been done for values 0.2 < κ < 5.8, showing that the ARF dominates the acoustic streaming force (ASF) for κ > 1.2. As a consequence, particle separation is limited by κ. In addition, the dependence on the electrical power level was examined, but only for κ > 1, showing an increased particle deflection for higher electrical power levels. Nevertheless, a detailed study of the ASF and ARF, especially for κ < 1, is still missing. In our setup, we used a tSAW with a wavelength λ = 90 µm and 3 µm PS particles, corresponding to κ = 0.3. With this setup, the influence of the applied electrical power level on the particle deflection in a polydimethylsiloxane (PDMS) microchannel was investigated. Our results show an increased particle deflection for an increased electrical power level, which coincides with the reported results for κ > 1. Therefore, in contrast to the literature, particle separation is also possible for lower κ values, and the experimental setup can be simplified by matching the electrical power level to the specific particle size. Furthermore, this raises the question of whether the particle deflection is caused only by the ARF, as assumed so far, by the ASF, or by the sum of both forces. To investigate this, 0% - 24% saline solutions were used, changing the mismatch between the compressibility of the PS particles and the working fluid. This makes it possible to change the relative strength of the ARF and ASF, and consequently the particle deflection.
We observed a decrease in particle deflection with increasing NaCl content up to a 12% saline solution and subsequently an increase in particle deflection. Our observation can be explained by the acoustic contrast factor Φ, which depends on the compressibility mismatch. The compressibility of the solution is altered by the NaCl content, and the range of 0% - 24% saline solutions covers the PS particle compressibility. Hence the particle deflection reaches a minimum when the compressibilities of the PS particles and the saline solution match. This minimum value can be taken as the particle deflection caused only by the ASF. Knowing the particle deflection due to the ASF, the particle deflection caused by the ARF can be calculated, and thus finally the relation between both forces. Concluding, particle deflection, and therefore size-selective particle separation, generated by a tSAW can be achieved for values κ < 1, simplifying current setups by adjusting the electrical power level. Beyond that, we studied for the first time the relative strength of the ARF and ASF to characterise the particle deflection in a microchannel.
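For reference, the dimensionless parameter κ and an acoustic contrast factor Φ of the kind invoked above are commonly written as follows in the acoustofluidics literature. These are the standard textbook forms for a compressible sphere in an inviscid fluid, not necessarily the authors' exact definitions; note that κ names the size parameter here, while κ_p and κ_f denote compressibilities:

```latex
% dimensionless size parameter: particle diameter d versus SAW wavelength
\kappa = \frac{\pi d}{\lambda}

% acoustic contrast factor, with density ratio \tilde{\rho} = \rho_p/\rho_f
% and compressibility ratio \tilde{\kappa} = \kappa_p/\kappa_f
\Phi = \frac{1}{3}\left[ \frac{5\tilde{\rho} - 2}{2\tilde{\rho} + 1} - \tilde{\kappa} \right]
```

In this form, Φ shrinks as the compressibility of the saline solution approaches that of the PS particles, consistent with the deflection minimum reported near the 12% concentration.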

Keywords: ARF, ASF, particle separation, saline solution, tSAW

Procedia PDF Downloads 236
140 Building a Transformative Continuing Professional Development Experience for Educators through a Principle-Based, Technological-Driven Knowledge Building Approach: A Case Study of a Professional Learning Team in Secondary Education

Authors: Melvin Chan, Chew Lee Teo

Abstract:

There has been a growing emphasis on elevating teachers' proficiency and competencies through continuing professional development (CPD) opportunities. In this era of a volatile, uncertain, complex, and ambiguous (VUCA) world, teachers are expected to be collaborative designers, critical thinkers, and creative builders. However, many CPD structures still revolve around a transmission model, which stands in contradiction to the cultivation of future-ready teachers for an innovative world of emerging technologies. This article puts forward a framing of CPD through a principle-based, technology-driven knowledge building approach grounded in the essence of andragogy and progressive learning theories, where growth is best exemplified through authentic immersion in a social, community-based, experiential setting. Putting this Knowledge Building Professional Development Model (KBPDM) into operation via a professional learning team (PLT) situated in a secondary school in Singapore, research findings reveal that the intervention has led to a fundamental change in the learning paradigm of the teachers, equipping and empowering them in their pedagogical design and practices for a 21st-century classroom experience. The article concludes with the possibility of leveraging learning analytics to deepen CPD experiences for educators.

Keywords: continuing professional development, knowledge building, learning paradigm, principle-based

Procedia PDF Downloads 109
139 Solomon Islands Decentralization Efforts

Authors: Samson Viulu, Hugo Hebala, Duddley Kopu

Abstract:

The Constituency Development Fund (CDF) is a controversial fund that has existed in the Solomon Islands since the early 90s. It is largely controversial because it is directly handled by members of parliament (MPs) of the Solomon Islands legislative chamber, and it is commonly described as a political slush fund because only the voters of MPs benefit from it, in order to retain their loyalty. The CDF was established by a legislative act in 2013; however, the act has no subsidiary regulations, and governance of the fund is therefore very weak. The CDF is intended to establish development projects in the rural areas of the Solomon Islands to spur economic growth. Although almost USD 500M was spent through the CDF in the last decade, the economy of the Solomon Islands has not grown; rather, it has regressed. The Solomon Islands has now formulated its first home-grown policy aimed at guiding the overall development of the fifty constituencies, improving delivery mechanisms of the CDF, and strengthening its governance through the regulation of the CDF Act 2013. The Solomon Islands Constituency Development Policy is the first for the country since gaining independence in 1978 and places strong emphasis on a cross-sectoral approach through effective partnerships and collaborations and on decentralizing government services to the isolated rural areas of the country. The new policy is driving the efforts of the political government to decentralize government services to isolated rural communities in order to encourage the participation of rural dwellers in economic activities. The decentralization will see the establishment of constituency offices within all constituencies and the piloting of townships in constituencies that have met the statutory requirements of the state. It also encourages constituencies to become development agents of the national government rather than mere political boundaries.
The decentralization will go hand in hand with the establishment of the Solomon Islands Special Economic Zones (SEZ), where investors will be given special privileges and exemptions from government taxes and permits to attract tangible development to rural constituencies. The design and formulation of the new development policy are supported by the UNDP office in the Solomon Islands. The new policy promotes a reorientation of resource allocation toward the productive and resource sectors, makes access to finance easier for entrepreneurs, and encourages growth in rural entrepreneurship in agriculture, fisheries, downstream processing, and tourism across the Solomon Islands. This new policy approach will greatly assist the country in graduating from least developed country status in a few years' time.

Keywords: decentralization, constituency development fund, Solomon Islands constituency development policy, partnership, entrepreneurship

Procedia PDF Downloads 50
138 Duality of Leagility and Governance: A New Normal Demand Network Management Paradigm under Pandemic

Authors: Jacky Hau

Abstract:

The prevalence of emerging technologies disrupts various industries as well as consumer behavior. Data collection is now at our fingertips through Internet-of-Things (IoT) devices. Big data analytics (BDA) becomes possible and allows real-time demand network management (DNM) through a leagile supply chain. To further enhance its resilience and predictability, governance is examined as a means to promote supply chain transparency and trust in an efficient manner. Leagility combines lean thinking and agile techniques in supply chain management. It aims at reducing costs and waste, as well as maintaining responsiveness to volatile consumer demand, by adjusting the decoupling point where the product flow changes from push to pull. Leagility is only successful when a collaborative planning, forecasting, and replenishment (CPFR) process, or something similar, is in place throughout the supply chain's business entities. Governance and procurement of the supply chain, however, are crucial and challenging for the execution of CPFR, as every entity has to walk the talk generously for the sake of the overall benefits of supply chain performance, not to mention the complexity of exercising the policies both within and across the various supply chain business entities on account of organizational behavior and mutual trust. Empirical survey results showed that the effective timespan of demand forecasting has drastically shortened, from a planning horizon of months to one of weeks; thus agility shall come first, preferably followed by a lean approach in a timely manner.

Keywords: governance, leagility, procure-to-pay, source-to-contract

Procedia PDF Downloads 90
137 Comprehensive Study of Data Science

Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly

Abstract:

Today's generation is totally dependent on technology, which uses data as its fuel. The present study covers innovations and developments in data science and gives an idea of how to use the available data efficiently. This study will help readers understand the core concepts of data science. The concept of artificial intelligence was introduced by Alan Turing, whose main principle was to create an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of its users. Data science comprises business understanding, analyzing data, ethical concerns, understanding programming languages, various fields and sources of data, skills, etc. The usage of data science has evolved over the years. In this review article, we cover one part of data science, namely machine learning. Machine learning uses data science for its work: machines learn through experience, which helps them do any work more efficiently. This article includes a comparative illustration of human and machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in the lives of human beings. Since the advent of data science, we have seen its benefits: how it leads to a better understanding of people and how it caters to individual needs. It has improved business strategies, the services businesses provide, forecasting, the ability to attain sustainable development, etc. This study also focuses on a better understanding of data science, which will help us to create a better world.

Keywords: data science, machine learning, data analytics, artificial intelligence

Procedia PDF Downloads 50
136 From Ride-Hailing App to Diversified and Sustainable Platform Business Model

Authors: Ridwan Dewayanto Rusli

Abstract:

We show how prisoner's dilemma-type competition problems can be mitigated through rapid platform diversification and ecosystem expansion. We analyze a ride-hailing company in Southeast Asia, Gojek, whose network grew to more than 170 million users comprising consumers, partner drivers, merchants, and complementors within a few years and has already achieved higher contribution margins than ride-hailing peers Uber and Lyft. Its ecosystem integrates ride-hailing, food delivery and logistics, merchant solutions, e-commerce, marketplace and advertising, payments, and fintech offerings. The company continues growing its network of complementors and App developers, expanding content and gaining critical mass in consumer data analytics and advertising. We compare the company's growth and diversification trajectory with those of its main international rivals and peers. The company's rapid growth and future potential are analyzed using Cusumano's (2012) Staying Power and Six Principles, Hax and Wilde's (2003) and Hax's (2010) The Delta Model as well as Santos' (2016) home-market advantages frameworks. The recently announced multi-billion-dollar merger with one of Southeast Asia's largest e-commerce majors lends additional support to the above arguments.

Keywords: ride-hailing, prisoner's dilemma, platform and ecosystem strategy, digital applications, diversification, home market advantages, e-commerce

Procedia PDF Downloads 73
135 Seasonal Variations, Environmental Parameters, and Standing Crop Assessment of Benthic Foraminifera in Western Bahrain, Arabian Gulf

Authors: Muhammad Arslan, Michael A. Kaminski, Bassam S. Tawabini, Fabrizio Frontalini

Abstract:

We conducted a survey of living benthic foraminifera at a relatively unpolluted site in Bahrain in the Arabian Gulf, with the aim of determining the seasonal variability of their populations as well as the environmental parameters that affect their distribution. The maximum standing crop was observed during winter, with the highest population of rotaliids, followed by a peak in miliolids. The high winter population is attributed to an increasing number of juveniles observed along the depth transect. A strong correlation between sediment grain size and the foraminiferal population indicates that juveniles were most abundant on coarser sandy substrate and less abundant on fine substrate. In spring, the total living population decreased, and the lowest values were observed in the summer. The population started to increase again in the autumn, with the highest juvenile/adult ratios. Moreover, results on relative abundance and species consistency show that Ammonia is consistently found from the shallowest to the deepest station, whereas miliolids start appearing at the deeper stations. The average numbers of Peneroplis and Elphidium also increase along the depth transect. Environmental characterization reveals that although the site is subjected to eutrophication caused by nitrates and sulfates, pollution caused by hydrocarbons and heavy metals is not significant. The assessment of 63 heavy metals showed that none of the metals had concentrations exceeding internationally accepted norms [the Effect Range-Low level], with the exception of strontium. The lack of a significant environmental effect of heavy metals is confirmed by a Foraminiferal Deformities Index value of less than 2%. Likewise, no hydrocarbon contamination was detected in the water or sediment samples. Lastly, observations of cytoplasmic streaming and pseudopodial activity in Petri dishes suggest that the foraminiferal population is not stressed.
We conclude that the site in Bahrain is not yet adversely affected by human development, and therefore can provide baseline information for future comparison and assessment of foraminiferal assemblages in contaminated zones of the Arabian Gulf.

Keywords: Arabian Gulf, benthic foraminifera, standing crop, Western Bahrain

Procedia PDF Downloads 621
134 Mining User-Generated Contents to Detect Service Failures with Topic Model

Authors: Kyung Bae Park, Sung Ho Ha

Abstract:

Online user-generated contents (UGC) significantly change the way customers behave (e.g., shop, travel), and a pressing need to handle the overwhelming amount of varied UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective at leveraging textual information to detect the problems or issues from which a given business suffers. In this paper, we employ text mining with Latent Dirichlet Allocation (LDA) on a popular online review site dedicated to user complaints. We find that LDA efficiently detects customer complaints, and that further inspection with visualization techniques is effective for categorizing the problems or issues. As such, management can identify the issues at stake and prioritize them in a timely manner given a limited amount of resources. The findings provide managerial insights into how analytics on social media can help maintain and improve reputation management. Our interdisciplinary approach also yields several insights from applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in R from beginning (data collection) to end (LDA analysis), since such instruction is still largely undocumented. In this regard, it will help lower the barrier for interdisciplinary researchers to conduct related research.
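The paper documents an LDA workflow in R; as a language-agnostic illustration of the underlying technique, a minimal collapsed Gibbs sampler for LDA can be sketched as follows (the toy complaint corpus and hyperparameters are made up, and this is not the paper's implementation):

```python
import random
from collections import defaultdict

def lda_gibbs(docs, K=2, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA; returns per-topic word counts."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})
    ndk = [[0] * K for _ in docs]               # doc-topic counts
    nkw = [defaultdict(int) for _ in range(K)]  # topic-word counts
    nk = [0] * K                                # words per topic
    z = []                                      # z[d][i]: topic of word i in doc d
    for d, doc in enumerate(docs):              # random initial assignments
        zd = []
        for w in doc:
            k = rng.randrange(K)
            zd.append(k); ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                     # remove current assignment
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # full conditional: p(z=k | rest) ∝ (ndk+α)·(nkw+β)/(nk+Vβ)
                probs = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                         for t in range(K)]
                r = rng.random() * sum(probs)   # sample a new topic
                for k, p in enumerate(probs):
                    r -= p
                    if r <= 0:
                        break
                z[d][i] = k; ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return nkw

# Toy complaint corpus: two latent themes (delivery problems, support problems)
docs = [["late", "delivery", "damaged"], ["late", "delivery", "late"],
        ["rude", "support", "refund"], ["support", "refund", "refund"]]
topics = lda_gibbs(docs)
print([max(t, key=t.get) for t in topics])  # most frequent word per topic
```

Inspecting each topic's highest-count words is the kind of "further inspection" the abstract refers to; visualization tools then help label the topics as concrete complaint categories.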

Keywords: latent Dirichlet allocation, R program, text mining, topic model, user generated contents, visualization

Procedia PDF Downloads 164
133 Programmable Microfluidic Device Based on Stimuli Responsive Hydrogels

Authors: Martin Elstner

Abstract:

Processing information by handling chemicals is a ubiquitous phenomenon in nature. Technical implementations of chemical information processing suffer from low integration densities compared to electronic devices. Stimuli-responsive hydrogels are promising candidates for materials with information processing capabilities. These hydrogels are sensitive toward chemical stimuli like metal ions or amino acids. The binding of an analyte molecule induces conformational changes inside the polymer network, and subsequently the water content and volume of the hydrogel vary. This volume change can control material flows, and concurrently information flows, in microfluidic devices. The combination of this technology with powerful chemical logic gates yields a platform for highly integrated chemical circuits. The manufacturing process of such devices is very challenging, and rapid prototyping is a key technology used in the study. 3D printing allows generating well-defined three-dimensional structures of high complexity in a single, fast process step. The printed thermoplastic master is molded into PDMS, and the master is then removed by dissolution in an organic solvent. A variety of hydrogel materials is prepared by dispenser printing of pre-polymer solutions. By varying the functional groups or cross-linking units, the functionality of the whole circuit can be programmed. Finally, applications in the field of bio-molecular analytics were demonstrated with an autonomously operating microfluidic chip.

Keywords: bioanalytics, hydrogels, information processing, microvalve

Procedia PDF Downloads 284
132 An Evaluation of Existing Models to Smart Cities Development Around the World

Authors: Aqsa Mehmood, Muhammad Ali Tahir, Hafiz Syed Hamid Arshad, Salman Atif, Ejaz Hussain, Gavin McArdle, Michela Bertolotto

Abstract:

Smart cities have evolved dramatically in recent years. As urbanization increases, the demand for big data analytics and digital technology-based solutions for cities has also increased. Many cities around the world now plan to focus on smart cities. To obtain a systematic overview of smart city models, we carried out a bibliometric analysis in the context of seven regions of the world to understand the main dimensions that characterize smart cities. This paper analyses articles published between 2017 and 2021 that were captured from Web of Science and Scopus. Specifically, we investigated publication trends to highlight the research gaps and current developments in smart cities research. Our survey provides helpful insights into the geographical distribution of smart city publications with respect to regions of the world and explores the current key topics relevant to smart cities and the co-occurrences of keywords used in these publications. A systematic literature review and keyword analysis were performed. The results focus on identifying future directions in smart city development, including smart citizens, ISO standards, the Open Geospatial Consortium and the sustainability factor of smart cities. This article will assist researchers and urban planners in understanding the latest trends in research and highlight the aspects which need further attention.
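The keyword co-occurrence counting underlying a VOSviewer-style map can be sketched in a few lines of Python (the paper keyword lists below are invented for illustration):

```python
from itertools import combinations
from collections import Counter

# Hypothetical author-keyword lists from a set of smart-city papers.
papers = [
    ["smart city", "iot", "big data"],
    ["smart city", "sustainability", "big data"],
    ["smart city", "iot", "governance"],
]

# Count how often each pair of keywords appears in the same paper;
# these counts are the edge weights of a co-occurrence network.
cooc = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        cooc[(a, b)] += 1

print(cooc.most_common(3))
```

On real bibliographic exports, the same counts (thresholded by a minimum occurrence) feed directly into clustering and mapping tools.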

Keywords: smart cities, sustainability, regions, urban development, VOS viewer, research trends

Procedia PDF Downloads 76
131 Application of Lattice Boltzmann Method to Different Boundary Conditions in a Two Dimensional Enclosure

Authors: Jean Yves Trepanier, Sami Ammar, Sagnik Banik

Abstract:

The Lattice Boltzmann Method is advantageous for simulating complex boundary conditions and solving for fluid flow parameters by streaming and collision processes. This paper includes the study of three different test cases in a confined domain using the Lattice Boltzmann model. 1. An SRT (Single Relaxation Time) approach in the Lattice Boltzmann model is used to simulate lid-driven cavity flow for different Reynolds numbers (100, 400 and 1000) with a domain aspect ratio of 1, i.e., a square cavity. A moment-based boundary condition is used for more accurate results. 2. A thermal lattice BGK (Bhatnagar-Gross-Krook) model is developed for Rayleigh-Benard convection for two test cases, horizontal and vertical temperature difference, considered separately for a Boussinesq incompressible fluid. The Rayleigh number is varied for both test cases (10^3 ≤ Ra ≤ 10^6), keeping the Prandtl number at 0.71. A stability criterion with a precise forcing scheme is used for a greater level of accuracy. 3. The phase change problem governed by the heat-conduction equation is studied using the enthalpy-based Lattice Boltzmann model with a single iteration for each time step, thus reducing the computational time. A double distribution function approach with a D2Q9 (density) model and a D2Q5 (temperature) model is used for two test cases: conduction-dominated melting and convection-dominated melting. The solidification process is also simulated using the enthalpy-based method with a single distribution function using the D2Q5 model to provide a better understanding of the heat transport phenomenon. The domain for the test cases has an aspect ratio of 2, with some exceptions for a square cavity. An approximate velocity scale is chosen to ensure that the simulations are within the incompressible regime. Different parameters like velocities, temperature, Nusselt number, etc. are calculated for a comparative study with the existing literature.
The simulated results demonstrate excellent agreement with existing benchmark solutions within an error limit of ±0.05, which implies the viability of this method for complex fluid flow problems.
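The SRT collide-and-stream cycle at the core of all three test cases can be sketched as a minimal D2Q9 BGK loop in Python (periodic boundaries only; the moment-based walls, thermal coupling and grid size of the study are omitted, and the relaxation time is illustrative):

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

nx = ny = 16
tau = 0.8  # SRT relaxation time

def equilibrium(rho, ux, uy):
    """Second-order Maxwell-Boltzmann equilibrium for D2Q9."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

# Initialize at rest with uniform density.
rho = np.ones((nx, ny))
ux = uy = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for _ in range(100):
    # Moments: density and momentum from the distributions.
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision: relax toward local equilibrium.
    f += (equilibrium(rho, ux, uy) - f) / tau
    # Streaming with periodic wrap (a real cavity needs wall bounce-back).
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
```

Both the collision and the streaming step conserve total mass exactly, which is a useful first sanity check on any LBM implementation.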

Keywords: BGK, Nusselt, Prandtl, Rayleigh, SRT

Procedia PDF Downloads 105
130 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink

Authors: Sanjay Rathee, Arti Kashyap

Abstract:

Extraction of useful information from large datasets is one of the most important research problems. Association rule mining is one of the best methods for this purpose. Finding possible associations between items in large transaction-based datasets (finding frequent patterns) is the most important part of association rule mining. Many algorithms exist to find frequent patterns, but the Apriori algorithm remains a preferred choice due to its ease of implementation and natural tendency to be parallelized. Many single-machine Apriori variants exist, but the massive amount of data available these days is beyond the capacity of a single machine. Therefore, to meet the demands of this ever-growing data, there is a need for an Apriori algorithm based on multiple machines. For these types of distributed applications, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks with the MapReduce approach for distributed storage and distributed processing of huge datasets using clusters built from commodity hardware. However, the heavy disk I/O at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, namely Spark and Flink, have attracted a lot of attention because of their built-in support for distributed computation. Earlier, we proposed a reduced Apriori algorithm on the Spark platform which outperforms parallel Apriori, first because of the use of Spark and second because of the improvement we proposed to standard Apriori. This work is therefore a natural sequel and aims at implementing, testing and benchmarking Apriori, Reduced-Apriori and our new ReducedAll-Apriori (RA-Apriori) algorithm on Apache Flink, and comparing them with the Spark implementation.
Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks of MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelining-based structure allows the next iteration to start as soon as partial results of the earlier iteration are available; there is no need to wait for all reducers' results before starting the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency and scalability of the Apriori and RA-Apriori algorithms on Flink.
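The level-wise candidate generation that makes Apriori so amenable to parallelization can be sketched on a single machine as follows (transactions and support threshold are illustrative; the paper's contribution is distributing these passes over Flink, which is not shown here):

```python
from itertools import combinations  # not used below, kept out; join is via set union

def apriori(transactions, min_support):
    """Plain single-machine Apriori: grow frequent k-itemsets level by level."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)
    support = lambda s: sum(1 for t in transactions if s <= t) / n
    # Level 1: frequent single items.
    items = {i for t in transactions for i in t}
    level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    frequent = list(level)
    k = 2
    while level:
        # Candidate generation: join frequent (k-1)-itemsets into k-itemsets.
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        # Support counting: the pass that dominates disk/network cost at scale.
        level = [s for s in candidates if support(s) >= min_support]
        frequent += level
        k += 1
    return frequent

txns = [{"bread", "milk"}, {"bread", "beer"}, {"bread", "milk", "beer"}, {"milk"}]
freq = apriori(txns, min_support=0.5)
```

In a MapReduce setting, the support-counting line becomes a map (emit candidate occurrences) followed by a reduce (sum counts), and Flink's pipelining lets level k+1 begin while level k counts are still arriving.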

Keywords: Apriori, Apache Flink, MapReduce, Spark, Hadoop, R-Apriori, frequent itemset mining

Procedia PDF Downloads 261
129 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers

Authors: Pietro D'Ambrosio, Roberta D'Ambrosio

Abstract:

The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and re-functionalization of the structures. Such interventions have complexities linked to the need to take into account the urban and social context in which the structure is inserted, as well as its intrinsic characteristics, such as historical and artistic value. To these, of course, we must add the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole reuse project. Particular difficulties are encountered during the pre-assessment of costs, since it is often impossible to perform analytical surveys and structural tests, both because of structural conditions and because of obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonably reliable economic evaluations of the interventions to be carried out. In addition, the specific features of the approach used, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to indirectly produce a shared database that can be used on a generalized basis to estimate other such projects. This makes the methodology particularly suitable in those cases where massive intervention across entire areas of historic city centers is expected. The methodology was partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.

Keywords: evaluation, methodology, restoration, reuse

Procedia PDF Downloads 152
128 Detailed Analysis of Multi-Mode Optical Fiber Infrastructures for Data Centers

Authors: Matej Komanec, Jan Bohata, Stanislav Zvanovec, Tomas Nemecek, Jan Broucek, Josef Beran

Abstract:

With the exponential growth of social networks, video streaming and increasing demands on data rates, the number of newly built data centers rises proportionately. The data centers, however, have to adjust to the rapidly increasing amount of data that has to be processed. For this purpose, multi-mode (MM) fiber based infrastructures are often employed. This stems from the fact that connections in data centers are typically realized over short distances, and the application of MM fibers and components considerably reduces costs. On the other hand, the usage of MM components brings specific requirements for installation and service conditions. Moreover, it has to be taken into account that MM fiber components have higher production tolerances for parameters like core and cladding diameters, eccentricity, etc. Due to the high demands on the reliability of data center components, the determination of a properly excited optical field inside the MM fiber core belongs to the key parameters when designing such an MM optical system architecture. An appropriately excited mode field of the MM fiber provides an optimal power budget in connections, leads to decreased insertion losses (IL) and achieves effective modal bandwidth (EMB). The main parameter, in this case, is the encircled flux (EF), which should be properly defined for variable optical sources and the consequent different mode-field distributions. In this paper, we present a detailed investigation and measurements of the mode field distribution for short MM links intended in particular for data centers, with an emphasis on reliability and safety. These measurements are essential for large MM network design. Various scenarios, containing different fibers and connectors, were tested in terms of IL and mode-field distribution to reveal potential challenges.
Furthermore, particular defects and errors that can realistically occur, such as eccentricity, connector shift or dust, were simulated and measured, and their dependence on EF statistics and on the functionality of the data center infrastructure was evaluated. The experimental tests were performed at the two wavelengths commonly used in MM networks, 850 nm and 1310 nm, to verify the EF statistics. Finally, we provide recommendations for data center systems and networks using OM3 and OM4 MM fiber connections.
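Encircled flux is the cumulative fraction of near-field power within a given radius of the core center. A sketch of how it might be computed from a near-field intensity image (the Gaussian mode, core size and pixel pitch below are synthetic stand-ins for measured data, not values from the paper):

```python
import numpy as np

def encircled_flux(intensity, dx, radii):
    """EF(r): fraction of total near-field power within radius r of the
    power-weighted centroid. `intensity` is a 2-D image, `dx` the pixel
    pitch in microns, `radii` the radii (microns) at which EF is sampled."""
    total = intensity.sum()
    ny, nx = intensity.shape
    yy, xx = np.indices((ny, nx))
    # Power-weighted centroid as an estimate of the core center.
    cx = (xx * intensity).sum() / total
    cy = (yy * intensity).sum() / total
    r = np.hypot(xx - cx, yy - cy) * dx
    return np.array([intensity[r <= ri].sum() / total for ri in radii])

# Synthetic Gaussian mode sampled on a 101x101 grid at 0.5 um per pixel
# (a hypothetical launch condition, roughly filling a 50 um core).
n, dx = 101, 0.5
yy, xx = np.indices((n, n))
r = np.hypot(xx - n // 2, yy - n // 2) * dx
I = np.exp(-(r / 10.0) ** 2)

ef = encircled_flux(I, dx, radii=[5, 15, 25])
```

EF rises monotonically toward 1 at the core edge; launch-condition standards constrain EF to lie between bounds at a few fixed radii, which is exactly what such sampled values are checked against.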

Keywords: optical fiber, multi-mode, data centers, encircled flux

Procedia PDF Downloads 349
127 A Corpus-Based Approach to Understanding Market Access in Fisheries and Aquaculture: A Systematic Literature Review

Authors: Cheryl Marie Cordeiro

Abstract:

Although fisheries and aquaculture studies might seem marginal to international business (IB) studies in general, fisheries and aquaculture IB (FAIB) management is currently facing increasing pressure to meet global demand and consumption for fish in the coming decades. In part to address this challenge, the purpose of this systematic literature review (SLR) study is to investigate the use of the term ‘market access’ in its context of use in the generic literature and business sector discourse, in comparison to the more specific literature and discourse in fisheries, aquaculture and seafood. This SLR aims to uncover the knowledge/interest gaps between the academic subject discourses and business sector practices. Corpus-driven in methodology and using a triangulation of three text analysis tools, including AntConc, VOSviewer and Web of Science (WoS) analytics, the SLR results indicate a gap in conceptual knowledge and business practices in how ‘market access’ is conceived and used in the pharmaceutical healthcare industry compared with FAIB research and practice. While it is acknowledged that the product orientations of different business sectors might differ, this SLR study works with the assumption that both business sectors are global in orientation. These business sectors are complex in their operations from product to market. This SLR suggests a conceptual model for understanding the challenges, the potential barriers, as well as avenues for solutions to developing market access for FAIB.

Keywords: market access, fisheries and aquaculture, international business, systematic literature review

Procedia PDF Downloads 125
126 A Machine Learning Approach for Performance Prediction Based on User Behavioral Factors in E-Learning Environments

Authors: Naduni Ranasinghe

Abstract:

E-learning environments have become more popular than ever due to the impact of COVID-19. Even though e-learning is one of the best solutions for the teaching-learning process, it is not without major challenges. Nowadays, machine learning approaches are utilized to analyze how behavioral factors lead to better adoption and how they relate to better performance of students in e-learning environments. During the pandemic, we realized the academic process in the e-learning approach had a major issue, especially regarding student performance. Therefore, an approach that investigates student behaviors in e-learning environments using a data-intensive machine learning approach is appreciated. A hybrid approach was used to understand how each of the aforementioned variables relates to the others. A more quantitative approach, informed by the literature, was used to understand the weight of each factor for adoption and performance. The data set was collected from previously conducted research to support the training and testing process in ML. Special attention was paid to incorporating different dimensionalities of the data to understand the dependency levels of each variable. Five independent variables out of twelve were chosen based on their impact on the dependent variable and on the descriptive statistics, and of the three models developed (random forest classifier, SVM, and decision tree classifier), the random forest classifier gave the highest accuracy (0.8542). Overall, this work met its goals of improving student performance by identifying at-risk students and potential dropouts, emphasizing the necessity of using both static and dynamic data.
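The model-comparison step can be sketched as follows, using synthetic stand-ins for the behavioral factors (the five features, label rule and split below are illustrative only, not the study's dataset):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500
# Five hypothetical behavioral factors, e.g. logins, forum posts,
# video watch time, quiz attempts, and submission pace.
X = rng.random((n, 5))
# Synthetic pass/fail label loosely driven by the first two factors.
y = (0.6*X[:, 0] + 0.4*X[:, 1] + 0.1*rng.standard_normal(n) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.3f}")
```

The same fit/score pattern, repeated for SVM and decision tree classifiers on a common split, is what yields the accuracy comparison the abstract reports.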

Keywords: academic performance prediction, e learning, learning analytics, machine learning, predictive model

Procedia PDF Downloads 120
125 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics and indicator results have made collection, analysis, traceability and dissemination essential tasks for project managers. In this sense, there are current trends that facilitate efficient decision-making through emerging technologies, such as machine learning, data analytics, data mining and big data. The latter is the most interesting for this project. This research is part of the thematic line construction methods and project management. Many authors highlight the relevance that the use of emerging technologies, such as big data, has gained in recent years in project management in the construction sector. The main focus is the optimization of time, scope and budget and, in general, the mitigation of risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (big data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of big data and other emerging technologies is very low, and also that there is interest in modernizing project management. There is evidence of a correlation between the interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 105
124 Critically Analyzing the Application of Big Data for Smart Transportation: A Case Study of Mumbai

Authors: Tanuj Joshi

Abstract:

Smart transportation is fast emerging as a solution to modern cities' mobility issues, delayed emergency response rates and high congestion on streets. The present-day scenario with Google Maps, Waze, Yelp, etc. demonstrates how information and communications technologies control the intelligent transportation system. This intangible and invisible infrastructure is largely guided by big data analytics. On the other side, the exponential increase in the Indian urban population has intensified the demand for better services and infrastructure to satisfy the transportation needs of its citizens. No doubt, India's huge internet usage is looked to as an important resource for achieving this. However, with a projected number of over 40 billion objects connected to the Internet by 2025, the need for systems to handle massive volumes of data (big data) also arises. This research paper attempts to identify ways of exploiting the big data variables which will aid commuters on Indian tracks. This study explores real-life inputs by conducting surveys and interviews to identify which gaps need to be targeted to better satisfy the customers. Several experts at the Mumbai Metropolitan Region Development Authority (MMRDA), Mumbai Metro and Brihanmumbai Electric Supply and Transport (BEST) were interviewed regarding the Information Technology (IT) systems currently in use. The interviews give relevant insights into and requirements for the workings of public transportation systems, whereas the survey investigates the macro situation.

Keywords: smart transportation, mobility issue, Mumbai transportation, big data, data analysis

Procedia PDF Downloads 149
123 Enhancing the Pricing Expertise of an Online Distribution Channel

Authors: Luis N. Pereira, Marco P. Carrasco

Abstract:

Dynamic pricing is a revenue management strategy in which hotel suppliers define, over time, flexible and different prices for their services for different potential customers, considering the profile of e-consumers and the demand and market supply. This means that the fundamentals of dynamic pricing are based on economic theory (price elasticity of demand) and market segmentation. This study aims to define a dynamic pricing strategy and an offer contextualized to the e-consumer profile in order to improve the number of reservations of an online distribution channel. Segmentation methods (hierarchical and non-hierarchical) were used to identify and validate an optimal number of market segments. A profile of each market segment was studied, considering the characteristics of the e-consumers and the probability of reserving a room. In addition, the price elasticity of demand was estimated for each segment using econometric models. Finally, predictive models were used to define rules for classifying new e-consumers into the pre-defined segments. The empirical study illustrates how it is possible to improve the intelligence of an online distribution channel system through an optimal dynamic pricing strategy and an offer contextualized to the profile of each new e-consumer. A database of 11 million e-consumers of an online distribution channel was used in this study. The results suggest that an appropriate market segmentation policy in online reservation systems benefits service suppliers because it brings a higher probability of reservation and generates more profit than fixed pricing.
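The two core ingredients, non-hierarchical segmentation and per-segment price elasticity from a log-log demand model, can be sketched as follows (all data below is synthetic, with a true elasticity of -1.3 planted for illustration; the study's econometric models are not reproduced):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical e-consumer features: booking lead time (days) and party size.
features = np.column_stack([rng.integers(1, 90, 300), rng.integers(1, 5, 300)])
# Non-hierarchical segmentation (k-means) into three market segments.
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Synthetic price/demand data with a planted elasticity of -1.3.
price = rng.uniform(50, 200, 300)
demand = 1000 * price ** -1.3 * rng.lognormal(0, 0.1, 300)

# Per-segment elasticity: slope of log(demand) on log(price).
for s in range(3):
    m = segments == s
    slope = np.polyfit(np.log(price[m]), np.log(demand[m]), 1)[0]
    print(f"segment {s}: elasticity ~ {slope:.2f}")
```

In the constant-elasticity model log Q = a + b log P, the slope b is the price elasticity directly, which is why the log-log regression is the standard estimator here.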

Keywords: dynamic pricing, e-consumers segmentation, online reservation systems, predictive analytics

Procedia PDF Downloads 208
122 Predicting the Next Offensive Play Types That Will Be Implemented to Maximize the Defense’s Chances of Success in the National Football League

Authors: Chris Schoborg, Morgan C. Wang

Abstract:

In the realm of the National Football League (NFL), substantial dedication of time and effort is invested by both players and coaches in meticulously analyzing the game footage of their opponents. The primary aim is to anticipate the actions of the opposing team. Defensive players and coaches are especially focused on deciphering their adversaries' intentions to effectively counter their strategies. Acquiring insights into the specific play type and its intended direction on the field would confer a significant competitive advantage. This study establishes pre-snap information as the cornerstone for predicting both the play type (e.g., deep pass, short pass, or run) and its spatial trajectory (right, left, or center). The dataset for this research spans the regular NFL season data for all 32 teams from 2013 to 2022. This dataset is acquired using the nflreadr package, which conveniently extracts play-by-play data from NFL games and imports it into the R environment as structured datasets. In this study, we employ a recently developed machine learning algorithm, XGBoost. The final predictive model achieves an impressive lift of 2.61. This signifies that the presented model is 2.61 times more effective than random guessing—a significant improvement. Such a model has the potential to markedly enhance defensive coaches' ability to formulate game plans and adequately prepare their players, thus mitigating the opposing offense's yardage and point gains.
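Lift measures how much better the model's top-ranked predictions are than the base rate. A sketch with synthetic pre-snap features, using scikit-learn's gradient boosting as a stand-in for XGBoost (features, label rule and the 10% cutoff are illustrative, not the study's setup):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical pre-snap features: e.g. down, distance, field position, score diff.
X = rng.random((n, 4))
# Synthetic binary target, e.g. "pass play", driven by the second feature.
y = (X[:, 1] + 0.3*rng.standard_normal(n) > 0.7).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]

def lift_at(scores, y_true, frac=0.1):
    """Hit rate among the top `frac` highest-scored plays over the base rate."""
    k = int(len(scores) * frac)
    top = np.argsort(scores)[::-1][:k]
    return y_true[top].mean() / y_true.mean()

lift = lift_at(scores, y_te)
print(f"lift at top 10%: {lift:.2f}")
```

A lift of 2.61, as reported, means the plays the model flags are 2.61 times more likely to be of the predicted type than a random guess, which is the operationally useful quantity for a defensive coordinator.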

Keywords: lift, NFL, sports analytics, XGBoost

Procedia PDF Downloads 37
121 Particle Gradient Generation in a Microchannel Using a Single IDT

Authors: Florian Kiebert, Hagen Schmidt

Abstract:

Standing surface acoustic waves (sSAWs) have already been used to manipulate particles in a microfluidic channel made of polydimethylsiloxane (PDMS). Usually, two identical facing interdigital transducers (IDTs) are exploited to form an sSAW. Further, it has been reported that an sSAW can be generated by a single IDT using a superstrate resonating cavity or a PDMS post. Nevertheless, both setups utilising a travelling surface acoustic wave (tSAW) to create an sSAW for particle manipulation are costly. We present a simplified setup with a tSAW and a PDMS channel to form an sSAW. The incident tSAW is reflected at the rear PDMS channel wall and superimposed with the reflected tSAW. This superposition generates an sSAW, but only in regions where the distance to the rear channel wall is smaller than the attenuation length of the tSAW minus the channel width. Therefore, in a channel of 500 µm width, a tSAW with a wavelength λ = 120 µm causes an sSAW over the whole channel, whereas a tSAW with λ = 60 µm only forms an sSAW next to the rear wall of the channel, taking into account the attenuation length of a tSAW in water. Hence, it is possible to concentrate and trap particles in a defined region of the channel by adjusting the relation between the channel width and the tSAW wavelength. Moreover, it is possible to generate a particle gradient over the channel width by picking the right ratio between channel width and wavelength. The particles are moved towards the rear wall by the acoustic streaming force (ASF) and the acoustic radiation force (ARF) caused by the tSAW-generated bulk acoustic wave (BAW). In regions of the channel where the sSAW dominates, the ARF focuses the particles in the pressure nodes formed by the sSAW-caused BAW. On the one side, the ARF generated by the sSAW traps the particles at the center of the tSAW beam, i.e., of the IDT aperture.
On the other side, the ASF leads to two vortices, one on the left and one on the right side of the focus region, deflecting the particles out of it. By varying the applied power, it is possible to vary the number of particles trapped in the focus points, because near the rear wall the amplitude of the reflected tSAW is higher and, therefore, the ARF of the sSAW is stronger. Thus, in the vicinity of the rear wall the concentration of particles is higher and decreases with increasing distance from the wall, forming a particle gradient. The particle gradient depends on the applied power as well as on the flow rate, so by varying these two parameters it is possible to change the gradient. Furthermore, we show that the particle gradient can be modified by changing the relation between the channel width and the tSAW wavelength. In conclusion, a single IDT generating an sSAW in a PDMS microchannel enables particle gradient generation in a well-defined microfluidic flow system, utilising the ARF and ASF of a tSAW and an sSAW.
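The geometric criterion above, i.e. an sSAW forms only where the distance to the rear wall is smaller than the attenuation length minus the channel width, can be sketched numerically. The attenuation length of roughly 12 wavelengths for a water-loaded tSAW is an assumed round figure for illustration, not a value taken from the paper:

```python
def ssaw_extent(wavelength_um, channel_width_um, att_per_lambda=12):
    """Width (in um, measured from the rear wall) of the band in which the
    reflected and incident tSAW overlap strongly enough to form an sSAW.
    Assumes an attenuation length of ~12 wavelengths for a water-loaded
    tSAW, which is a rough placeholder value."""
    att_len = att_per_lambda * wavelength_um
    return max(0, min(channel_width_um, att_len - channel_width_um))

# 500 um wide channel, as in the study:
print(ssaw_extent(120, 500))  # sSAW spans the full channel width
print(ssaw_extent(60, 500))   # sSAW confined to a band near the rear wall
```

With this placeholder attenuation length the two reported cases come out as described: λ = 120 µm yields an sSAW across the whole 500 µm channel, while λ = 60 µm confines it to a band near the rear wall.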

Keywords: ARF, ASF, particle manipulation, sSAW, tSAW

Procedia PDF Downloads 308
120 Radio Frequency Identification Device Based Emergency Department Critical Care Billing: A Framework for Actionable Intelligence

Authors: Shivaram P. Arunachalam, Mustafa Y. Sir, Andy Boggust, David M. Nestler, Thomas R. Hellmich, Kalyan S. Pasupathy

Abstract:

Emergency departments (EDs) provide urgent care to patients throughout the day in a complex and chaotic environment. Real-time location systems (RTLS) are increasingly being utilized in healthcare settings and have been shown to improve safety, reduce cost, and increase patient satisfaction. Radio Frequency Identification Device (RFID) data in an ED has been shown to yield variables such as patient-provider contact time, which is associated with patient outcomes such as 30-day hospitalization. These variables can provide avenues for improving ED operational efficiency. A major challenge in ED financial operations is under-coding of critical care services due to physicians’ difficulty reporting accurate times for critical care provided under Current Procedural Terminology (CPT) codes 99291 and 99292. In this work, the authors propose a framework to optimize ED critical care billing using RFID data. RFID-estimated physician-patient contact times could accurately quantify direct critical care services, which will help model a data-driven approach to ED critical care billing. This paper describes the framework and provides insights into opportunities to prevent under-coding, as well as over-coding, to avoid insurance audits. Future work will focus on data analytics to demonstrate the feasibility of the described framework.
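The time-to-code mapping that under-coding stems from can be sketched as follows; the thresholds follow the commonly cited CPT rule (under 30 minutes not separately reportable as critical care, 30 to 74 minutes as 99291, each additional 30 minutes as a unit of 99292), which should be verified against current CPT guidance before any real use:

```python
def critical_care_codes(contact_minutes):
    """Map RFID-estimated critical care contact time to CPT code units.
    Assumed rule of thumb: < 30 min -> no critical care code,
    30-74 min -> 99291, then one 99292 per additional 30 min.
    Billing rules change; confirm against current CPT guidance."""
    if contact_minutes < 30:
        return []
    codes = ["99291"]
    extra = contact_minutes - 74
    while extra > 0:
        codes.append("99292")
        extra -= 30
    return codes

print(critical_care_codes(25))   # no separately billable critical care
print(critical_care_codes(60))   # first-hour code only
print(critical_care_codes(105))  # first-hour code plus two add-on units
```

Comparing such RFID-derived code sets against physician-reported codes is what would surface both under-coding (missed 99292 units) and over-coding (time not supported by contact data).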

Keywords: critical care billing, CPT codes, emergency department, RFID

Procedia PDF Downloads 106
119 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model

Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou

Abstract:

The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data, ferocious competition requiring more accurate pricing, etc. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: an individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to apply pricing to several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained using the largest split of the data to determine model parameters. The remaining part of the data was used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model. For long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we discuss the regulations in place in the insurance industry. Finally, we discuss the maintenance of the model and the fact that new data does not arrive constantly and that some metrics can take a long time to become meaningful.
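The normalized Gini coefficient used for evaluation is equivalent to 2*AUC - 1 and can be computed directly from the rank order of predictions; a minimal sketch (not the authors' code, and tie handling is omitted for brevity):

```python
import numpy as np

def gini(y_true, y_score):
    """Normalized Gini coefficient, i.e., 2*AUC - 1, via the Mann-Whitney
    rank statistic. Assumes no tied scores for simplicity."""
    y = np.asarray(y_true)
    ranks = np.argsort(np.argsort(y_score)) + 1  # 1-based ranks of the scores
    n_pos, n_neg = (y == 1).sum(), (y == 0).sum()
    auc = (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
    return 2 * auc - 1

# A model that ranks every positive above every negative scores 1;
# a perfectly reversed ranking scores -1; random scores hover near 0.
y = np.array([0, 0, 1, 1])
g = gini(y, np.array([0.1, 0.2, 0.8, 0.9]))
```

Because it depends only on ranking, the Gini coefficient is well suited to comparing pricing models whose raw predicted loss costs live on different scales.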

Keywords: insurance, data science, modeling, monitoring, regulation, processes

Procedia PDF Downloads 55