Search results for: image and telemetric data
24607 Visual Analytics of Higher Order Information for Trajectory Datasets
Authors: Ye Wang, Ickjai Lee
Abstract:
Due to the widespread adoption of mobile sensing, there is a strong need to handle the trails of moving objects, i.e., trajectories. This paper proposes three visual analytic approaches for higher-order information of trajectory datasets based on the higher-order Voronoi diagram data structure. The proposed approaches reveal geometrical, topological, and directional information. Experimental results demonstrate the applicability and usefulness of the three proposed approaches.
Keywords: visual analytics, higher order information, trajectory datasets, spatio-temporal data
Procedia PDF Downloads 404
24606 Self-Supervised Pretraining on Sequences of Functional Magnetic Resonance Imaging Data for Transfer Learning to Brain Decoding Tasks
Authors: Sean Paulsen, Michael Casey
Abstract:
In this work, we present a self-supervised pretraining framework for transformers on functional Magnetic Resonance Imaging (fMRI) data. First, we pretrain our architecture on two self-supervised tasks simultaneously to teach the model a general understanding of the temporal and spatial dynamics of human auditory cortex during music listening. Our pretraining results are the first to suggest a synergistic effect of multitask training on fMRI data. Second, we finetune the pretrained models and train additional fresh models on a supervised fMRI classification task. We observe significantly improved accuracy on held-out runs with the finetuned models, which demonstrates the ability of our pretraining tasks to facilitate transfer learning. This work contributes to the growing body of literature on transformer architectures for pretraining and transfer learning with fMRI data, and serves as a proof of concept for our pretraining tasks and multitask pretraining on fMRI data.
Keywords: transfer learning, fMRI, self-supervised, brain decoding, transformer, multitask training
Procedia PDF Downloads 94
24605 Determination of Myocardial Function Using Heart Accumulated Radiopharmaceuticals
Authors: C. C .D. Kulathilake, M. Jayatilake, T. Takahashi
Abstract:
The myocardium is composed of specialized muscle that relies mainly on fatty acid and sugar metabolism and contributes widely to heart function. Changes in the cardiac energy-producing system during heart failure have been demonstrated using autoradiography techniques. This study focused on evaluating sugar and fatty acid metabolism in the myocardium, as the cardiac energy-producing system, using heart-accumulated radiopharmaceuticals. Two sets of autoradiographs of heart cross-sections of Lewis male rats were analyzed, and time-accumulation curves were obtained using the MATLAB image processing software to evaluate fatty acid and sugar metabolic functions.
Keywords: autoradiographs, fatty acid, radiopharmaceuticals, sugar
Procedia PDF Downloads 454
24604 Speed up Vector Median Filtering by Quasi Euclidean Norm
Authors: Vinai K. Singh
Abstract:
For reducing impulsive noise without degrading image contours, median filtering is a powerful tool. In multiband images, such as colour images or vector fields obtained by optic flow computation, a vector median filter can be used. Vector median filters are defined on the basis of a suitable distance, the best-performing distance being the Euclidean. The Euclidean distance is evaluated using the Euclidean norm, which is computationally demanding given that a square root is required. In this paper, an optimal piece-wise linear approximation of the Euclidean norm is presented and applied to vector median filtering.
Keywords: euclidean norm, quasi euclidean norm, vector median filtering, applied mathematics
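The idea summarised in this abstract — replacing the square root in the Euclidean norm with a piece-wise linear combination of sorted absolute differences — can be sketched as follows. The coefficients and the three-channel pixel format here are illustrative assumptions, not the paper's actual approximation:

```python
def quasi_euclidean(u, v):
    # Piece-wise linear stand-in for the Euclidean distance between two
    # 3-vectors (e.g. RGB pixels): largest absolute difference plus half
    # the sum of the remaining ones. No square root is needed.
    d = sorted(abs(a - b) for a, b in zip(u, v))
    return d[-1] + 0.5 * (d[0] + d[1])

def vector_median(window):
    # The vector median is the window sample minimising the aggregate
    # distance to all other samples in the window.
    return min(window, key=lambda p: sum(quasi_euclidean(p, q) for q in window))

# A small window with one impulsive outlier (the saturated red pixel).
pixels = [(10, 10, 10), (12, 11, 9), (250, 0, 0), (11, 10, 12)]
filtered = vector_median(pixels)
```

Because the vector median only needs the ranking of aggregate distances, a monotone approximation of the norm is often enough to select the same winner as the exact Euclidean distance while avoiding the square root entirely.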
Procedia PDF Downloads 476
24603 Lessons Learned from Ransomware-as-a-Service (RaaS) Organized Campaigns
Authors: Vitali Kremez
Abstract:
The researcher monitored an organized ransomware campaign in order to gain significant visibility into the tactics, techniques, and procedures employed by a campaign boss operating a ransomware scheme out of Russia. As the Russian hacking community lowered the access requirements for unsophisticated Russian cybercriminals to engage in ransomware campaigns, corporations and individuals face a commensurately greater challenge of effectively protecting their data and operations from being held ransom. This report discusses two notorious ransomware campaigns. Though the loss of data can be devastating, the findings demonstrate that sending ransom payments does not always help obtain data. Key learnings: 1. From the ransomware affiliate perspective, such campaigns have significantly lowered the barriers to entry for low-tier cybercriminals. 2. Ransomware revenue amounts are not as glamorous and fruitful as they are often publicly reported; ransomware crime bosses make only about $90K per year on average. 3. Data gathered indicates that sending ransom payments does not always help obtain data. 4. The talk provides the complete payout structure and Bitcoin laundering operation related to the ransomware-as-a-service campaign.
Keywords: bitcoin, cybercrime, ransomware, Russia
Procedia PDF Downloads 199
24602 Analysis of Cross-Sectional and Retrograde Data on the Prevalence of Marginal Gingivitis
Authors: Ilma Robo, Saimir Heta, Nedja Hysi, Vera Ostreni
Abstract:
Introduction: Marginal gingivitis is a disease of considerable frequency among patients who present routinely for periodontal control and treatment. This disease may not have alarming symptoms and may go unnoticed by patients themselves when personal hygiene conditions are optimal. The aim of this study was to collect retrograde data on the prevalence of marginal gingivitis in the respective group of patients, evaluated according to specific periodontal diagnostic tools. Materials and methods: The study was conducted in two patient groups. The first group comprised 34 patients, during December 2019-January 2020, and the second group comprised 64 patients during 2010-2018 (each year in the mentioned monthly period). The bacterial plaque index, hemorrhage index, amount of gingival fluid, and presence of xerostomia and candidiasis were recorded in patients. Results: Analysis of the collected data showed that susceptibility to marginal gingivitis shows higher values according to retrograde data, compared to cross-sectional ones. Susceptibility to candidiasis and the occurrence of xerostomia, even in the combination of both pathologies, as risk factors for the occurrence of marginal gingivitis, show higher values according to retrograde data. Females presented with a lower bacterial plaque index than males; more importantly, this index in females was also associated with a lower gingival hemorrhage index, in contrast to males. Conclusions: Cross-sectional data show that the prevalence of marginal gingivitis is lower, compared to retrograde data, based on the hemorrhage index and the bacterial plaque index together. Changes in the amount of gingival fluid produced show a higher prevalence of marginal gingivitis in cross-sectional data than in retrograde data; this is based on the sophistication of the way data are recorded, which evolves over time, and also on professional sensitivity to this phenomenon.
Keywords: marginal gingivitis, cross-sectional, retrograde, prevalence
Procedia PDF Downloads 168
24601 Why Do We Need Hierarchical Linear Models?
Authors: Mustafa Aydın, Ali Murat Sunbul
Abstract:
Hierarchical or nested data structures are common in many research areas. Especially in the field of education, if we examine most studies, we can see nested structures: students in classes, classes in schools, schools in cities, and cities in regions are similar nested structures. In a hierarchical structure, students in the same class share the same physical conditions and similar experiences and learn from the same teachers, so they demonstrate behaviors more similar to one another than to students in other classes.
Keywords: hierarchical linear modeling, nested data, hierarchical structure, data structure
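The within-class similarity this abstract describes is what the intraclass correlation quantifies: the share of total variance that lies between classes rather than within them. A minimal sketch of that diagnostic — a one-way, equal-group-size variance decomposition, not a fitted hierarchical linear model — with made-up class scores:

```python
from statistics import mean

def icc(groups):
    # Intraclass correlation: between-group variance as a share of the
    # total (between + within). A value near 1 means observations in the
    # same group are far more alike than observations across groups --
    # the situation that calls for hierarchical (multilevel) models.
    grand = mean(x for g in groups for x in g)
    between = mean((mean(g) - grand) ** 2 for g in groups)
    within = mean((x - mean(g)) ** 2 for g in groups for x in g)
    return between / (between + within)

# Hypothetical test scores for three classes of three students each.
classes = [[70, 72, 71], [85, 86, 84], [60, 59, 61]]
rho = icc(classes)
```

With scores this strongly clustered by class, rho is close to 1, and an ordinary single-level regression that ignored the nesting would understate its standard errors.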
Procedia PDF Downloads 655
24600 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning
Authors: Yangzhi Li
Abstract:
Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to robot autonomous recognition, which include high-performance computing, physical system modeling, extensive sensor coordination, and dataset deep learning, have not been explored using intelligent construction. Relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous recognition visual guidance technologies improves the robot's grasp of the scene and capacity for autonomous operation. Intelligent vision guidance technology for industrial robots has a serious issue with camera calibration, and the use of intelligent visual guiding and identification technologies for industrial robots in industrial production has strict accuracy requirements. It can be considered that visual recognition systems have challenges with precision issues. In such a situation, it will directly impact the effectiveness and standard of industrial production, necessitating a strengthening of the visual guiding study on positioning precision in recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study will identify the position of target components by detecting the information at the boundary and corner of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. 
To collect and use components, operational processing systems assign them to the same coordinate system based on their locations and postures. The RGB image's inclination detection and the depth image's verification will be used to determine the component's present posture. Finally, a virtual environment model for the robot's obstacle-avoidance route will be constructed using the point cloud information.
Keywords: robotic construction, robotic assembly, visual guidance, machine learning
Procedia PDF Downloads 92
24599 Investigating Cloud Forensics: Challenges, Tools, and Practical Case Studies
Authors: Noha Badkook, Maryam Alsubaie, Samaher Dawood, Enas Khairullah
Abstract:
Cloud computing has introduced transformative benefits in data storage and accessibility while posing unique forensic challenges. This paper explores cloud forensics, focusing on investigating and analyzing evidence from cloud environments to address issues such as unauthorized data access, manipulation, and breaches. The research highlights the practical use of open-source forensic tools like Autopsy and Bulk Extractor in real-world scenarios, including unauthorized data sharing via Google Drive and the misuse of personal cloud storage for sensitive information leaks. This work underscores the growing importance of robust forensic procedures and accessible tools in ensuring data security and accountability in cloud ecosystems.
Keywords: cloud forensics, tools, challenges, autopsy, bulk extractor
Procedia PDF Downloads 22
24598 Approaches to Diagnosis of Ectopic Solid Organs in the Abdominopelvic Cavity
Authors: Van-Ngoc-Cuong Le, Ngoc-Quy Le
Abstract:
Ectopic solid organs in the abdominopelvic cavity include the accessory liver lobe, accessory spleens (ectopic splenic tissue), wandering spleen, ectopic pancreatic tissue, ectopic kidney (pancake kidney), cryptorchidism (undescended or ectopic testis), and ectopic endometriosis. The application of diagnostic imaging techniques, of which magnetic resonance imaging is the most important, is illustrated through a clinical case study and reports. Ectopic organs and tumors are easily confused. This is a concern, and the paper discusses the practical challenges encountered and the solutions adopted in the field of image analysis.
Keywords: ectopic, accessory, wandering, tumor
Procedia PDF Downloads 14
24597 The Disposable Identities; Enabling Trust-by-Design to Build Sustainable Data-Driven Value
Authors: Lorna Goulden, Kai M. Hermsen, Jari Isohanni, Mirko Ross, Jef Vanbockryck
Abstract:
This article introduces disposable identities, with reference use cases, and explores possible technical approaches. The proposed approach, when fully developed as an open-source toolkit, enables developers of mobile or web apps to employ a self-sovereign identity and data privacy framework, in order to rebuild trust in digital services by providing greater transparency, decentralized control, and GDPR compliance. With a user interface for the management of self-sovereign identity, digital authorizations, and associated data-driven transactions, the advantage of disposable identities is that they may also contain verifiable data such as the owner’s photograph, official or even biometric identifiers for more proactive prevention of identity abuse. These disposable identities, designed for decentralized privacy management, can also be time-, purpose- and context-bound through a secure digital contract, with verification functionalities based on tamper-proof technology.
Keywords: identity, trust, self-sovereign, disposable identity, privacy toolkit, decentralised identity, verifiable credential, cybersecurity, data-driven business, PETs, GDPR
Procedia PDF Downloads 221
24596 Best Practices to Enhance Patient Security and Confidentiality When Using E-Health in South Africa
Authors: Lethola Tshikose, Munyaradzi Katurura
Abstract:
Information and Communication Technology (ICT) plays a critical role in improving daily healthcare processes. South African healthcare organizations have adopted information systems to integrate their patient records. This has made things much easier for healthcare organizations because patient information can now be accessed at any time. The primary purpose of this research study was to investigate the best practices that can be applied to enhance patient security and confidentiality when using e-health systems in South Africa. Security and confidentiality are critical in healthcare organizations as they ensure the safety of EHRs. The research study used an inductive research approach that included a thorough literature review; therefore, no data was collected. The research paper’s scope included patient data and possible security threats associated with healthcare systems. According to the study, South African healthcare organizations discovered various patient data security and confidentiality issues. The study also revealed that when it comes to handling patient data, health professionals sometimes make mistakes. Some may not be computer literate, which posed issues and caused data to be tampered with. The research paper recommends that healthcare organizations ensure that security measures are adequately supported and promoted by their IT departments. This will ensure that adequate resources are distributed to keep patient data secure and confidential. Healthcare organizations must correctly use standards set up by IT specialists to solve patient data security and confidentiality issues. Healthcare organizations must make sure that their organizational structures are adaptable to improve security and confidentiality.
Keywords: E-health, EHR, security, confidentiality, healthcare
Procedia PDF Downloads 64
24595 The Impacts of Internal Employees on Brand Building: A Case Study of Cell Phone
Authors: Adnan Gohar
Abstract:
This research work aims to show the importance of internal employees in the making of a brand (cell phone) through customer satisfaction, which basically explains the connection of internal employees with external customers. This research is designed to measure the satisfaction level of internal employees, which further connects to the product’s evolution as a brand, leaving a brand image in the eye of the external customer. The main focus is that internal employees are as important as external customers for the uplift of the product resulting in the brand. Internal employees are individual organization employees, vendors, departments, and distributors.
Keywords: brand building, customer satisfaction, internal employees, mobile franchise
Procedia PDF Downloads 261
24594 The Effect of Data Integration to the Smart City
Authors: Richard Byrne, Emma Mulliner
Abstract:
Smart cities are a vision for the future that is increasingly becoming a reality. While a key concept of the smart city is the ability to capture, communicate, and process data that has long been produced through day-to-day activities of the city, much of the assessment models in place neglect this fact to focus on ‘smartness’ concepts. Although it is true technology often provides the opportunity to capture and communicate data in more effective ways, there are also human processes involved that are just as important. The growing importance with regards to the use and ownership of data in society can be seen by all with companies such as Facebook and Google increasingly coming under the microscope, however, why is the same scrutiny not applied to cities? The research area is therefore of great importance to the future of our cities here and now, while the findings will be of just as great importance to our children in the future. This research aims to understand the influence data is having on organisations operating throughout the smart cities sector and employs a mixed-method research approach in order to best answer the following question: Would a data-based evaluation model for smart cities be more appropriate than a smart-based model in assessing the development of the smart city? A fully comprehensive literature review concluded that there was a requirement for a data-driven assessment model for smart cities. This was followed by a documentary analysis to understand the root source of data integration to the smart city. A content analysis of city data platforms enquired as to the alternative approaches employed by cities throughout the UK and draws on best practice from New York to compare and contrast. 
Grounded in theory, the research findings to this point formulated a qualitative analysis framework comprised of: the changing environment influenced by data, the value of data in the smart city, the data ecosystem of the smart city, and the organisational response to the data-orientated environment. The framework was applied to analyse primary data collected through interviews with both public and private organisations operating throughout the smart cities sector. The work to date represents the first stage of data collection that will be built upon by a quantitative research investigation into the feasibility of data network effects in the smart city. An analysis of the benefits of data interoperability supporting services to the smart city in the areas of health and transport will conclude the research, to achieve the aim of inductively forming a framework that can be applied to future smart city policy. To conclude, the research recognises the influence of technological perspectives in the development of smart cities to date and highlights this as a challenge to introduce theory applied with a planning dimension. The primary researcher has utilised their experience working in the public sector throughout the investigation to reflect upon what is perceived as a gap in practice between where we are today and where we need to be tomorrow.
Keywords: data, planning, policy development, smart cities
Procedia PDF Downloads 313
24593 Investigation of Delivery of Triple Play Service in GE-PON Fiber to the Home Network
Authors: Anurag Sharma, Dinesh Kumar, Rahul Malhotra, Manoj Kumar
Abstract:
Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper is targeted to show the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT
Procedia PDF Downloads 740
24592 Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network
Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah
Abstract:
Recently, facial emotion recognition (FER) has become increasingly essential to understanding the state of the human mind. Accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER, benefiting from deep learning, especially CNN and VGG16. First, the data is pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolution layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. Based on the above contents, this paper reviews the work on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work.
Keywords: CNN, deep-learning, facial emotion recognition, machine learning
Procedia PDF Downloads 98
24591 Data and Biological Sharing Platforms in Community Health Programs: Partnership with Rural Clinical School, University of New South Wales and Public Health Foundation of India
Authors: Vivian Isaac, A. T. Joteeshwaran, Craig McLachlan
Abstract:
The University of New South Wales (UNSW) Rural Clinical School has a strategic collaborative focus on chronic disease and public health. Our objectives are to understand rural environmental and biological interactions in vulnerable community populations. The UNSW Rural Clinical School translational model is a spoke-and-hub network. This spoke-and-hub model connects rural data and biological specimens with city-based collaborative public health research networks. Similar spoke-and-hub models are prevalent across research centers in India. The Australia-India Council grant was awarded so we could establish sustainable public health and community research collaborations. As part of the collaborative network, we are developing strategies around data and biological sharing platforms between the Indian Institute of Public Health, Public Health Foundation of India (PHFI), Hyderabad, and the Rural Clinical School UNSW. The key objective is to understand how research collaborations are conducted in India and how data can be shared and tracked with external collaborators such as ourselves. A framework to improve data sharing for research collaborations, including DNA, was proposed as a project outcome. The complexities of sharing biological data have been investigated via a visit to India. A flagship sustainable project between the Rural Clinical School UNSW and PHFI would illustrate a model of data sharing platforms.
Keywords: data sharing, collaboration, public health research, chronic disease
Procedia PDF Downloads 455
24590 Discrimination of Artificial Intelligence
Authors: Iman Abu-Rub
Abstract:
This research paper examines whether Artificial Intelligence is, in fact, racist or not. Different studies from all around the world, covering different communities, were analyzed to further understand AI’s true implications for different communities. The Black community, Asian community, and Muslim community were all analyzed and discussed in the paper to figure out whether AI is biased or unbiased towards these specific communities. It was found that the biggest problem AI faces is the biased distribution of data collection. Most of the data inserted and coded into AI is of white males, which significantly affects the other communities in terms of reliable cultural, political, or medical research. Nonetheless, various studies have been done that help increase awareness of this issue and could solve it completely if applied correctly. Governments and big corporations are able to implement different strategies into their AI inventions to avoid any racist results, which could otherwise cause cultural hatred but also unreliable data, medically, for example. Overall, Artificial Intelligence is not racist per se, but the data implementation and current racist culture online manipulate AI to become racist.
Keywords: social media, artificial intelligence, racism, discrimination
Procedia PDF Downloads 120
24589 A Neural Network Modelling Approach for Predicting Permeability from Well Logs Data
Authors: Chico Horacio Jose Sambo
Abstract:
Recently, neural networks have gained popularity when it comes to solving complex nonlinear problems. Permeability is one of the fundamental reservoir characteristics; it is anisotropically distributed and behaves in a non-linear manner. For this reason, permeability prediction from well log data is well suited to neural networks and other computer-based techniques. The main goal of this paper is to predict reservoir permeability from well log data by using a neural network approach. A multi-layered perceptron trained by the back propagation algorithm was used to build the predictive model. The performance of the model was measured by the correlation coefficient, evaluated for the training, testing, validation, and complete data sets. The results show that the neural network was capable of reproducing permeability with accuracy in all cases: the calculated correlation coefficients for training, testing, and validation permeability were 0.96273, 0.89991, and 0.87858, respectively. The generalization of the results to other fields can be made after examining new data, and a regional study might make it possible to study reservoir properties with cheap and very quickly constructed models.
Keywords: neural network, permeability, multilayer perceptron, well log
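A multi-layered perceptron trained by back propagation, as this abstract describes, can be sketched in a few lines of NumPy. The hidden-layer size, learning rate, and synthetic "well log" features below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.05, epochs=500, seed=0):
    # One-hidden-layer perceptron trained by back propagation
    # (squared-error loss, tanh activation). Returns the loss history.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)            # forward pass
        pred = h @ W2 + b2
        err = pred - y
        losses.append(float((err ** 2).mean()))
        g_pred = 2 * err / len(X)           # backward pass
        g_h = (g_pred @ W2.T) * (1 - h ** 2)
        W2 -= lr * (h.T @ g_pred)
        b2 -= lr * g_pred.sum(axis=0)
        W1 -= lr * (X.T @ g_h)
        b1 -= lr * g_h.sum(axis=0)
    return losses

rng = np.random.default_rng(1)
logs = rng.normal(size=(64, 4))               # stand-in well-log features
perm = 0.8 * logs[:, :1] - 0.3 * logs[:, 1:2]  # stand-in permeability target
losses = train_mlp(logs, perm)
```

On real well logs, the inputs would be log measurements and the target the core-measured permeability, with the correlation coefficient between predictions and targets serving as the performance measure, as in the paper.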
Procedia PDF Downloads 408
24588 Frequent Itemset Mining Using Rough-Sets
Authors: Usman Qamar, Younus Javed
Abstract:
Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data: what products were often purchased together? Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that as the data increase, the amount of time and resources required to mine the data increases at an exponential rate. In this investigation, a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. FASTER for frequent itemset mining can produce a speed-up of 3.1 times when compared to the original algorithm while maintaining an accuracy of 71%.
Keywords: rough-sets, classification, feature selection, entropy, outliers, frequent itemset mining
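For context, the frequent-itemset step that FASTER pre-processes can be illustrated with a minimal level-wise (Apriori-style) miner; the grocery transactions and support threshold below are made up for illustration:

```python
def frequent_itemsets(transactions, min_support):
    # Level-wise search: a k-itemset can only be frequent if all of its
    # (k-1)-subsets are frequent, so each pass extends the survivors of
    # the previous one. Returns {itemset: support}.
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    result = {}
    current = {frozenset([i]) for t in transactions for i in t}
    current = {s for s in current if support(s) >= min_support}
    while current:
        for s in current:
            result[s] = support(s)
        # Candidate generation: union pairs of frequent k-itemsets that
        # differ in exactly one item into (k+1)-itemset candidates.
        candidates = {a | b for a in current for b in current
                      if len(a | b) == len(a) + 1}
        current = {s for s in candidates if support(s) >= min_support}
    return result

transactions = [{"bread", "milk"}, {"bread", "butter"},
                {"bread", "milk", "butter"}, {"milk"}]
frequent = frequent_itemsets(transactions, min_support=0.5)
```

The exponential blow-up the abstract mentions comes from the candidate lattice: with n distinct items there are 2^n - 1 possible itemsets, which is why record reduction and feature selection before mining pay off.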
Procedia PDF Downloads 441
24587 Application of Regularized Spatio-Temporal Models to the Analysis of Remote Sensing Data
Authors: Salihah Alghamdi, Surajit Ray
Abstract:
Space-time data can be observed over irregularly shaped manifolds, which might have complex boundaries or interior gaps. Most of the existing methods do not consider the shape of the data, and as a result, it is difficult to model irregularly shaped data accommodating the complex domain. We used a method that can deal with space-time data distributed over non-planar shaped regions. The method is based on partial differential equations and finite element analysis. The model can be estimated using a penalized least squares approach with a regularization term that controls the over-fitting. The model is regularized using two roughness penalties, which consider the spatial and temporal regularities separately. The integrated square of the second derivative of the basis function is used as the temporal penalty, while the spatial penalty consists of the integrated square of the Laplace operator, integrated exclusively over the domain of interest, which is determined using the finite element technique. In this paper, we applied a spatio-temporal regression model with partial differential equation regularization (ST-PDE) to analyze remote sensing data measuring the greenness of vegetation, quantified by an index called the enhanced vegetation index (EVI). The EVI data consist of measurements that take values between -1 and 1, reflecting the level of greenness of some region over a period of time. We applied the ST-PDE approach to an irregularly shaped region of the EVI data. The approach efficiently accommodates the irregularly shaped regions, taking into account the complex boundaries rather than smoothing across them. Furthermore, the approach succeeds in capturing the temporal variation in the data.
Keywords: irregularly shaped domain, partial differential equations, finite element analysis, complex boundary
Procedia PDF Downloads 144
24586 Utilising an Online Data Collection Platform for the Development of a Community Engagement Database: A Case Study on Building Inter-Institutional Partnerships at UWC
Authors: P. Daniels, T. Adonis, P. September-Brown, R. Comalie
Abstract:
The community engagement unit at the University of the Western Cape was tasked with establishing a community engagement database. The database would store information of all community engagement projects related to the university. The wealth of knowledge obtained from the various disciplines would be used to facilitate interdisciplinary collaboration within the university, as well as facilitating community university partnership opportunities. The purpose of this qualitative study was to explore electronic data collection through the development of a database. Two types of electronic data collection platforms were used, namely online questionnaire and email. The semi structured questionnaire was used to collect data related to community engagement projects from different faculties and departments at the university. There are many benefits for using an electronic data collection platform, such as reduction of costs and time, ease in reaching large numbers of potential respondents, and the possibility of providing anonymity to participants. Despite all the advantages of using the electronic platform, there were as many challenges, as depicted in our findings. The findings suggest that certain barriers existed by using an electronic platform for data collection, even though it was in an academic environment, where knowledge and resources were in abundance. One of the challenges experienced in this process was the lack of dissemination of information via email to staff within faculties. The actual online software used for the questionnaire had its own limitations, such as only being able to access the questionnaire from the same electronic device. 
In a few cases, academics completed the questionnaire only after a telephonic prompt or a face-to-face meeting, which raises the question: is higher education in South Africa ready to embrace electronic platforms for data collection?Keywords: community engagement, database, data collection, electronic platform, electronic tools, knowledge sharing, university
Procedia PDF Downloads 26724585 Women Entrepreneurial Resiliency Amidst COVID-19
Authors: Divya Juneja, Sukhjeet Kaur Matharu
Abstract:
Purpose: The paper aims to identify the factors that challenged women entrepreneurs in India in operating their enterprises amid the COVID-19 pandemic. Methodology: The sample for the study comprised 396 women entrepreneurs from different regions of India. A purposive sampling technique was adopted for data collection. Data were collected through a self-administered questionnaire. Analysis was performed using the SPSS package for quantitative data analysis. Findings: The results of the study indicate that entrepreneurial characteristics, resourcefulness, networking, adaptability, and continuity have a positive influence on the resiliency of women entrepreneurs when faced with a crisis situation. Practical Implications: The findings of the study have important implications for women entrepreneurs, organizations, government, and other institutions extending support to entrepreneurs.Keywords: women entrepreneurs, analysis, data analysis, positive influence, resiliency
Procedia PDF Downloads 11824584 Partial Least Squares Regression for High-Dimensional and Highly Correlated Data
Authors: Mohammed Abdullah Alshahrani
Abstract:
The research focuses on investigating the use of partial least squares (PLS) methodology for addressing challenges associated with high-dimensional correlated data. Recent technological advancements have led to experiments producing data characterized by a large number of variables compared to observations, with substantial inter-variable correlations. Such data patterns are common in chemometrics, where near-infrared (NIR) spectrometer calibrations record chemical absorbance levels across hundreds of wavelengths, and in genomics, where thousands of genomic regions' copy number alterations (CNA) are recorded from cancer patients. PLS serves as a widely used method for analyzing high-dimensional data, functioning as a regression tool in chemometrics and a classification method in genomics. It handles data complexity by creating latent variables (components) from original variables. However, applying PLS can present challenges. The study investigates key areas to address these challenges, including unifying interpretations across three main PLS algorithms and exploring unusual negative shrinkage factors encountered during model fitting. The research presents an alternative approach to addressing the interpretation challenge of predictor weights associated with PLS. Sparse estimation of predictor weights is employed using a penalty function combining a lasso penalty for sparsity and a Cauchy distribution-based penalty to account for variable dependencies. The results demonstrate sparse and grouped weight estimates, aiding interpretation and prediction tasks in genomic data analysis. High-dimensional data scenarios, where predictors outnumber observations, are common in regression analysis applications. Ordinary least squares regression (OLS), the standard method, performs inadequately with high-dimensional and highly correlated data. 
Copy number alterations (CNA) in key genes have been linked to disease phenotypes, highlighting the importance of accurate classification of gene expression data in bioinformatics and biology using regularized methods like PLS for regression and classification.Keywords: partial least squares regression, genetics data, negative shrinkage factors, high-dimensional data, highly correlated data
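To illustrate how PLS builds latent components from many correlated predictors, here is a minimal NIPALS-style PLS1 sketch on simulated data where predictors far outnumber observations. This is a generic textbook construction, not the authors' sparse-penalized method; the dimensions and data are invented.

```python
import numpy as np

# Simulated p >> n data: 20 samples, 200 predictors driven by 2 latent factors,
# so the predictors are strongly inter-correlated.
rng = np.random.default_rng(1)
n, p = 20, 200
latent = rng.normal(size=(n, 2))
X = latent @ rng.normal(size=(2, p)) + 0.01 * rng.normal(size=(n, p))
y = latent @ np.array([1.0, -2.0]) + 0.01 * rng.normal(size=n)

def pls1(X, y, n_components):
    """NIPALS-style PLS1: returns regression coefficients in predictor space."""
    Xk, yk = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)            # predictor weight vector
        t = Xk @ w                        # latent component (score)
        p_load = Xk.T @ t / (t @ t)       # X loading
        q_coef = yk @ t / (t @ t)         # y loading
        Xk = Xk - np.outer(t, p_load)     # deflate X
        yk = yk - q_coef * t              # deflate y
        W.append(w); P.append(p_load); q.append(q_coef)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)  # beta = W (P'W)^-1 q

beta = pls1(X, y, n_components=2)
pred = (X - X.mean(0)) @ beta + y.mean()
print(np.corrcoef(pred, y)[0, 1] > 0.9)   # two components recover the signal
```

Ordinary least squares cannot even be fit here (the 200x200 normal equations are rank-deficient with 20 samples), which is the failure mode of OLS that the abstract notes.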
Procedia PDF Downloads 5524583 The Use of Voice in Online Public Access Catalogs as a Faster Searching Device
Authors: Maisyatus Suadaa Irfana, Nove Eka Variant Anna, Dyah Puspitasari Sri Rahayu
Abstract:
Technological developments provide convenience to everyone. Nowadays, communication between humans and computers is typically done via text. With the development of technology, human-computer communication can be conducted by voice, much like communication between human beings. This provides an easy facility for many people, especially those who have special needs. Voice search technology is applied to searching the book collections in the OPAC (Online Public Access Catalog), so library visitors will find it faster and easier to find the books they need. Integration with Google is used to convert the voice into text. To optimize searching time and results, the server downloads all the book data available in the server database. Then, the data are converted into JSON format. In addition, several processing steps are incorporated, including decomposition (parsing) of the JSON format into arrays, index building, and analysis of the results. This aims to make the search process much faster than the usual OPAC search, because data are not fetched from the database for every search query. A data update menu is provided to enable users to perform their own data updates and get the latest information.Keywords: OPAC, voice, searching, faster
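The parse-then-index flow described above can be sketched as follows. The book records, field names, and search function here are hypothetical, and the speech-to-text step is assumed to have already produced the query string.

```python
import json

# Hypothetical sketch of the server-side flow: book data arrives as JSON,
# is decomposed (parsed) into records, and an inverted index is built so
# each query avoids a fresh database lookup.
books_json = json.dumps([
    {"id": 1, "title": "Digital Libraries"},
    {"id": 2, "title": "Voice Interfaces"},
    {"id": 3, "title": "Digital Voice Processing"},
])

records = json.loads(books_json)        # decomposition (parse) step

index = {}                              # index-making step: term -> set of ids
for rec in records:
    for term in rec["title"].lower().split():
        index.setdefault(term, set()).add(rec["id"])

def search(spoken_text):
    """Analyze a (speech-to-text) query against the prebuilt index."""
    ids = [index.get(t, set()) for t in spoken_text.lower().split()]
    return sorted(set.intersection(*ids)) if ids else []

print(search("digital voice"))  # -> [3]
```

Because the index lives in memory, each voice query is a few dictionary lookups rather than a database round trip, which is the speed-up the abstract describes.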
Procedia PDF Downloads 35024582 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data
Authors: Chen Chou, Feng-Tyan Lin
Abstract:
Big Data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance, and travel time of vehicles on the national freeway. This study, taking advantage of ETC big data combined with urban planning theory, attempts to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, are numerous, complete, and quickly updated. One may recall that living areas have traditionally been delimited by location, population, area, and subjective consciousness. However, these factors cannot appropriately reflect people's daily movement paths. In this study, the concept of "Living Area" is replaced by "Influence Range" to capture how it varies dynamically with the time and purposes of activities. This study uses data mining with Python and Excel, visualizes the number of trips with GIS to explore the influence range of Tainan city and the purposes of trips, and discusses how living areas are currently delimited. It creates a dialogue between the concepts of "Central Place Theory" and "Living Area", presents a new point of view, and integrates the applications of big data, urban planning, and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.Keywords: Big Data, ITS, influence range, living area, central place theory, visualization
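A toy version of the origin-destination aggregation step might look like this in Python. The trip records below are invented for illustration and are not the actual ETC data.

```python
from collections import Counter

# Each ETC record carries a starting point and an end point; counting
# origin-destination pairs and collecting destinations reached from a city
# approximates that city's influence range.  Invented example records:
trips = [
    ("Tainan", "Kaohsiung"), ("Tainan", "Kaohsiung"),
    ("Tainan", "Chiayi"), ("Kaohsiung", "Tainan"),
    ("Taipei", "Taichung"),
]

od_counts = Counter(trips)  # origin-destination trip counts
influence = {dest for (orig, dest) in od_counts if orig == "Tainan"}
print(sorted(influence))    # cities reached by trips starting in Tainan
```

The resulting counts are what would feed the GIS trip-volume visualization mentioned above.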
Procedia PDF Downloads 28024581 Performance Analysis of Hierarchical Agglomerative Clustering in a Wireless Sensor Network Using Quantitative Data
Authors: Tapan Jain, Davender Singh Saini
Abstract:
Clustering is a useful mechanism in wireless sensor networks that helps to cope with scalability and data transmission problems. The basic aim of our research work is to provide efficient clustering using hierarchical agglomerative clustering (HAC). If the distance between the sensing nodes is calculated using their locations, then it is quantitative HAC. This paper compares the various agglomerative clustering techniques applied in a wireless sensor network using quantitative data. The simulations are done in MATLAB, and the comparisons between the different protocols are made using dendrograms.Keywords: routing, hierarchical clustering, agglomerative, quantitative, wireless sensor network
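A minimal sketch of quantitative HAC on sensor-node locations, using single linkage and Euclidean distance. The node positions are invented, and this shows only one linkage criterion, whereas the paper's MATLAB simulations compare several.

```python
import numpy as np

# Four sensor nodes: two near the origin, two near (5, 5).
nodes = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0], [5.1, 4.9]])

clusters = [[i] for i in range(len(nodes))]  # start: each node is a cluster

def linkage_dist(a, b):
    """Single-linkage distance: closest pair of nodes across two clusters."""
    return min(np.linalg.norm(nodes[i] - nodes[j]) for i in a for j in b)

while len(clusters) > 2:  # agglomerate until two clusters remain
    pairs = [(i, j) for i in range(len(clusters))
             for j in range(i + 1, len(clusters))]
    i, j = min(pairs, key=lambda p: linkage_dist(clusters[p[0]], clusters[p[1]]))
    clusters[i] = clusters[i] + clusters[j]  # merge the closest pair
    del clusters[j]

print(sorted(sorted(c) for c in clusters))  # -> [[0, 1], [2, 3]]
```

Recording the merge order and distances from this loop is exactly the information a dendrogram displays.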
Procedia PDF Downloads 62424580 Intangible Cultural Heritage as a Strategic Place Branding Tool
Authors: L. Ozoliņa
Abstract:
Place branding as a strategic marketing tool has been applied in Latvia since 2000. The main objective of the study is to find unique connecting aspects of intangible cultural heritage elements in the development of sustainable place branding. The study is based on in-depth semi-structured interviews with Latvian place branding experts and content analysis of Latvia's place brand identities. The study frames place branding as an internal, co-creational, and educational process involving all stakeholders of the place, and takes a critical view of local place branding practices with respect to the need for in-depth research into intangible cultural heritage.Keywords: belonging, identity, intangible cultural heritage, narrative, self-image, place branding
Procedia PDF Downloads 14924579 Qualitative Data Analysis for Health Care Services
Authors: Taner Ersoz, Filiz Ersoz
Abstract:
This study was designed to enable the application of multivariate techniques in the interpretation of categorical data for measuring health care services satisfaction in Turkey. The data were collected from a total of 17726 respondents. The establishment of the sample group and the collection of the data were carried out by a joint team from the Ministry of Health and the Turkish Statistical Institute (TurkStat) of Turkey. Multiple correspondence analysis (MCA) was used on the data of the 2882 respondents who answered the questionnaire in full. The multiple correspondence analysis indicated that, in the evaluation of health services, females, public employees, and younger and more highly educated individuals were more concerned and more likely to complain than males, private sector employees, and older and less educated individuals. Overall, 53% of the respondents were pleased with the improvements in health care services in the past three years. This study demonstrates the public consciousness of health services and health care satisfaction in Turkey. Awareness of health service quality increases with education level. Older individuals and males would appear to have lower expectations of health services.Keywords: multiple correspondence analysis, multivariate categorical data, health care services, health satisfaction survey
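The core of MCA can be sketched as a singular value decomposition of the standardized residuals of an indicator (one-hot) matrix. The respondents and categories below are invented for illustration; the study's actual analysis used 2882 full responses.

```python
import numpy as np

# Toy indicator matrix: rows are respondents; columns are one-hot categories
# (sex: female/male, satisfied: yes/no).  Invented data.
Z = np.array([
    [1, 0, 0, 1],   # female, not satisfied
    [1, 0, 0, 1],
    [0, 1, 1, 0],   # male, satisfied
    [0, 1, 1, 0],
    [1, 0, 1, 0],   # female, satisfied
], dtype=float)

P = Z / Z.sum()                       # correspondence matrix
r, c = P.sum(1), P.sum(0)             # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
U, sing, Vt = np.linalg.svd(S, full_matrices=False)

# First-axis coordinates of the categories: associated categories
# (here 'female' and 'not satisfied') land on the same side of the axis.
col_coords = Vt[0] * sing[0] / np.sqrt(c)
print(col_coords[0] * col_coords[3] > 0)   # female vs. not-satisfied: same sign
```

Plotting such category coordinates is how MCA reveals which respondent groups cluster with which satisfaction levels, as in the findings above.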
Procedia PDF Downloads 24924578 An Extraction of Cancer Region from MR Images Using Fuzzy Clustering Means and Morphological Operations
Authors: Ramandeep Kaur, Gurjit Singh Bhathal
Abstract:
Cancer diagnosis is a very difficult task. A magnetic resonance imaging (MRI) scan is used to produce an image of any part of the body and provides an efficient way to diagnose cancer or tumors. In the existing method, fuzzy C-means (FCM) clustering is used for the diagnosis of the tumor. In the proposed method, FCM is used to diagnose cancer of the foot. FCM finds the centroids of the clusters of the foot cancer obtained from MRI images. Thresholding the FCM memberships shows the extracted region of the cancer. Morphological operations are then applied to refine the extracted cancer region.Keywords: magnetic resonance imaging (MRI), fuzzy C-means clustering, segmentation, morphological operations
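The FCM-then-morphology pipeline can be sketched on a synthetic image. The toy "MRI slice", cluster count, and 3x3 opening below are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np

# Synthetic 8x8 "MRI slice": a bright 4x4 region plus an isolated bright speck.
rng = np.random.default_rng(2)
img = 0.05 * rng.normal(size=(8, 8))
img[2:6, 2:6] = 1.0                    # bright "tumor" region
img[0, 7] = 1.0                        # isolated bright speck (noise)

x = img.ravel()
c, m = 2, 2.0                          # 2 clusters, fuzzifier m
centers = np.array([x.min(), x.max()])
for _ in range(30):                    # FCM iterations
    d = np.abs(x[None, :] - centers[:, None]) + 1e-9
    u = 1.0 / (d ** (2 / (m - 1)))     # unnormalized memberships
    u /= u.sum(0)                      # memberships sum to 1 per pixel
    centers = (u ** m @ x) / (u ** m).sum(1)   # update centroids

bright = int(np.argmax(centers))
mask = (u[bright] > 0.5).reshape(img.shape)    # threshold FCM memberships

def erode(b):
    """3x3 binary erosion via shifted logical ANDs (8x8 images only)."""
    p = np.pad(b, 1, constant_values=False)
    return np.all([p[1 + di:9 + di, 1 + dj:9 + dj]
                   for di in (-1, 0, 1) for dj in (-1, 0, 1)], axis=0)

def dilate(b):
    """3x3 binary dilation via shifted logical ORs (8x8 images only)."""
    p = np.pad(b, 1, constant_values=False)
    return np.any([p[1 + di:9 + di, 1 + dj:9 + dj]
                   for di in (-1, 0, 1) for dj in (-1, 0, 1)], axis=0)

opened = dilate(erode(mask))  # morphological opening removes the speck
print(mask[0, 7], opened[0, 7])
```

The opening keeps the coherent tumor region while discarding the isolated speck, which is the cleanup role the morphological step plays after FCM thresholding.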
Procedia PDF Downloads 404