Search results for: data mining techniques
27352 An Abattoir-Based Study on Relative Prevalence of Histopathologic Patterns of Hepatic Lesions in One-Humped Camels (Camelus dromedarius), Semnan, Iran
Authors: Keivan Jamshidi, Afshin Zahedi
Abstract:
An abattoir-based study was carried out during spring 2011 to investigate pathological conditions of the liver in camels (Camelus dromedarius) slaughtered in the Semnan slaughterhouse, northeastern Iran. In this study, 40 of 150 randomly selected carcasses inspected at postmortem were found to have liver lesions. Tissue samples were obtained from livers with macroscopic lesions, fixed in 10% neutral buffered formalin, processed by routine histopathological techniques, and finally embedded in paraffin blocks. Sections of 5 µm thickness were then cut and stained with H&E. On histopathological examination of the hepatic tissues, the following changes were observed: hydatid cysts, 65%; cirrhosis, 10%; hepatic lipidosis (mild to severe fatty change), 12.5%; glycogen deposition, 2.5%; cholangitis, 2.8%; cholangiohepatitis, 5%; calcified hydatid cysts, 2.5%; hepatic abscess, 2.5%; lipofuscin pigments, 17.5%. It is concluded that the most and least prevalent patterns of hepatic lesions were hydatid cysts and hepatic abscesses, respectively. Keywords: camel, liver, lesion, pathology, slaughterhouse
Procedia PDF Downloads 478
27351 Frequent Pattern Mining for Digenic Human Traits
Authors: Atsuko Okazaki, Jurg Ott
Abstract:
Some genetic diseases (‘digenic traits’) are due to the interaction between two DNA variants. For example, certain forms of Retinitis Pigmentosa (a genetic form of blindness) occur in the presence of two mutant variants, one in the ROM1 gene and one in the RDS gene, while the occurrence of only one of these mutant variants leads to a completely normal phenotype. Detecting such digenic traits by genetic methods is difficult. A common approach to finding disease-causing variants is to compare 100,000s of variants between individuals with a trait (cases) and those without the trait (controls). Such genome-wide association studies (GWASs) have been very successful but hinge on genetic effects of single variants, that is, there should be a difference in allele or genotype frequencies between cases and controls at a disease-causing variant. Frequent pattern mining (FPM) methods offer an avenue for detecting digenic traits even in the absence of single-variant effects. The idea is to enumerate pairs of genotypes (genotype patterns) with each of the two genotypes originating from different variants that may be located at very different genomic positions. What is needed is for genotype patterns to be significantly more common in cases than in controls. Let Y = 2 refer to cases and Y = 1 to controls, with X denoting a specific genotype pattern. We are seeking association rules, ‘X → Y’, with high confidence, P(Y = 2|X), significantly higher than the proportion of cases, P(Y = 2), in the study. Clearly, generally available FPM methods are very suitable for detecting disease-associated genotype patterns. We used fpgrowth as the basic FPM algorithm and built a framework around it to enumerate high-frequency digenic genotype patterns and to evaluate their statistical significance by permutation analysis. Application to a published dataset on opioid dependence furnished results that could not be found with classical GWAS methodology. There were 143 cases and 153 healthy controls, each genotyped for 82 variants in eight genes of the opioid system. The aim was to find out whether any of these variants were disease-associated. The single-variant analysis did not lead to significant results. Application of our FPM implementation resulted in one significant (p < 0.01) genotype pattern, with both genotypes in the pattern being heterozygous and originating from two variants on different chromosomes. This pattern occurred in 14 cases and none of the controls. Thus, the pattern seems quite specific to this form of substance abuse and is also rather predictive of disease. An algorithm called Multifactor Dimensionality Reduction (MDR) was developed some 20 years ago and has been in use in human genetics ever since. That algorithm and ours share some similar properties, but they are also very different in other respects. The main difference seems to be that our algorithm focuses on patterns of genotypes, while the main object of inference in MDR is the 3 × 3 table of genotypes at two variants. Keywords: digenic traits, DNA variants, epistasis, statistical genetics
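The association-rule idea above maps directly onto a few lines of code. The sketch below is an editorial illustration rather than the authors' implementation: on synthetic 0/1/2 genotypes it enumerates all two-variant genotype patterns, scores each by how far P(Y = 2 | X) exceeds P(Y = 2), and evaluates the best pattern by permuting the case/control labels. The 20-variant toy size, minimum-support threshold, and random data are assumptions; a production pipeline could instead pass itemized genotypes to an off-the-shelf fpgrowth routine.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n_cases, n_controls, n_variants = 143, 153, 20   # 20 variants keeps the toy fast; the study used 82
geno = rng.integers(0, 3, size=(n_cases + n_controls, n_variants))  # genotypes coded 0/1/2
y = np.array([2] * n_cases + [1] * n_controls)                      # Y = 2 cases, Y = 1 controls

def best_pattern_lift(geno, y, min_support=10):
    """Largest excess of P(Y=2 | genotype pattern) over P(Y=2) across two-variant patterns."""
    base = np.mean(y == 2)
    best = 0.0
    for i, j in combinations(range(geno.shape[1]), 2):
        for gi in range(3):
            for gj in range(3):
                mask = (geno[:, i] == gi) & (geno[:, j] == gj)
                if mask.sum() >= min_support:
                    best = max(best, np.mean(y[mask] == 2) - base)
    return best

observed = best_pattern_lift(geno, y)
# Permutation analysis: shuffle case/control labels to estimate the null distribution.
null = [best_pattern_lift(geno, rng.permutation(y)) for _ in range(200)]
p_value = np.mean([v >= observed for v in null])
print(f"observed confidence excess = {observed:.3f}, permutation p = {p_value:.3f}")
```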
Procedia PDF Downloads 122
27350 Heritage and Tourism in the Era of Big Data: Analysis of Chinese Cultural Tourism in Catalonia
Authors: Xinge Liao, Francesc Xavier Roige Ventura, Dolores Sanchez Aguilera
Abstract:
With the development of the Internet, the study of tourism behavior has rapidly expanded from the traditional physical market to the online market. Data on the Internet are characterized by dynamic change, with new data appearing all the time. Recent years have seen the generation of a large volume of data from forums, blogs, and other sources, which have expanded over time and space; together they constitute large-scale Internet data, known as Big Data. These data of technological origin, derived from the use of devices and the activity of multiple users, are becoming a source of great importance for the study of geography and the behavior of tourists. The study will focus on cultural heritage tourism practices in the context of Big Data. The research will explore the characteristics and behavior of Chinese tourists in relation to the cultural heritage of Catalonia. Geographical information, destination image, and perceptions in user-generated content will be studied through the analysis of data from Weibo, the largest microblogging social network in China. Through the analysis of the behavior of heritage tourists in the Big Data environment, this study will seek to understand the practices (activities, motivations, perceptions) of cultural tourists and, in turn, the needs and preferences of tourists, in order to better guide the sustainable development of tourism at heritage sites. Keywords: Barcelona, Big Data, Catalonia, cultural heritage, Chinese tourism market, tourists’ behavior
Procedia PDF Downloads 138
27349 Acquisition and Preservation of Traditional Medicinal Knowledge in Rural Areas of KwaZulu Natal, South Africa
Authors: N. Khanyile, P. Dlamini, M. Masenya
Abstract:
Background: Most of the population in Africa is still dependent on indigenous medicinal knowledge for treating and managing ailments. Indigenous traditional knowledge owners/practitioners are consulted by communities, but how they acquire their knowledge is not known. The question of how knowledge is acquired and preserved remains one of the biggest challenges in traditional healing and treatment with herbal medicines. It is regrettable that, despite the global importance and recognition of indigenous medicinal knowledge, the details of its acquisition, storage, transmission, and preservation techniques are not known. Hence, this study intends to unveil the processes of acquisition and transmission, and the preservation techniques, of indigenous medicinal knowledge by its owners. Objectives: This study aims to assess the processes of acquisition and preservation of traditional medicinal knowledge by traditional medicinal knowledge owners/practitioners in uMhlathuze Municipality, in the province of KwaZulu-Natal, South Africa. The study was guided by four research objectives, which were to: identify the types of traditional medicinal knowledge owners who possess this knowledge, establish the approach used by indigenous medicinal knowledge owners/healers for acquiring medicinal knowledge, identify the process of preservation of medicinal knowledge by indigenous medicinal knowledge owners/healers, and determine the challenges encountered in transferring the knowledge. Method: The study adopted a qualitative research approach, and a snowball sampling technique was used to identify the study population. Data were collected through semi-structured interviews with indigenous medicinal knowledge owners. Results: The findings suggested that uMhlathuze Municipality had different types of indigenous medicinal knowledge owners who possess valuable knowledge. These are diviners (Izangoma), faith healers (Abathandazi), and herbalists (Izinyanga). The study demonstrated that indigenous medicinal knowledge is acquired in many different ways, including visions, dreams, and rigorous training. The study also revealed that the acquired knowledge is preserved or shared with specially chosen children and trainees. Conclusion: The study concluded that this knowledge is acquired through rigorous training, which requires the learner to be attentive and eager to learn. It was recommended that a study of this nature be conducted at a broader level to support more informed conclusions and recommendations. Keywords: preserving, indigenous medicinal knowledge, indigenous knowledge, indigenous medicinal knowledge owners/practitioners, acquiring
Procedia PDF Downloads 87
27348 Towards A Framework for Using Open Data for Accountability: A Case Study of A Program to Reduce Corruption
Authors: Darusalam, Jorish Hulstijn, Marijn Janssen
Abstract:
The media have revealed a variety of corruption cases in regional and local governments all over the world. Many governments have pursued anti-corruption reforms and created systems of checks and balances. Citizens face three types of corruption: administrative corruption, collusion, and extortion. Accountability is one of the benchmarks for building transparent government. The public sector is required to report the results of the programs that have been implemented so that citizens can judge whether the institution has been working economically, efficiently, and effectively. Open Data offers solutions for the implementation of good governance in organizations that want to be more transparent. In addition, Open Data can create transparency and accountability toward the community. The objective of this paper is to build a framework of open data for accountability to combat corruption. This paper will investigate the relationship between open data and accountability as part of anti-corruption initiatives. This research will also investigate the impact of open data implementation on public organizations. Keywords: open data, accountability, anti-corruption, framework
Procedia PDF Downloads 336
27347 Fiber Orientation Measurements in Reinforced Thermoplastics
Authors: Ihsane Modhaffar
Abstract:
Fiber orientation is essential to the physical properties of composite materials. The theoretical parameters of a given reinforcement are usually known and widely used to predict the behavior of the material. In this work, we propose an image processing approach to estimate the true principal directions and fiber orientation during injection molding of short-fiber-reinforced thermoplastics. Generally, a group of fibers is described in terms of a probability distribution function or an orientation tensor. Numerical techniques for the prediction of fiber orientation are also considered for concentrated suspensions. The flow was considered to be incompressible and to behave as a Newtonian fluid containing suspensions of short fibers. The governing equations of this problem are the continuity, momentum, and energy equations. The obtained results were compared to available experimental findings, and good agreement between the numerical results and the experimental data was achieved. Keywords: injection, composites, short-fiber reinforced thermoplastics, fiber orientation, incompressible fluid, numerical simulation
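As a minimal illustration of the orientation-tensor description mentioned in the abstract (not the paper's image-processing pipeline), the sketch below assembles the second-order orientation tensor a_ij = <p_i p_j> from in-plane fiber angles and recovers the principal direction from its dominant eigenvector. The angle data and their distribution are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic in-plane fiber angles (degrees), e.g. as extracted from a micrograph.
angles = np.deg2rad(rng.normal(loc=20, scale=15, size=500))
p = np.column_stack([np.cos(angles), np.sin(angles)])   # unit fiber direction vectors

a = p.T @ p / len(p)                                    # a_ij = <p_i p_j>
eigvals, eigvecs = np.linalg.eigh(a)
principal = eigvecs[:, np.argmax(eigvals)]              # dominant eigenvector = principal direction
print("second-order orientation tensor:\n", a)
print("principal direction (deg): %.1f" % np.degrees(np.arctan2(principal[1], principal[0])))
```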
Procedia PDF Downloads 532
27346 Netnography Research in Leisure, Tourism, and Hospitality: Lessons from Research and Education
Authors: Marisa P. De Brito
Abstract:
The internet is affecting the way the industry operates and communicates. It is also becoming a customary means for leisure, tourism, and hospitality consumers to seek and exchange information and views on hotels, destinations events and attractions, or to develop social ties with other users. On the one hand, the internet is a rich field to conduct leisure, tourism, and hospitality research; on the other hand, however, there are few researchers formally embracing online methods of research, such as netnography. Within social sciences, netnography falls under the interpretative/ethnographic research methods umbrella. It is an adaptation of anthropological techniques such as participant and non-participant observation, used to study online interactions happening on social media platforms, such as Facebook. It is, therefore, a research method applied to the study of online communities, being the term itself a contraction of the words network (as on internet), and ethnography. It was developed in the context of marketing research in the nineties, and in the last twenty years, it has spread to other contexts such as education, psychology, or urban studies. Since netnography is not universally known, it may discourage researchers and educators from using it. This work offers guidelines for researchers wanting to apply this method in the field of leisure, tourism, and hospitality or for educators wanting to teach about it. This is done by means of a double approach: a content analysis of the literature side-by-side with educational data, on the use of netnography. The content analysis is of the incidental research using netnography in leisure, tourism, and hospitality in the last twenty years. The educational data is the author and her colleagues’ experience in coaching students throughout the process of writing a paper using primary netnographic data - from identifying the phenomenon to be studied, selecting an online community, collecting and analyzing data to writing their findings. In the end, this work puts forward, on the one hand, a research agenda, and on the other hand, an educational roadmap for those wanting to apply netnography in the field or the classroom. The educator’s roadmap will summarise what can be expected from mini-netnographies conducted by students and how to set it up. The research agenda will highlight for which issues and research questions the method is most suitable; what are the most common bottlenecks and drawbacks of the method and of its application, but also where most knowledge opportunities lay.Keywords: netnography, online research, research agenda, educator's roadmap
Procedia PDF Downloads 183
27345 Concurrent Engineering Challenges and Resolution Mechanisms from Quality Perspectives
Authors: Grmanesh Gidey Kahsay
Abstract:
In modern engineering applications, quality is defined in two ways. The first is that quality is the parameter that measures how well a product or service’s characteristics meet and satisfy pre-stated or fundamental needs (reliability, durability, serviceability). The second is that quality means a product or service free of any defects or deficiencies. The American Society for Quality (ASQ) describes quality as the pursuit of optimal solutions that confirm success and fulfillment in meeting the product or service's requirements and expectations. This article focuses on quality engineering tools in modern industrial applications. Quality engineering is a field of engineering that deals with the principles, techniques, models, and applications needed to guarantee the quality of a product or service. Encompassing all activities involved in analyzing a product’s design and development, quality engineering emphasizes how to ensure that products and services are designed and developed to meet consumers’ requirements. The article introduces quality tools such as quality systems, auditing, product design, and process control. The findings present ideas that aim to improve quality engineering proficiency and effectiveness by introducing essential quality techniques and tools in selected industries. Keywords: essential quality tools, quality systems and models, quality management systems, and quality assurance
Procedia PDF Downloads 152
27344 Performance Analysis and Comparison of Various 1-D and 2-D Prime Codes for OCDMA Systems
Authors: Gurjit Kaur, Shashank Johri, Arpit Mehrotra
Abstract:
In this paper, we have analyzed and compared the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D coding techniques are defined not only by their time slots but also by their wavelengths. In this research, we have evaluated and compared, on a single platform, the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an OCDMA system. Results show that the 1D Extended Prime Code (EPC) can support a larger number of active users compared to other codes, but at the expense of a larger code length, which further increases the complexity of the code. The Modified Prime Code (MPC) supports fewer active users at λc = 2 but has a shorter code length compared to the 1D prime code. Analysis shows that 2D prime codes support fewer active users than 1D codes, but they have a large code family and are the most secure codes compared to the others. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹. Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa
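For readers unfamiliar with the construction, the sketch below builds the classical 1-D prime-sequence code family for a prime p and checks its correlation properties numerically; it is a generic illustration, not the simulation platform used in the paper, and the choice p = 7 is arbitrary.

```python
import numpy as np

def prime_codes(p):
    """1-D prime sequence codes: p codewords of length p*p and weight p."""
    codes = np.zeros((p, p * p), dtype=int)
    for i in range(p):
        for j in range(p):
            codes[i, j * p + (i * j) % p] = 1   # one pulse in each block of p chips
    return codes

codes = prime_codes(7)
auto = codes[0] @ codes[0]                       # in-phase autocorrelation equals the code weight
xcorr = max(int(codes[0] @ np.roll(codes[1], s)) for s in range(codes.shape[1]))
print("codewords:", codes.shape[0], "length:", codes.shape[1],
      "weight:", auto, "max periodic cross-correlation:", xcorr)
```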
Procedia PDF Downloads 510
27343 Status of Bio-Graphene Extraction from Biomass: A Review
Authors: Simon Peter Wafula, Ziporah Nakabazzi Kitooke
Abstract:
Graphene is a two-dimensional carbon allotrope. The material has attracted the interest of many materials researchers because of its exceptional properties compared with ordinary materials. Graphene is thought to enhance a number of material properties in the manufacturing, energy, and construction industries. Many studies consider graphene to be a wonder material, just like plastic in the 21st century. This shows how much should be invested in graphene research. This review highlights the status of graphene extracted from various biomass sources, together with the appropriate extraction techniques, including the pretreatment methods needed for a better product. The functional groups and structure of graphene extracted using several common methods of synthesis are also presented in this paper. The review explores methods such as chemical vapor deposition (CVD), hydrothermal synthesis, chemical exfoliation, liquid exfoliation, and the Hummers method. A comparative analysis of the various extraction techniques gives an insight into their respective advantages, challenges, and potential scalability. The review also highlights the pretreatment of biomass before carbonization for better-quality bio-graphene. The various forms of graphene, as well as their applications, are covered in this study. Recommendations for future research on improving the efficiency and sustainability of bio-graphene are highlighted. Keywords: exfoliation, nanomaterials, biochar, large-scale, two-dimension
Procedia PDF Downloads 49
27342 Causes and Consequences of Intuitive Animal Communication: A Case Study at Panthera Africa
Authors: Cathrine Scharning Cornwall-Nyquist, David Rafael Vaz Fernandes
Abstract:
Since its origins, mankind has been dreaming of communicating directly with other animals. Past civilizations interacted on different levels with other species and recognized them in their rituals and daily activities. However, recent scientific developments have limited the ability of humans to consider deeper levels of interaction beyond observation and/or physical behavior. In recent years, animal caretakers and facilities such as sanctuaries or rescue centers have been introducing new techniques based on intuition. Most of those initiatives are related to specific cases, such as the incapacity to understand an animal’s behavior. Respected organizations also include intuitive animal communication (IAC) sessions to follow up on past interventions with their animals. Despite the lack of credibility of this discipline, some animal caring structures have opted to integrate IAC into their daily routines and approaches to animal welfare. At this stage, animal communication will be generally defined as the ability of humans to communicate with animals on an intuitive level. The trend in the field remains to be explored. The lack of theory and previous research urges the scientific community to improve the description of the phenomenon and its consequences. Considering the current scenario, qualitative approaches may become a suitable pathway to explore this topic. The purpose of this case study is to explore the beliefs behind and the consequences of an approach based on intuitive animal communication techniques for Panthera Africa (PA), an ethical sanctuary located in South Africa. Due to their personal experience, the Sanctuary’s founders have developed a philosophy based on IAC while respecting the world's highest standards for big cat welfare. Their dual approach is reflected in their rescues, daily activities, and healing animals’ trauma. The case study's main research questions will be: (i) Why do they choose to apply IAC in their work? (ii) What consequences to their activities do IAC bring? (iii) What effects do IAC techniques bring in their interactions with the outside world? Data collection will be gathered on-site via: (i) Complete participation (field notes); (ii) Semi-structured interviews (audio transcriptions); (iii) Document analysis (internal procedures and policies); (iv) Audio-visual material (communication with third parties). The main researcher shall become an active member of the Sanctuary during a 30-day period and have full access to the site. Access to documents and audio-visual materials will be granted on a request basis. Interviews are expected to be held with PA founders and staff members and with IAC practitioners related to the facility. The information gathered shall enable the researcher to provide an extended description of the phenomenon and explore its internal and external consequences for Panthera Africa.Keywords: animal welfare, intuitive animal communication, Panthera Africa, rescue
Procedia PDF Downloads 92
27341 A Phishing Email Detection Approach Using Machine Learning Techniques
Authors: Kenneth Fon Mbah, Arash Habibi Lashkari, Ali A. Ghorbani
Abstract:
Phishing e-mails are a security issue that not only annoys online users but has also resulted in significant financial losses for businesses. Phishing advertisements and pornographic e-mails are difficult to detect, as attackers have become increasingly intelligent and professional. Attackers track users and adjust their attacks based on users’ interests and on hot topics that can be extracted from community news and journals. This research focuses on deceptive phishing attacks and their variants, such as attacks through advertisements and pornographic e-mails. We propose a framework called the Phishing Alerting System (PHAS) to accurately classify e-mails as phishing, advertisement, or pornographic. PHAS is able to detect and alert users to all types of deceptive e-mails to help users in decision making. A well-known e-mail dataset has been used for these experiments, and based on previously extracted features, 93.11% detection accuracy is obtainable by using the J48 and KNN machine learning techniques. Our proposed framework achieved approximately the same accuracy as the benchmark while using this dataset. Keywords: phishing e-mail, phishing detection, anti phishing, alarm system, machine learning
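A hedged sketch of the classification step described above: a decision tree (the closest scikit-learn analogue to J48/C4.5) and a k-NN classifier are trained on a few hand-crafted e-mail features, and cross-validated accuracy is reported. The features, labels, and data are synthetic placeholders, not the well-known dataset or the feature set used in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 600
# Assumed features: URL count, HTML-form flag, suspicious-word count, sender-domain mismatch flag.
X = np.column_stack([
    rng.poisson(2, n), rng.integers(0, 2, n), rng.poisson(3, n), rng.integers(0, 2, n)
]).astype(float)
# Synthetic labels loosely tied to the features: 0 = legitimate, 1 = advertisement, 2 = phishing.
score = 0.8 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2] + 1.2 * X[:, 3]
y = np.digitize(score + rng.normal(0, 1, n), bins=[2.5, 5.0])

for name, clf in [("decision tree (J48-like)", DecisionTreeClassifier(max_depth=6)),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.3f}")
```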
Procedia PDF Downloads 341
27340 Gradient Length Anomaly Analysis for Landslide Vulnerability Analysis of Upper Alaknanda River Basin, Uttarakhand Himalayas, India
Authors: Hasmithaa Neha, Atul Kumar Patidar, Girish Ch Kothyari
Abstract:
The northward convergence of the Indian plate has a dominating influence over the structural and geomorphic development of the Himalayan region. The highly deformed and complex stratigraphy in the area arises from a confluence of exogenic and endogenetic geological processes. This region frequently experiences natural hazards such as debris flows, flash floods, avalanches, landslides, and earthquakes due to its harsh and steep topography and fragile rock formations. Therefore, remote sensing technique-based examination and real-time monitoring of tectonically sensitive regions may provide crucial early warnings and invaluable data for effective hazard mitigation strategies. In order to identify unusual changes in the river gradients, the current study demonstrates a spatial quantitative geomorphic analysis of the upper Alaknanda River basin, Uttarakhand Himalaya, India, using gradient length anomaly analysis (GLAA). This basin is highly vulnerable to ground creeping and landslides due to the presence of active faults/thrusts, toe-cutting of slopes for road widening, development of heavy engineering projects on the highly sheared bedrock, and periodic earthquakes. The intersecting joint sets developed in the bedrocks have formed wedges that have facilitated the recurrence of several landslides. The main objective of current research is to identify abnormal gradient lengths, indicating potential landslide-prone zones. High-resolution digital elevation data and geospatial techniques are used to perform this analysis. The results of GLAA are corroborated with the historical landslide events and ultimately used for the generation of landslide susceptibility maps of the current study area. The preliminary results indicate that approximately 3.97% of the basin is stable, while about 8.54% is classified as moderately stable and suitable for human habitation. However, roughly 19.89% fall within the zone of moderate vulnerability, 38.06% are classified as vulnerable, and 29% fall within the highly vulnerable zones, posing risks for geohazards, including landslides, glacial avalanches, and earthquakes. This research provides valuable insights into the spatial distribution of landslide-prone areas. It offers a basis for implementing proactive measures for landslide risk reduction, including land-use planning, early warning systems, and infrastructure development techniques.Keywords: landslide vulnerability, geohazard, GLA, upper Alaknanda Basin, Uttarakhand Himalaya
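The gradient-length screening idea can be illustrated with a short sketch. The study's exact GLAA formulation and DEM workflow are not reproduced here; instead, a stream length-gradient style index, SL = (dH/dL)·L, is computed along a synthetic river long profile, and reaches whose index departs strongly from the profile median are flagged as candidate anomalies. The profile shape, the inserted knickpoint, and the 2x-median threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic river long profile: distance from source (m) and channel elevation (m).
dist = np.linspace(1_000, 60_000, 300)
elev = 4_000 * np.exp(-dist / 40_000)
elev[155:] -= 120                          # abrupt 120 m step mimicking a knickpoint
elev += rng.normal(0, 2, dist.size)        # small measurement noise

slope = -np.gradient(elev, dist)           # local channel gradient (positive downstream)
sl_index = slope * dist                    # gradient-length (SL) index = (dH/dL) * L
threshold = 2 * np.median(sl_index)
anomalous = np.count_nonzero(sl_index > threshold)
print(f"{anomalous} of {dist.size} reaches exceed twice the median gradient-length index")
```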
Procedia PDF Downloads 72
27339 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media, as it can attach a user's location coordinates in the real world. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior. It aims to explore the relation between geolocated Twitter data and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize existing places based on the individuals visiting them. Then, we combine the Twitter data from the tracking result with questionnaire data to capture the Twitter user profile. To do so, we use distribution frequency analysis to determine the percentage of visitors. To validate the hypothesis, we compare the results with local population statistics and the land use map released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and the questionnaire data. Thus, integrating the Twitter data and survey data can reveal the profile of social media users. Keywords: geolocation, Twitter, distribution analysis, human mobility
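A minimal sketch of the distribution-frequency step described above, assuming tweets have already been assigned to districts: unique visitors per district are counted and compared with census population. The district names are real Makassar districts, but all figures here are invented for illustration.

```python
import pandas as pd

tweets = pd.DataFrame({
    "user_id": [1, 2, 2, 3, 4, 4, 4, 5],
    "district": ["Tamalate", "Rappocini", "Tamalate", "Panakkukang",
                 "Tamalate", "Ujung Pandang", "Panakkukang", "Rappocini"],
})
census = pd.Series({"Tamalate": 190_000, "Rappocini": 160_000,
                    "Panakkukang": 145_000, "Ujung Pandang": 28_000}, name="population")

visits = tweets.groupby("district")["user_id"].nunique().rename("unique_visitors")
share = (visits / visits.sum()).rename("visitor_share")
summary = pd.concat([visits, share, census], axis=1)
print(summary)
print("Spearman correlation:",
      summary["unique_visitors"].corr(summary["population"], method="spearman"))
```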
Procedia PDF Downloads 314
27338 NFC Communications with Mutual Authentication Based on Limited-Use Session Keys
Authors: Chalee Thammarat
Abstract:
Mobile phones are increasingly equipped with a short-range communication functionality called Near Field Communication (NFC for short). NFC needs no pairing between devices but is suitable only for small amounts of data in a very restricted area. A number of researchers have presented authentication techniques for NFC communications; however, they still lack necessary authentication properties, particularly mutual authentication, and other security qualifications. This paper suggests a new authentication protocol for NFC communication that provides mutual authentication between devices. Mutual authentication is a security property that protects against replay and man-in-the-middle (MitM) attacks. The proposed protocols deploy limited-use offline session key generation and distribution techniques to increase security and make our protocol lightweight. There are four sub-protocols: NFCAuthv1 is suitable for identification and access control, and NFCAuthv2 is suitable for use of an NFC-enhanced phone with a POS terminal for digital and physical goods and services. Keywords: cryptographic protocols, NFC, near field communications, security protocols, mutual authentication, network security
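The sketch below illustrates the general idea of mutual authentication with a limited-use, offline-derived session key, using HMAC-based challenge-response in both directions. It is not the authors' NFCAuth message format; the key-derivation scheme, counter, and message labels are assumptions for illustration.

```python
import hmac, hashlib, os, secrets

def derive_session_key(master_key: bytes, counter: int) -> bytes:
    """Offline derivation of a one-use session key from a shared master key and counter."""
    return hmac.new(master_key, counter.to_bytes(4, "big"), hashlib.sha256).digest()

master = os.urandom(32)
counter = 7                                   # both sides track how many keys have been used
k_device = derive_session_key(master, counter)
k_terminal = derive_session_key(master, counter)

# Terminal authenticates the device via a fresh nonce ...
nonce_t = secrets.token_bytes(16)
resp_device = hmac.new(k_device, b"DEV" + nonce_t, hashlib.sha256).digest()
ok_device = hmac.compare_digest(
    resp_device, hmac.new(k_terminal, b"DEV" + nonce_t, hashlib.sha256).digest())

# ... and the device authenticates the terminal (mutual authentication).
nonce_d = secrets.token_bytes(16)
resp_terminal = hmac.new(k_terminal, b"TRM" + nonce_d, hashlib.sha256).digest()
ok_terminal = hmac.compare_digest(
    resp_terminal, hmac.new(k_device, b"TRM" + nonce_d, hashlib.sha256).digest())

print("mutual authentication succeeded:", ok_device and ok_terminal)
```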
Procedia PDF Downloads 430
27337 Times2D: A Time-Frequency Method for Time Series Forecasting
Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan
Abstract:
Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduce significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive integrated moving average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long short-term memory (LSTMs) and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often suffer from the locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle with capturing relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that parallelly integrates 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021) under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation
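A minimal sketch of the two parallel 2-D views described above, on a synthetic series: a spectrogram for the frequency-domain view and a stack of first/second derivatives reshaped into segments as a simple derivative "heatmap" for the time-domain view. Window lengths, segment size, and the exact derivative stacking are assumptions, not the Times2D specification.

```python
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(0)
t = np.arange(2048)
# Synthetic series with daily- and weekly-like periodicities plus noise.
series = (np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)
          + 0.1 * rng.standard_normal(t.size))

# Frequency-domain view: 2-D spectrogram capturing periodicities.
freqs, times, spec = spectrogram(series, fs=1.0, nperseg=128, noverlap=64)

# Time-domain view: first and second derivatives reshaped into segments, so sharp
# fluctuations and turning points show up as 2-D structure.
d1, d2 = np.gradient(series), np.gradient(np.gradient(series))
seg = 128
heatmap = np.stack([d1[: len(d1) // seg * seg].reshape(-1, seg),
                    d2[: len(d2) // seg * seg].reshape(-1, seg)])

print("spectrogram shape:", spec.shape, "derivative heatmap shape:", heatmap.shape)
```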
Procedia PDF Downloads 42
27336 Consumer Protection Law for Users of Mobile Commerce as a Global Effort to Improve Business in Indonesia
Authors: Rina Arum Prastyanti
Abstract:
Information technology has changed the ways of transacting and has enabled new opportunities in business transactions. The problems faced by m-commerce consumers include, among others, difficulty accessing full information about the products on offer and the forms of transaction, given the small screen and limited storage capacity; the need to protect children from various forms of excessive supply and usage; errors in accessing and disseminating personal data; and more complex problems concerning agreements, dispute resolution mechanisms that can protect consumers, and assurance of the security of personal data. No less important are payment risks and the protection of payment-related personal information, which also require a solution. The purpose of this study is 1) to describe the phenomenon of the use of mobile commerce in Indonesia, 2) to determine the form of legal protection available to consumers using mobile commerce, and 3) to identify the right type of law to provide legal protection for mobile commerce consumers. This is descriptive qualitative research drawing on primary and secondary data sources, conducted as normative legal research using library research as the data collection technique. The analysis technique used is deductive analysis. Growing mobile technology, more affordable prices, and low provider rates due to competition have also increased the number of mobile users; Indonesia ranks fourth in the world in mobile phone users, with the number of mobile phones estimated at around 250.1 million for a population of 237,556,363. The Indonesian form of legal protection for the use of mobile commerce is still only a part of Law No. 11 of 2008 on Information and Electronic Transactions, and until now there is no rule of law that specifically regulates mobile commerce. A legal protection model that can be applied to protect mobile commerce consumers should ensure that consumers receive information about the potential security and privacy challenges they may face in m-commerce and the measures that can be used to limit the risk; encourage the development of security measures and built-in security features; encourage mobile operators to implement data security policies and measures to prevent unauthorized transactions; and provide methods of redress that are appropriate in both time and effectiveness when consumers suffer financial loss. Keywords: mobile commerce, legal protection, consumer, effectiveness
Procedia PDF Downloads 364
27335 Data-Centric Anomaly Detection with Diffusion Models
Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu
Abstract:
Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation with the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with 30% of the original normal image size, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performances. With the addition of generated images via diffusion models (10% equivalence of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.Keywords: diffusion models, anomaly detection, data-centric, generative AI
Procedia PDF Downloads 82
27334 Critical Behaviour and Field Dependence of Magnetic Entropy Change in K Doped Manganites Pr₀.₈Na₀.₂−ₓKₓMnO₃ (x = 0.10 and 0.15)
Authors: H. Ben Khlifa, W. Cheikhrouhou-Koubaa, A. Cheikhrouhou
Abstract:
The orthorhombic Pr₀.₈Na₀.₂−ₓKₓMnO₃ (x = 0.10 and 0.15) manganites are prepared using the solid-state reaction at high temperatures. The critical exponents (β, γ, δ) are investigated through various techniques such as the modified Arrott plot, the Kouvel-Fisher method, and critical isotherm analysis, based on magnetic measurements recorded around the Curie temperature. The critical exponents, derived from the magnetization data using the Kouvel-Fisher method, are found to be β = 0.32(4) and γ = 1.29(2) at TC ~ 123 K for x = 0.10, and β = 0.31(1) and γ = 1.25(2) at TC ~ 133 K for x = 0.15. The critical exponent values obtained for both samples are comparable to the values predicted by the 3D Ising model and have also been verified by the scaling equation of state. Such results demonstrate the existence of ferromagnetic short-range order in our materials. The magnetic entropy changes of the polycrystalline samples, which exhibit a second-order phase transition, are investigated. A large magnetic entropy change, deduced from isothermal magnetization curves, is observed in our samples, with a peak centered on their respective Curie temperatures (TC). The field dependence of the magnetic entropy change is analyzed and shows a power-law dependence, ΔSmax ≈ a(μ0H)^n, at the transition temperature. The values of n obey the Curie-Weiss law above the transition temperature. It is shown that, for the investigated materials, the magnetic entropy change follows a master-curve behavior: the rescaled magnetic entropy change curves for different applied fields collapse onto a single curve for both samples. Keywords: manganites, critical exponents, magnetization, magnetocaloric, master curve
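For reference, the standard relations behind this kind of analysis can be written out explicitly (textbook definitions, not results specific to these samples):

```latex
% Kouvel-Fisher relations used to extract beta and gamma:
\frac{M_S(T)}{dM_S(T)/dT} = \frac{T - T_C}{\beta}, \qquad
\frac{\chi_0^{-1}(T)}{d\chi_0^{-1}(T)/dT} = \frac{T - T_C}{\gamma}
% Critical isotherm at T = T_C and the Widom relation:
M \propto H^{1/\delta}, \qquad \delta = 1 + \frac{\gamma}{\beta}
% Field dependence of the peak entropy change at T = T_C:
\Delta S_{\max} \propto (\mu_0 H)^{n}, \qquad n = 1 + \frac{\beta - 1}{\beta + \gamma}
```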
Procedia PDF Downloads 164
27333 Reading Strategy Instruction in Secondary Schools in China
Authors: Leijun Zhang
Abstract:
Reading literacy has become a powerful tool for academic success and an essential goal of education. The ability to read is not only fundamental for pupils’ academic success but also a prerequisite for successful participation in today’s vastly expanding multi-literate textual environment. It is also important to recognize that, in many educational settings, students are expected to learn a foreign/second language for successful participation in the increasingly globalized world. Therefore, it is crucial to help learners become skilled foreign-language readers. Research indicates that students’ reading comprehension can be significantly improved through explicit instruction of multiple reading strategies. Despite the wealth of research on how to enhance learners’ reading comprehension achievement by identifying an enormous range of reading strategies and techniques for assisting students in comprehending specific texts, relatively scattered studies have centered on whether these reading comprehension strategies and techniques are used in classrooms, especially in Chinese academic settings. Given the central role of ‘the teacher’ in reading instruction, the study investigates the degree of importance that EFL teachers attach to reading comprehension strategies and their classroom employment of those strategies in secondary schools in China. It also explores the efficiency of reading strategy instruction on pupils’ reading comprehension performance. As a mix-method study, the analysis drew on data from a quantitative survey and interviews with seven teachers. The study revealed that the EFL teachers had positive attitudes toward the use of cognitive strategies despite their insufficient knowledge about and limited attention to the metacognitive strategies and supporting strategies. Regarding the selection of reading strategies for instruction, the mandated curriculum and high-stakes examinations, text features and demands, teaching preparation programs and their own EFL reading experiences were the major criteria in their responses, while few teachers took into account the learner needs in their choice of reading strategies. Although many teachers agreed upon the efficiency of reading strategy instruction in developing students’ reading comprehension competence, three challenges were identified in their implementation of the strategy instruction. The study provides some insights into reading strategy instruction in EFL contexts and proposes implications for curriculum innovation, teacher professional development, and reading instruction research.Keywords: reading comprehension strategies, EFL reading instruction, language teacher cognition, teacher education
Procedia PDF Downloads 90
27332 Prognostic Significance of Nuclear factor kappa B (p65) among Breast Cancer Patients in Cape Coast Teaching Hospital
Authors: Precious Barnes, Abraham Mensah, Leonard Derkyi-Kwarteng, Benjamin Amoani, George Adjei, Ernest Adankwah, Faustina Pappoe, Kwabena Dankwah, Daniel Amoako-Sakyi, Samuel Victor Nuvor, Dorcas Obiri-Yeboah, Ewura Seidu Yahaya, Patrick Kafui Akakpo, Roland Osei Saahene
Abstract:
Context: Breast cancer is a prevalent and aggressive type of cancer among African women, with high mortality rates in Ghana. Nuclear factor kappa B (NF-kB) is a transcription factor that has been associated with tumor progression in breast cancer. However, there is a lack of published data on NF-kB in breast cancer patients in Ghana or other African countries. Research Aim: The aim of this study was to assess the prognostic significance of NF-kB (p65) expression and its association with various clinicopathological features in breast cancer patients at the Cape Coast Teaching Hospital in Ghana. Methodology: A total of 90 formalin-fixed breast cancer tissues and 15 normal breast tissues were used in this study. The expression level of NF-kB (p65) was examined using immunohistochemical techniques. Correlation analysis between NF-kB (p65) expression and clinicopathological features was performed using SPSS version 25. Findings: The study found that NF-kB (p65) was expressed in 86.7% of breast cancer tissues. There was a significant relationship between NF-kB (p65) expression and tumor grade, proliferation index (Ki67), and molecular subtype. High-level expression of NF-kB (p65) was more common in tumor grade 3 compared to grade 1, and Ki67 > 20 had higher expression of NF-kB (p65) compared to Ki67 ≤ 20. Triple-negative breast cancer patients had the highest overexpression of NF-kB (p65) compared to other molecular subtypes. There was no significant association between NF-kB (p65) expression and other clinicopathological parameters. Theoretical Importance: This study provides important insights into the expression of NF-kB (p65) in breast cancer patients in Ghana, particularly in relation to tumor grade and proliferation index. The findings suggest that NF-kB (p65) could serve as a potential biological marker for cancer stage, progression, prognosis and as a therapeutic target. Data Collection and Analysis Procedures: Formalin-fixed breast cancer tissues and normal breast tissues were collected and analyzed using immunohistochemical techniques. Correlation analysis between NF-kB (p65) expression and clinicopathological features was performed using SPSS version 25. Question Addressed: This study addressed the question of the prognostic significance of NF-kB (p65) expression and its association with clinicopathological features in breast cancer patients in Ghana. Conclusion: This study, the first of its kind in Ghana, demonstrates that NF-kB (p65) is highly expressed among breast cancer patients at the Cape Coast Teaching Hospital, especially in triple-negative breast cancer patients. The expression of NF-kB (p65) is associated with tumor grade and proliferation index. NF-kB (p65) could potentially serve as a biological marker for cancer stage, progression, prognosis, and as a therapeutic target.Keywords: breast cancer, Ki67, NF-kB (p65), tumor grade
Procedia PDF Downloads 72
27331 A Comprehensive Study of Spread Models of Wildland Fires
Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. Fire spread models provide insights into potential fire behavior, facilitating authorities to make informed decisions about evacuation activities, allocating resources for fire-fighting efforts, and planning for preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies as they help in assessing the fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and promotes our understanding of the way forest fires spread. Some of the known models in this field are Rothermel’s wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, cellular automata model, and others. The key characteristics that these models consider include weather (includes factors such as wind speed and direction), topography (includes factors like landscape elevation), and fuel availability (includes factors like types of vegetation) among other factors. The models discussed are physics-based, data-driven, or hybrid models, also utilizing ML techniques like attention-based neural networks to enhance the performance of the model. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders. Access to enhanced early warning systems enables decision-makers to take prompt action. 
Emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling
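Among the model families surveyed, the cellular-automata approach is the simplest to illustrate in code. The sketch below is a toy grid model with a wind-biased ignition probability, not any of the named systems (Rothermel, FARSITE, WRF-SFIRE, etc.); all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
EMPTY, FUEL, BURNING, BURNT = 0, 1, 2, 3
grid = np.where(rng.random((60, 60)) < 0.8, FUEL, EMPTY)    # 80% fuel cover
grid[30, 30] = BURNING                                      # ignition point

# Higher spread probability toward the east to mimic a westerly wind.
spread_p = {(-1, 0): 0.3, (1, 0): 0.3, (0, -1): 0.2, (0, 1): 0.6}

for _ in range(40):                                          # 40 time steps
    new = grid.copy()
    for r, c in np.argwhere(grid == BURNING):
        for (dr, dc), p in spread_p.items():
            rr, cc = r + dr, c + dc
            if 0 <= rr < 60 and 0 <= cc < 60 and grid[rr, cc] == FUEL and rng.random() < p:
                new[rr, cc] = BURNING                        # fire spreads to fuel cell
        new[r, c] = BURNT                                    # burning cell burns out
    grid = new

print("burnt area fraction:", np.mean(grid == BURNT))
```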
Procedia PDF Downloads 81
27330 Regulation on the Protection of Personal Data Versus Quality Data Assurance in the Healthcare System Case Report
Authors: Elizabeta Krstić Vukelja
Abstract:
Digitization of personal data is a consequence of the development of information and communication technologies, which create a new work environment with many advantages and challenges, but also potential threats to privacy and personal data protection. Regulation (EU) 2016/679 of the European Parliament and of the Council establishes the law and obligations that should address the issues of personal data protection and information security. The existence of the Regulation leads to the conclusion that national legislation governing the virtual environment, the protection of the rights of EU citizens, and the processing of their personal data is insufficiently effective. In the health system, special emphasis is placed on the processing of special categories of personal data, such as health data. The healthcare industry is recognized as a particularly sensitive area in which a large amount of medical data is processed, the digitization of which enables quick access and quick identification of the health insured. The protection of the individual requires quality IT solutions that guarantee the technical protection of special categories of personal data. However, the real problems are of a technical and human nature, together with the spatial limitations on the application of the Regulation. Some conclusions will be drawn by analyzing the implementation of the basic principles of the Regulation in the Croatian health care system and comparing it with similar activities in other EU member states. Keywords: regulation, healthcare system, personal data protection, quality data assurance
Procedia PDF Downloads 38
27329 Contrast Enhancement of Color Images with Color Morphing Approach
Authors: Javed Khan, Aamir Saeed Malik, Nidal Kamel, Sarat Chandra Dass, Azura Mohd Affandi
Abstract:
Low contrast images can result from wrong image acquisition settings or poor illumination conditions. Such images may not be visually appealing and can make feature extraction difficult. Contrast enhancement of color images can be useful in the medical area for visual inspection. In this paper, a new technique is proposed to improve the contrast of color images. The RGB (red, green, blue) color image is transformed into the normalized RGB color space. The adaptive histogram equalization technique is applied to each of the three channels of the normalized RGB color space. The corresponding channels in the original (low contrast) image and in the contrast-enhanced image obtained with adaptive histogram equalization (AHE) are morphed together in proper proportions. The proposed technique is tested on seventy color images of acne patients. The results of the proposed technique are analyzed using cumulative variance and contrast improvement factor measures. The results are also compared with decorrelation stretch. Both subjective and quantitative analyses demonstrate that the proposed technique outperforms the other techniques. Keywords: contrast enhancement, normalized RGB, adaptive histogram equalization, cumulative variance
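A hedged sketch of the pipeline described above: the image is converted to normalized RGB, adaptive histogram equalization is applied per channel, and the enhanced result is morphed with the original in a fixed proportion. The test image, CLAHE clip limit, and morphing weight alpha are assumptions, not the paper's tuned values.

```python
import numpy as np
from skimage import data, exposure

img = data.astronaut().astype(float) / 255.0               # stand-in for a low-contrast image
rgb_sum = img.sum(axis=2, keepdims=True) + 1e-8
norm_rgb = img / rgb_sum                                    # normalized RGB (chromaticity) space

# Adaptive histogram equalization applied to each normalized channel separately.
ahe = np.stack([exposure.equalize_adapthist(norm_rgb[..., c], clip_limit=0.02)
                for c in range(3)], axis=2)

alpha = 0.6                                                 # assumed proportion of enhanced image
morphed = np.clip(alpha * ahe + (1 - alpha) * img, 0, 1)    # channel-wise morphing
print("contrast (std) original vs morphed:", img.std(), morphed.std())
```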
Procedia PDF Downloads 378
27328 Improvement of Bearing Capacity of Soft Clay Using Geo-Cells
Authors: Siddhartha Paul, Aman Harlalka, Ashim K. Dey
Abstract:
Soft clayey soil possesses poor bearing capacity and high compressibility, because of which foundations cannot be placed directly over soft clay. Normally, pile foundations are constructed to carry the load through the soft soil down to the hard stratum below. Pile construction is costly and time consuming. In order to improve the properties of soft clay, many ground improvement techniques, such as stone columns and preloading with and without sand drains/band drains, are in vogue. Time is a constraint on the successful application of these improvement techniques. Another way to improve the bearing capacity of soft clay and to reduce the settlement is to place geocells below the foundation. The geocells impart rigidity to the foundation soil, reduce the net load intensity on the soil, and thus reduce the compressibility. A well-designed geocell-reinforced soil may replace the pile foundation. The present paper deals with the applicability of geocells to the improvement of bearing capacity. It is observed that a properly designed geocell may increase the bearing capacity of soft clay by up to two and a half times. Keywords: bearing capacity, geo-cell, ground improvement, soft clay
Procedia PDF Downloads 322
27327 Parallel Vector Processing Using Multi Level Orbital DATA
Authors: Nagi Mekhiel
Abstract:
Many applications use vector operations by applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which affects the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: we force each location to be available to all processors at a specific time. The data move in different orbits to become available to other processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to an upper orbit one data element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial code limitations inherent in all parallel applications and to interleave it with lower-level vector operations. Keywords: Memory Organization, Parallel Processors, Serial Code, Vector Processing
Procedia PDF Downloads 270
27326 Reconstructability Analysis for Landslide Prediction
Authors: David Percy
Abstract:
Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and the social sciences, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as the primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields accuracy that is similar but with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of states can be examined. Keywords: reconstructability analysis, machine learning, landslides, raster analysis
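A minimal sketch of the kind of output described above, assuming the layers have already been rasterized to cells: continuous variables are binned, and the report lists, for every combination of discrete layer states, the number of cells and the conditional probability of the landslide DV. The layers, bins, and synthetic DV are illustrative; the full RA search over variable subsets is not reproduced.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 5000
cells = pd.DataFrame({
    "bedrock": rng.choice(["basalt", "sandstone", "shale"], n),
    "soil": rng.choice(["clay", "loam"], n),
    "slope_deg": rng.uniform(0, 45, n),
})
# Continuous variables must be binned before inclusion in the model.
cells["slope_bin"] = pd.cut(cells["slope_deg"], bins=[0, 15, 30, 45],
                            labels=["low", "moderate", "steep"])
# Synthetic DV: landslides more likely on steep shale slopes.
p = 0.02 + 0.10 * (cells["bedrock"] == "shale") + 0.15 * (cells["slope_bin"] == "steep")
cells["landslide"] = rng.random(n) < p

# Report: every combination of IV states with its conditional landslide probability.
report = (cells.groupby(["bedrock", "soil", "slope_bin"], observed=True)["landslide"]
          .agg(n="size", p_landslide="mean")
          .sort_values("p_landslide", ascending=False))
print(report.head(10))
```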
Procedia PDF Downloads 66
27325 Engineering Strategies Towards Improvement in Energy Storage Performance of Ceramic Capacitors for Pulsed Power Applications
Authors: Abdul Manan
Abstract:
The need for efficient and cost-effective energy storage devices to intelligently store the inconsistent energy output from modern renewable energy sources has peaked today. The scientific community is struggling to identify the appropriate material system for energy storage applications. Countless contributions by researchers worldwide have now helped us identify the possible snags and limitations associated with each material/method. Energy storage has attracted great attention for its use in portable electronic devices and in the military field. Different devices, such as dielectric capacitors, supercapacitors, and batteries, are used for energy storage. Of these, dielectric capacitors have high energy output, a long life cycle, fast charging and discharging capabilities, operation at high temperatures, and excellent fatigue resistance. The energy storage characteristics have been found to be highly affected by various factors, such as grain size, optimized compositions, grain orientation, energy band gap, processing techniques, defect engineering, core-shell formation, interface engineering, electronegativity difference, the addition of additives, density, secondary phases, the difference between Pmax and Pr, sample thickness, area of the electrode, testing frequency, and AC/DC conditions. The data regarding these parameters/factors are scattered in the literature, and the aim of this study is to gather the data into a single paper that will be beneficial for new researchers in the field of interest. Furthermore, controlling and optimizing these parameters will lead to enhanced energy storage properties. Keywords: strategies, ceramics, energy storage, capacitors
Procedia PDF Downloads 77
27324 Management of H. Armigera by Using Various Techniques
Authors: Ajmal Khan Kassi, Humayun Javed, Syed Abdul Qadeem
Abstract:
The study was conducted to find the best management practices against the American bollworm on the okra variety Arka Anamika during 2016. The different management practices, viz. release of Trichogramma chilonis, hoeing and weeding, clipping, and the insect growth regulator (IGR) lufenuron, were tested individually and in all possible combinations for the control of American bollworm at three different locations, viz. the University Research Farm Koont, NARC, and a farmer's field at Taxila. All the treatment combinations showed significant results with respect to fruit damage. The minimum fruit infestation, i.e., 3.20% and 3.58%, was recorded with the combined treatment (i.e., T. chilonis + hoeing + weeding + lufenuron) in two different localities. This combined treatment also resulted in the maximum yield at NARC and Taxila, i.e., 57.67 and 62.66 q/ha, respectively. This treatment gave the best results for managing H. armigera. On the basis of the different integrated pest management techniques, the Arka Anamika variety proved to be comparatively resistant against H. armigera in different localities. This variety is therefore recommended for cultivation in the Pothwar region to obtain maximum yield. Keywords: management, american bollworm, arka anamika, okra
Procedia PDF Downloads 55
27323 Data Analytics in Hospitality Industry
Authors: Tammy Wee, Detlev Remy, Arif Perdana
Abstract:
In recent years, data analytics has become the buzzword in the hospitality industry. The hospitality industry is another example of a data-rich industry that has yet to fully benefit from the insights of data analytics. Effective use of data analytics can change how hotels operate, market, and position themselves competitively in the hospitality industry. However, at the moment, the data obtained by individual hotels remain under-utilized. This research is a preliminary study of data analytics in the hospitality industry, using an in-depth face-to-face interview at one hotel as the start of a multi-level research project. The main case study of this research, hotel A, is an international chain hotel brand that has been systematically gathering and collecting data on its own customers for the past five years. The data collection points begin from the moment a guest books a room until the guest leaves the hotel premises, and include room reservation, spa booking, and catering. Although hotel A has been gathering data intelligence on its customers for some time, it has yet to utilize the data to their fullest potential, and it is aware of this limitation as well as of the potential of data analytics. Currently, the utilization of data analytics in hotel A is limited to the area of customer service improvement, namely to enhance the personalization of service for each individual customer. Hotel A is able to utilize the data to improve and enhance its service, which in turn encourages repeat customers. According to hotel A, 50% of its guests returned to the hotel, and 70% extended their stays, because of the personalized service. Apart from using data analytics for enhancing customer service, hotel A also uses the data in marketing. Hotel A uses data analytics to predict or forecast changes in consumer behavior and demand by tracking its guests’ booking preferences, payment preferences, and demand shifts between properties. However, hotel A admitted that the data it has been collecting are not fully utilized due to two challenges. The first challenge of using data analytics in hotel A is that the data are not clean. At the moment, the data collected for one guest profile are meaningful for one department in the hotel but meaningless for another department. Cleaning up the data and getting the standards right for usage by different departments are some of the main concerns of hotel A. The second challenge of using data analytics in hotel A is the non-integrated internal systems. At the moment, the internal systems used by hotel A do not integrate well with each other, limiting the ability to collect data systematically. Hotel A is considering another system to replace the current one for more comprehensive data collection. Hotel proprietors recognize the potential of data analytics, as reported in this research; however, the current challenges of implementing a system to collect data come with a cost. This research has identified the current utilization of data analytics and the challenges faced when it comes to implementing data analytics. Keywords: data analytics, hospitality industry, customer relationship management, hotel marketing
Procedia PDF Downloads 179