Search results for: privacy and data protection law
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26410

24880 Harnessing Environmental DNA to Assess the Environmental Sustainability of Commercial Shellfish Aquaculture in the Pacific Northwest United States

Authors: James Kralj

Abstract:

Commercial shellfish aquaculture makes significant contributions to the economy and culture of the Pacific Northwest United States. The industry faces intense pressure to minimize environmental impacts as a result of federal policies like the Magnuson-Stevens Fisheries Conservation and Management Act and the Endangered Species Act. These policies demand the protection of essential fish habitat and list several salmon species as endangered. Consequently, numerous projects related to the protection and rehabilitation of eelgrass beds, a crucial ecosystem for countless fish species, have been proposed at both state and federal levels. Eelgrass beds and commercial shellfish farms occupy the same physical space, so understanding the effects of shellfish aquaculture on eelgrass ecosystems has become a top ecological and economic priority for both government and industry. This study evaluates the organismal communities that eelgrass and oyster aquaculture habitats support. Water samples were collected from Willapa Bay, Washington; Tillamook Bay, Oregon; Humboldt Bay, California; and Samish Bay, Washington to compare species diversity in eelgrass beds, oyster aquaculture plots, and boundary edges between these two habitats. Diversity was assessed using a novel technique: environmental DNA (eDNA). All organisms constantly shed small pieces of DNA into their surrounding environment through the loss of skin, hair, tissues, and waste. In the marine environment, this DNA becomes suspended in the water column, allowing it to be easily collected. Once extracted and sequenced, this eDNA can be used to paint a picture of all the organisms that live in a particular habitat, making it a powerful technology for environmental monitoring. Industry professionals and government officials should consider these findings to better inform future policies regulating eelgrass beds and oyster aquaculture. Furthermore, the information collected in this study may be used to improve the environmental sustainability of commercial shellfish aquaculture while simultaneously enhancing its growth and profitability in the face of ever-changing political and ecological landscapes.

Keywords: aquaculture, environmental DNA, shellfish, sustainability

Procedia PDF Downloads 243
24879 Survey on Arabic Sentiment Analysis in Twitter

Authors: Sarah O. Alhumoud, Mawaheb I. Altuwaijri, Tarfa M. Albuhairi, Wejdan M. Alohaideb

Abstract:

Large-scale data stream analysis has become one of the important business and research priorities lately. Social networks like Twitter and other micro-blogging platforms hold an enormous amount of data that is large in volume, velocity, and variety. Extracting valuable information and trends from these data would aid better understanding and decision-making. Multiple analysis techniques have been deployed for English content; however, Arabic is one of the languages that produces a large amount of data over social networks yet remains among the least analyzed. This paper is a survey of the research efforts to analyze Arabic content on Twitter, focusing on the tools and methods used to extract sentiment from that content.

Keywords: big data, social networks, sentiment analysis, twitter

Procedia PDF Downloads 564
24878 Estimating Current Suicide Rates Using Google Trends

Authors: Ladislav Kristoufek, Helen Susannah Moat, Tobias Preis

Abstract:

Data on the number of people who have committed suicide tend to be reported with a substantial time lag of around two years. We examine whether online activity measured by Google searches can help us improve estimates of the number of suicide occurrences in England before official figures are released. Specifically, we analyse how data on the number of Google searches for the terms “depression” and “suicide” relate to the number of suicides between 2004 and 2013. We find that estimates drawing on Google data are significantly better than estimates using previous suicide data alone. We show that a greater number of searches for the term “depression” is related to fewer suicides, whereas a greater number of searches for the term “suicide” is related to more suicides. Data on suicide-related search behaviour can be used to improve current estimates of the number of suicide occurrences.
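
To make the nowcasting idea concrete, the sketch below fits a baseline regression on lagged official counts alone and an augmented regression that also uses search volumes for "depression" and "suicide", then compares out-of-sample errors. The series, the two-year lag, and the linear model are illustrative assumptions; the paper's actual data and model specification may differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 120  # hypothetical monthly observations, 2004-2013

# Hypothetical data: official counts and normalised search volumes.
suicides = 400 + 30 * np.sin(np.arange(n) / 6) + rng.normal(0, 15, n)
searches_suicide = 0.05 * suicides + rng.normal(0, 1, n)           # positively related
searches_depression = 60 - 0.04 * suicides + rng.normal(0, 1, n)   # negatively related

lag = 24  # official figures assumed available only with a ~2-year lag
y = suicides[lag:]
X_base = suicides[:-lag].reshape(-1, 1)               # lagged counts only
X_aug = np.column_stack([suicides[:-lag],
                         searches_suicide[lag:],
                         searches_depression[lag:]])   # plus current search data

split = len(y) * 2 // 3
for name, X in [("lagged counts only", X_base), ("counts + searches", X_aug)]:
    model = LinearRegression().fit(X[:split], y[:split])
    err = mean_absolute_error(y[split:], model.predict(X[split:]))
    print(f"{name}: MAE = {err:.1f}")
```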

Keywords: nowcasting, search data, Google Trends, official statistics

Procedia PDF Downloads 346
24877 Bioinformatic Design of a Non-toxic Modified Adjuvant from the Native A1 Structure of Cholera Toxin with Membrane Synthetic Peptide of Naegleria fowleri

Authors: Frida Carrillo Morales, Maria Maricela Carrasco Yépez, Saúl Rojas Hernández

Abstract:

Naegleria fowleri is the causative agent of primary amebic meningoencephalitis, an acute and fulminant disease that affects humans. It has been reported that, despite the existence of therapeutic options, the mortality rate of this disease is 97%. Therefore, there is a need for vaccines that confer protection against this disease and, in addition, for adjuvants that enhance the immune response. In this regard, our work group obtained a peptide designed from the membrane protein MP2CL5 of Naegleria fowleri, called Smp145, that was shown to be immunogenic; however, it would be of great importance to enhance its immunological response by co-administering it with a non-toxic adjuvant. Therefore, the objective of this work was to carry out the bioinformatic design of a peptide of the Naegleria fowleri membrane protein MP2CL5 conjugated with a non-toxic adjuvant modified from the native A1 structure of Cholera Toxin. Different bioinformatics tools were used to obtain a model with a modification in amino acid 61 of the A1 subunit of the CT (CTA1), to which the Smp145 peptide was added; both molecules were joined with a 13-glycine linker. Regarding the results obtained, the modification in CTA1 bound to the peptide produces a reduction in the toxicity of the molecule in in silico experiments. Likewise, the prediction of the binding of Smp145 to the B-cell receptor suggests that the molecule is directed specifically to the BCR, decreasing its native enzymatic activity. The stereochemical evaluation showed that the generated model has a high number of adequately predicted residues. In the ERRAT test, which evaluates the confidence with which regions exceeding the error values can be rejected, the generated model obtained a high score, indicating that it has a good structural resolution. Therefore, the design of the conjugated peptide in this work will allow us to proceed with its chemical synthesis and subsequently use it in the mouse model of protection against meningitis caused by N. fowleri.

Keywords: immunology, vaccines, pathogens, infectious disease

Procedia PDF Downloads 76
24876 On the Network Packet Loss Tolerance of SVM Based Activity Recognition

Authors: Gamze Uslu, Sebnem Baydere, Alper K. Demir

Abstract:

In this study, the data loss tolerance of a Support Vector Machine (SVM) based activity recognition model and its multi-activity classification performance when data are received over a lossy wireless sensor network are examined. Initially, the classification algorithm we use is evaluated in terms of resilience to random data loss with 3D acceleration sensor data for sitting, lying, walking, and standing actions. The results show that the proposed classification method can recognize these activities successfully despite high data loss. Secondly, the effect of differentiated quality of service performance on activity recognition success is measured with activity data acquired from a multi-hop wireless sensor network, which introduces high data loss. The effect of the number of nodes on reliability and multi-activity classification success is demonstrated in a simulation environment. To the best of our knowledge, the effect of data loss in a wireless sensor network on the activity detection success rate of an SVM-based classification algorithm has not been studied before.
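
As an illustration of the kind of evaluation described above, the sketch below trains an SVM on synthetic 3D-acceleration features for the four activities and then measures accuracy when a fraction of the test windows is randomly dropped, mimicking packet loss. The feature construction and the loss model are assumptions, not the authors' experimental setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
activities = ["sitting", "lying", "walking", "standing"]

# Hypothetical 3D-acceleration feature windows (e.g. mean/variance per axis) per activity.
X, y = [], []
for label, centre in enumerate([0.2, 0.4, 1.5, 0.8]):
    X.append(rng.normal(centre, 0.15, size=(200, 6)))
    y.append(np.full(200, label))
X, y = np.vstack(X), np.concatenate(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)

# Simulate packet loss: drop a fraction of test windows entirely and
# classify only what the sink node actually receives.
for loss in [0.0, 0.3, 0.6]:
    keep = rng.random(len(X_te)) >= loss
    acc = accuracy_score(y_te[keep], clf.predict(X_te[keep]))
    print(f"packet loss {loss:.0%}: accuracy on received windows = {acc:.2f}")
```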

Keywords: activity recognition, support vector machines, acceleration sensor, wireless sensor networks, packet loss

Procedia PDF Downloads 467
24875 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Saleh Hussein Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS has been, and is, ready to be integrated with other systems as well as to serve as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) for excavation permits and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data systems update, data release/reporting, and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that, through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, CMMS

Procedia PDF Downloads 114
24874 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction

Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho

Abstract:

Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the reconstructed image's quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, i.e., sampling density and data incoherence, on the reconstructed image obtained by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We found that, in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data are uniformly sampled.
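
A minimal sketch of why data incoherence matters in this framework: a sparse signal is recovered from few measurements via L1 minimisation, once with an incoherent random measurement matrix and once with a highly coherent one, at several sampling densities. This is a generic 1-D compressive sensing illustration, not the authors' laminography reconstruction code.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, k = 200, 10                       # signal length, number of nonzeros (sparsity)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

def recovery_error(A, x, alpha=1e-3):
    y = A @ x                                         # simulated measurements
    model = Lasso(alpha=alpha, fit_intercept=False, max_iter=50000).fit(A, y)
    return np.linalg.norm(model.coef_ - x) / np.linalg.norm(x)

for m in [40, 80, 120]:                               # sampling density (number of views)
    A_incoherent = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random, incoherent sampling
    A_coherent = np.zeros((m, n))                          # structured, coherent sampling:
    A_coherent[np.arange(m), np.arange(m)] = 1.0           # only the first m entries are seen
    print(f"m={m}: incoherent err={recovery_error(A_incoherent, x):.3f}, "
          f"coherent err={recovery_error(A_coherent, x):.3f}")
```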

Keywords: computed tomography, computed laminography, compressive sensing, low-dose

Procedia PDF Downloads 458
24873 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD

Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik

Abstract:

The IDR/USD exchange rate can serve as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it has a large effect on the Indonesian economy overall, so analysis of exchange rate data is needed. The IDR/USD exchange rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. This method is very effective at identifying the case, yields highly accurate results, and has a simple structure. In this paper, the exchange rate data used are weekly data from December 17, 2010 to November 11, 2014.
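
The decomposition step can be sketched with PyWavelets as below, applied to a hypothetical weekly IDR/USD series. The wavelet family ('db4'), the number of levels, and the synthetic data are assumptions; the fuzzy Mamdani modelling of the coefficients is only indicated in a comment.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(3)
# Hypothetical weekly IDR/USD rates (the paper uses 17 Dec 2010 - 11 Nov 2014).
weeks = 205
rate = 9000 + np.cumsum(rng.normal(5, 40, weeks))

# Decompose the series into an approximation (trend) and detail (frequency) parts.
coeffs = pywt.wavedec(rate, "db4", level=3)           # [cA3, cD3, cD2, cD1]
approx, details = coeffs[0], coeffs[1:]

# A fuzzy Mamdani model would now be fitted on these coefficients to forecast
# the next value; here we only reconstruct the smoothed trend as a placeholder.
smooth = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "db4")[:weeks]
print("last observed rate:", round(rate[-1], 1), " smoothed trend:", round(smooth[-1], 1))
```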

Keywords: the exchange rate, fuzzy mamdani, discrete wavelet transforms, fuzzy wavelet

Procedia PDF Downloads 554
24872 Humanising Digital Healthcare to Build Capacity by Harnessing the Power of Patient Data

Authors: Durhane Wong-Rieger, Kawaldip Sehmi, Nicola Bedlington, Nicole Boice, Tamás Bereczky

Abstract:

Patient-generated health data should be seen as the expression of the experience of patients, including the outcomes reflecting the impact a treatment or service had on their physical health and wellness. We discuss how the healthcare system can reach a place where digital is a determinant of health, where data generated by patients is respected and their contribution to science is acknowledged, and we explore the biggest barriers facing this. The International Experience Exchange with Patient Organisations' Position Paper is based on a global patient survey conducted in Q3 2021 that received 304 responses; the results were discussed and validated by 15 patient experts and supplemented with literature research. The results presented here are a subset of that work. Our research showed that patient communities want to influence how their data is generated, shared, and used, and that they feel there is a lack of clear policies on sharing data. Our study concludes that a reasonable framework is needed to protect the integrity of patient data, minimise abuse, and build trust, and that patient communities need more influence and control over how health data is generated, shared, and used.

Keywords: digital health, equitable access, humanise healthcare, patient data

Procedia PDF Downloads 73
24871 Use of Machine Learning in Data Quality Assessment

Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho

Abstract:

Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish or maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, who use these data in decision-making and company strategies. Information that reaches low levels of quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and bad risk management actions. The step of identifying, evaluating, and selecting data sources of adequate quality according to the need has become a costly task for users, since the sources do not provide information about their quality. Traditional data quality control methods are based on user experience or business rules, limiting performance and slowing down the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value to companies and users. In this study, machine learning is applied to data quality analysis on different datasets, seeking to compare the performance of the techniques according to the dimensions of quality assessment. As a result, we were able to create a ranking of the approaches used, as well as a system that is able to carry out data quality assessment automatically.
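
A minimal sketch of the idea, under assumptions: each record is described by quality-related features (missing-field ratio, format violations, staleness), a label marks records judged fit for use, and two candidate models are ranked by cross-validated accuracy. The features, labels, and models are illustrative, not the datasets or techniques used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 2000

# Hypothetical per-record quality features: missing-field ratio, format-violation
# count, and staleness in days; the label marks records judged fit for use.
missing_ratio = rng.beta(1, 5, n)
format_errors = rng.poisson(0.5, n)
staleness_days = rng.exponential(30, n)
X = np.column_stack([missing_ratio, format_errors, staleness_days])
y = ((missing_ratio < 0.2) & (format_errors == 0) & (staleness_days < 60)).astype(int)

# Rank candidate approaches by cross-validated accuracy on the quality-assessment task.
for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=100, random_state=0))]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```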

Keywords: machine learning, data quality, quality dimension, quality assessment

Procedia PDF Downloads 139
24870 The ICC, International Criminal Justice and International Politics

Authors: Girma Y. Iyassu Menelik

Abstract:

The international community has gone through indescribable atrocities resulting from acts of war. These atrocities turned Europe and Africa into a wilderness of bloodshed and crime. In the 1960s and 1970s, Africa witnessed unprecedented and well-documented assaults on life and property. This necessitated the adoption, signing, and ratification of the statute of the International Criminal Court and the establishment of the International Court of Justice, which is a great achievement for the protection and fulfilment of human rights in the context of international political instability. The ICC came as an important opportunity to advance justice for serious crimes committed in violation of international law. Thus, the Rome Statute has become a formidable contribution to peace and security. There are concerns that the ICC is targeting African states. However, the ICC cannot preside over cases involving states that are not parties to the Rome Statute unless the UN Security Council refers the situation or the relevant state asks the court to become involved. The unstable international political situation thus deals with criminal prosecutions where amnesty is not permissible or is strongly repudiated. The courts have become important justice instruments for states that are unable or unwilling to fulfil their obligation to address legacies of massive human rights violations. The ICJ as a court has a twofold role: to settle legal disputes submitted to it by states, and to give advisory opinions on legal questions referred to it by duly authorized United Nations organs and specialized agencies. All members of the UN are ipso facto parties to the Statute of the ICJ, and the court gives advisory opinions on any legal question. These courts are the most appropriate fora to pronounce on international crimes and are in a better position to know and apply international law. Cases that have been brought to the courts include Rwanda's genocide and Liberia's Charles Taylor. The receptiveness and cooperation of the local populations are important to the courts, and if the ICC and ICJ can provide appropriate protections for the physical and economic safety of victims, then peace and the observance of human rights can be attained. This paper looks into the effectiveness of and impediments to these courts in handling crimes and injustices in international politics, as well as what needs to be done to strengthen their capacity.

Keywords: ICC, international politics, justice, UN security council, violence, protection, fulfilling

Procedia PDF Downloads 436
24869 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications related to human experiments are frequently linked to reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One of the naive approaches usually applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
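
The scaling pitfall described above can be sketched as follows: the leaky variant standardises the whole dataset before cross-validation, while the correct variant refits the scaler inside each training fold via a pipeline. The synthetic "SSVEP" features and the SVM classifier are assumptions for illustration only.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
# Hypothetical small BCI dataset: 40 trials x 32 features, 2 SSVEP target classes.
X = rng.normal(size=(40, 32))
y = rng.integers(0, 2, 40)

# Leaky: the scaler sees the test folds before cross-validation is run.
X_scaled = StandardScaler().fit_transform(X)
leaky = cross_val_score(SVC(), X_scaled, y, cv=5).mean()

# Correct: scaling is refitted on each training fold only.
pipe = make_pipeline(StandardScaler(), SVC())
clean = cross_val_score(pipe, X, y, cv=5).mean()

print(f"scaled before CV (leaky): {leaky:.2f}  |  scaled inside CV: {clean:.2f}")
```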

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 146
24868 The Responsible Lending Principle in the Spanish Proposal of the Mortgage Credit Act

Authors: Noelia Collado-Rodriguez

Abstract:

The Mortgage Credit Directive 2014/17/EU should have been transposed by 21 March 2016. However, in Spain not only was the deadline missed, but currently there is only a preliminary draft of the so-called Mortgage Credit Act. Before analyzing the preliminary draft from the standpoint of the responsible lending principle, it should be pointed out that this preliminary draft is not a consumer law statute. Throughout the text of the preliminary draft there is no reference to the consumer, only to the borrower. Furthermore, and more importantly, the application of this statute would not, according to its text, be circumscribed to borrowers who take the credit for a personal purpose. Instead, it seems that the preliminary draft aims to be one more of the rules of banking transparency that already exist in Spanish legislation. In this sense, the sanctions contained in the preliminary draft refer to these laws of banking ordination and oversight, to which the rules of banking transparency belong. This might be against the spirit of the Mortgage Credit Directive, which allows the extension of its scope to credit aimed at acquiring immovable property other than residential property; however, the borrower has to be a consumer according to the Directive. It is quite relevant that the prospective Spanish Mortgage Credit Act might not be a consumer protection statute, especially from the perspective of the responsible lending principle. The responsible lending principle is a consumer law principle based on the structural weakness of the consumer's position in the relationship with the creditor. Therefore, it is not surprising that the Spanish preliminary draft does not state any of the pre-contractual conducts that express the responsible lending principle. We are referring to the lender's duty to provide adequate explanations; the consumer's suitability test; the lender's duty to assess the consumer's creditworthiness; the consultation of databases to perform the creditworthiness assessment; and, most importantly, the lender's prohibition on granting credit in the case of a negative creditworthiness assessment. The preliminary draft merely entitles the Economy Ministry to enact provisions related to those topics. Thus, the duties and rules derived from the responsible lending principle included in the EU Directive will not have legal character in Spain, being mere administrative regulations. To conclude, the two main questions that arise after reading the Spanish Mortgage Credit Act preliminary draft are, first, what consequences might follow from the Mortgage Credit Act if it is ultimately not a consumer law statute, and second, what the consequences might be for the responsible lending principle of being developed by administrative regulations instead of by legislation.

Keywords: consumer credit, consumer protection, creditworthiness assessment, responsible lending

Procedia PDF Downloads 280
24867 Assessment of Population Trends of Birds at Taunsa Barrage Wildlife Sanctuary, Pakistan

Authors: Fehmeada Bibi, Shafqat Nawaz Qaisrani, Masood Akhtar, Zulfiqar Ali

Abstract:

The study of population trends is an important tool for conservation programs for rare as well as common species of birds. A study was conducted to assess annual declines in bird species and to identify the causes of these declines at Taunsa Barrage Wildlife Sanctuary, Punjab, Pakistan. Data were collected by the direct census method during wintering and breeding periods (2001 to 2002 and 2008 to 2011). The results indicated an increasing trend in 157 species and a decreasing trend in 14 species of birds. Among the species with a declining trend, there was a 92% decrease in the White-backed Vulture (Gyps bengalensis), 60% in the Greater Painted Snipe (Rostratula benghalensis), 57% in the Garganey (Anas querquedula), 50% each in Pallas's Fish Eagle and the Long-legged Buzzard (Buteo rufinus), 41% in the Grey Heron (Ardea cinerea), 39% in the Little Cormorant (Phalacrocorax niger), 37% in the Gadwall (Anas strepera), 33% in the Marsh Harrier (Circus aeruginosus), 30% in the Black Drongo (Dicrurus macrocercus), and 26% in the Red-crested Pochard (Netta rufina) population. Habitat exploitation, hunting, and grazing were found to be the main causes of this decline. In conclusion, conservation and management of the study area are paramount to the interests of the declining bird populations. It is therefore suggested to take immediate steps for the protection of the sanctuary to conserve the declining populations of birds.

Keywords: population trends, wildlife sanctuary, bird, habitat exploitation

Procedia PDF Downloads 264
24866 Nuclear Decay Data Evaluation for 217Po

Authors: S. S. Nafee, A. M. Al-Ramady, S. A. Shaheen

Abstract:

Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2, the α-, β−-, and γ-ray emission energies, and their probabilities. Decay data from 221Rn α decay and 217Bi β− decay are presented. Q(α) has been updated based on the recently published Atomic Mass Evaluation AME2012. In addition, the log ft values were calculated using the LOGFT program from the ENSDF evaluation package. Moreover, the total internal conversion coefficients have been calculated using the BrIcc program. Meanwhile, recommended values for the multipolarities have been assigned based on recent measurements that yield a better intensity balance at the 254 keV and 264 keV gamma transitions.

Keywords: nuclear decay data evaluation, mass evaluation, total conversion coefficients, atomic mass evaluation

Procedia PDF Downloads 422
24865 The Effect of Technology on Human Rights Rules

Authors: Adel Fathy Sadek Abdalla

Abstract:

The issue of respect for human rights in Southeast Asia has become a major concern and is attracting the attention of the international community. The Association of Southeast Asian Nations (ASEAN) made human rights one of its main issues in the ASEAN Charter in 2008. Subsequently, the ASEAN Intergovernmental Commission on Human Rights (AICHR) was established. AICHR is the Southeast Asian human rights commission charged with the responsibilities, functions, and powers to promote and protect human rights. However, at the end of 2016, the protective function assigned to the AICHR had not yet been fulfilled. This is shown by several cases of human rights violations that are still ongoing and have not yet been resolved. One case that has recently come to light is the human rights violations against the Rohingya people in Myanmar. Using a legal-normative approach, the study examines the urgency of establishing a human rights tribunal in Southeast Asia capable of making decisions binding on ASEAN members or guilty parties. The data show that ASEAN needs a regional court to deal with human rights abuses in the ASEAN region. In addition, the study highlights three important factors that ASEAN should consider when establishing a human rights tribunal: the significant differences in democracy and human rights development among the members, consistent implementation of the principle of non-interference, and the financial question of sustaining the court.

Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security

Procedia PDF Downloads 30
24864 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information

Authors: I. Nyoman Mahayasa Adiputra

Abstract:

Data in the field of health can be useful for the purposes of data analysis; one example of health data is disease data. Disease data are usually plotted geographically according to the area where the data were collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form; disease information has not yet been mapped in GIS form. In this research, disease information for Denpasar city is digitized in the form of a geographic information system, with the district as the smallest administrative unit. Denpasar City consists of four districts: North Denpasar, East Denpasar, West Denpasar, and South Denpasar. In this research, we use Google Fusion Table technology for the map digitization process, a technology that simplifies the work of both the administrator and the recipient of the information. On the administrator side, disease data can be entered easily and quickly. On the receiving side, the resulting GIS application can be published as a website-based application so that it can be accessed anywhere and at any time. In general, the results obtained in this study are divided into two parts. (1) Geolocation of Denpasar and all of its districts: the process of digitizing the map of Denpasar city produces a polygon geolocation for each district of the city. These results can be utilized in subsequent GIS studies that use the same administrative areas. (2) Dengue fever mapping for 2014 and 2015: the disease data used in this study are dengue fever case data for 2014 and 2015, taken from the Denpasar Health Department profile reports of 2015 and 2016. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.

Keywords: geographic information system, Google fusion table technology, delivery of disease data information, Denpasar city

Procedia PDF Downloads 119
24863 Inclusive Practices in Health Sciences: Equity Proofing Higher Education Programs

Authors: Mitzi S. Brammer

Abstract:

Given that the cultural make-up of programs of study in institutions of higher learning is becoming increasingly diverse, much has been written about cultural diversity from a university-level perspective. However, there are few data on specific programs and how they address inclusive practices when teaching and working with marginalized populations. This research study aimed to discover the baseline knowledge and attitudes of health sciences faculty, instructional staff, and students related to inclusive teaching/learning and interactions. Quantitative data were collected via an anonymous online survey (one designed for students and another for faculty/instructional staff) using a web-based program called Qualtrics. Quantitative data were analyzed for faculty/instructional staff and students, respectively, using descriptive and comparative statistics (t-tests). Additionally, some participants voluntarily engaged in a focus group discussion in which qualitative data were collected around these same variables. Collecting qualitative data to triangulate the quantitative data added trustworthiness to the overall dataset. The research team analyzed the collected data, identified categories and trends, compared those data between faculty/staff and students, and reported the results as well as implications for future study and professional practice.

Keywords: inclusion, higher education, pedagogy, equity, diversity

Procedia PDF Downloads 53
24862 An Analysis of Sequential Pattern Mining on Databases Using Approximate Sequential Patterns

Authors: J. Suneetha, Vijayalaxmi

Abstract:

Sequential pattern mining involves applying data mining methods to large data repositories to extract usage patterns. Sequential pattern mining methodologies are used to analyze the data and identify patterns. The patterns have been used to implement efficient systems that can recommend based on previously observed patterns, make predictions, improve the usability of systems, detect events, and, in general, help in making strategic product decisions. In this paper, the performance of approximate sequential pattern mining is examined, where approximate sequential patterns are defined as patterns approximately shared by many sequences. Approximate sequential patterns can effectively summarize and represent the databases by identifying the underlying trends in the data. An extensive and systematic performance evaluation was conducted over synthetic and real data. The results demonstrate that ApproxMAP is effective and scalable in mining large sequence databases with long patterns.
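
A much-simplified, pure-Python sketch of the approximate-pattern idea: sequences are greedily grouped by edit distance, and each group's pattern keeps only the items shared by most of its members. ApproxMAP itself uses clustering and multiple alignment, so this is an illustration of the concept rather than the algorithm.

```python
from collections import Counter

def edit_distance(a, b):
    """Levenshtein distance between two item sequences."""
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, item in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (x != item))
    return dp[-1]

def approximate_patterns(sequences, radius=2, support=0.6):
    # Greedy grouping: a sequence joins the first group whose representative is close enough.
    groups = []
    for seq in sequences:
        for rep, members in groups:
            if edit_distance(seq, rep) <= radius:
                members.append(seq)
                break
        else:
            groups.append((seq, [seq]))
    # Consensus: keep items of the representative occurring in >= support of the members.
    patterns = []
    for rep, members in groups:
        counts = Counter(item for s in members for item in set(s))
        patterns.append([it for it in rep if counts[it] / len(members) >= support])
    return patterns

seqs = [list("abcde"), list("abde"), list("abcdx"), list("xyz"), list("xyzq")]
print(approximate_patterns(seqs))
```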

Keywords: multiple data, performance analysis, sequential pattern, sequence database scalability

Procedia PDF Downloads 328
24861 Efficacy of Coconut Shell Pyrolytic Oil Distillate in Protecting Wood Against Bio-Deterioration

Authors: K. S. Shiny, R. Sundararaj

Abstract:

Coconut trees (Cocos nucifera L.) are grown in many parts of India and the world because of their multiple uses. During pyrolysis, coconut shells yield an oil, which is a dark, thick liquid. Upon simple distillation it produces a more or less colourless liquid, termed coconut shell pyrolytic oil distillate (CSPOD). This manuscript reports and discusses the use of coconut shell pyrolytic oil distillate as a potential wood protectant against bio-deterioration. Since botanical products are being tested worldwide as eco-friendly wood protectants, the utilization of CSPOD as a wood protectant is of great importance. The efficacy of CSPOD as a wood protectant was evaluated as per Bureau of Indian Standards (BIS) procedures in terms of its antifungal, antiborer, and termiticidal activities. Specimens of rubber wood (Hevea brasiliensis), in six replicates each for two treatment methods, namely spraying and dipping (48 h), were employed. CSPOD was found to impart total protection against termites for six months compared to the control under field conditions. For assessing the efficacy of CSPOD against fungi, the treated blocks were subjected to the attack of two white rot fungi, Tyromyces versicolor (L.) Fr. and Polyporus sanguineus (L.) G. Mey, and two brown rot fungi, Polyporus meliae (Underw.) Murrill and Oligoporus placenta (Fr.) Gilb. & Ryvarden. Results indicated that treatment with CSPOD significantly protected wood from the damage caused by the decay fungi. Evaluation of the efficacy of CSPOD against the wood borer Lyctus africanus Lesne was carried out using six pairs of male and female beetles, and it gave promising results in protecting the treated wood blocks when compared to control blocks. As far as the treatment methods were concerned, dip treatment was found to be more effective than spraying. The results of the present investigation indicated that CSPOD is a promising botanical compound which has the potential to replace synthetic wood protectants. As coconut shell pyrolytic oil is a waste byproduct of the coconut shell charcoal industry, its utilization as a wood preservative will expand the economic returns from such industries.

Keywords: coconut shell pyrolytic oil distillate, eco-friendly wood protection, termites, wood borers, wood decay fungi

Procedia PDF Downloads 359
24860 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management aims to acquire and represent knowledge relevant to a domain, a task, or a specific organization in order to facilitate its access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is, to spread it in order to enable its effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous data in the medical field by creating a data warehouse, a technique for extracting knowledge from medical data by choosing a data mining technique, and finally a technique for exploiting that knowledge in a case-based reasoning system.
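
A minimal sketch of the final exploitation step only: a case-based reasoning retrieval over a handful of cases assumed to have been extracted from the warehouse, where the nearest stored case is found and its solution reused. The case attributes, the scaling, and the care pathways are hypothetical.

```python
import numpy as np

# Hypothetical cases extracted from the medical data warehouse after the mining step:
# (age, systolic BP, glucose) -> recommended care pathway.
case_base = [
    (np.array([65, 150, 180]), "pathway A"),
    (np.array([40, 120, 95]),  "pathway B"),
    (np.array([72, 165, 210]), "pathway A"),
    (np.array([55, 135, 140]), "pathway C"),
]

def retrieve(query, cases):
    """Return the most similar stored case (Euclidean distance on scaled features)."""
    features = np.array([c[0] for c in cases], dtype=float)
    scale = features.std(axis=0)                  # simple per-feature scaling
    dists = np.linalg.norm((features - query) / scale, axis=1)
    best = int(np.argmin(dists))
    return cases[best], dists[best]

new_patient = np.array([68, 155, 190])
(case, solution), d = retrieve(new_patient, case_base)
print(f"closest case {case.tolist()} (distance {d:.2f}) -> reuse solution: {solution}")
```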

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 382
24859 The Importance of Intellectual Property for Universities of Technology in South Africa: Challenges Faced and Proposed Way Forward

Authors: Martha E. Ikome, John M. Ikome

Abstract:

Intellectual property should be a day-to-day business consideration because of its value, but a number of institutions are still not aware of its importance. Intellectual Property (IP) and its value are often not adequately appreciated. In the increasingly knowledge-driven economy, IP is a key consideration in day-to-day business decisions because new ideas and products appear almost daily in the market, resulting in continuous innovation and research. Therefore, this paper focuses on the importance of IP for universities of technology, demonstrates how IP can become an economic tool, and discusses the challenges faced by these universities in implementing an IP system.

Keywords: intellectual property, institutions, challenges, protection

Procedia PDF Downloads 363
24858 Protection of the Valves against AC Faults Using the Fast-Acting HVDC Controls

Authors: Mesbah Tarek, Kelaiaia Samia, Chiheb Sofien, Kelaiaia Mounia Samira, Labar Hocine

Abstract:

Short circuits cause significant damage in power systems. The aim of this paper is to investigate the effect of a short circuit on the AC side of the inverter in an HVDC transmission line. The outage of an HVDC transmission line implies significant economic losses. In this paper, an efficient procedure is proposed that can protect the line and clear the fault quickly. The theoretical development and simulation are detailed and well illustrated.

Keywords: AC inverter, HVDC, short circuit, switcher gate, power system

Procedia PDF Downloads 551
24857 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired from airborne scanners over densely built urban areas. On the one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other, low resolution LiDAR data in the form of a normalized Digital Surface Model (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology for the 3D reconstruction of buildings in a pilot region of Athens, Greece, results in a significant visual improvement of the 3D building block model.
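
A simplified sketch of joint spatial-range mean shift filtering on a tiny synthetic RGB-z tile, showing the edge-preserving smoothing effect; the bandwidths, window size, and data are assumptions, and the actual pipeline operates on full airborne image and nDSM data.

```python
import numpy as np

def mean_shift_filter(img, h_s=2.0, h_r=0.15, iters=3, win=2):
    """Edge-preserving smoothing: each pixel moves toward the mean of
    neighbours that are close both spatially and in value (range)."""
    out = img.astype(float)
    rows, cols, _ = out.shape
    for _ in range(iters):
        prev = out.copy()
        for r in range(rows):
            for c in range(cols):
                r0, r1 = max(0, r - win), min(rows, r + win + 1)
                c0, c1 = max(0, c - win), min(cols, c + win + 1)
                patch = prev[r0:r1, c0:c1].reshape(-1, prev.shape[2])
                rr, cc = np.mgrid[r0:r1, c0:c1]
                d_s = ((rr - r) ** 2 + (cc - c) ** 2).reshape(-1) / h_s ** 2
                d_r = np.sum((patch - prev[r, c]) ** 2, axis=1) / h_r ** 2
                w = np.exp(-0.5 * (d_s + d_r))          # joint spatial-range kernel
                out[r, c] = (w[:, None] * patch).sum(0) / w.sum()
    return out

# Tiny synthetic RGB-z tile: two flat regions plus noise (z = normalised height).
rng = np.random.default_rng(6)
tile = np.zeros((16, 16, 4))
tile[:, 8:] = [0.8, 0.2, 0.2, 0.6]                      # "building" block
tile += rng.normal(0, 0.05, tile.shape)                  # compression-like noise
smoothed = mean_shift_filter(tile)
print("noise std before:", tile[:, :8].std().round(3), "after:", smoothed[:, :8].std().round(3))
```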

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift

Procedia PDF Downloads 307
24856 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the GIS data submission process and the journey of developing the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS has been, and is, ready to be integrated with other systems as well as to serve as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data systems update, data release/reporting, and data alterations has also helped to reduce the missing attributes of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2021. Overall, it is concluded that, through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS

Procedia PDF Downloads 200
24855 Implementation of Environmental Sustainability into Event Management

Authors: Özlem Küçükakça

Abstract:

The world population is rapidly growing. In the last few decades, environmental protection and climate change have become global concerns. All events have their own ecological footprint; therefore, all participants who take part in events, from the event organizer to the audience, should be responsible for reducing carbon emissions. Currently, there is a gap in the literature investigating the relationship between events and the environment. Hence, this study was conducted to investigate how to implement environmental sustainability in event management. To this end, a wide body of literature as well as the UK festivals database was investigated. Finally, environmental effects and solutions for reducing impacts at events are discussed.

Keywords: ecological footprint, environmental sustainability, events, sustainability

Procedia PDF Downloads 295
24854 Evaluation of Practicality of On-Demand Bus Using Actual Taxi-Use Data through Exhaustive Simulations

Authors: Jun-ichi Ochiai, Itsuki Noda, Ryo Kanamori, Keiji Hirata, Hitoshi Matsubara, Hideyuki Nakashima

Abstract:

We conducted exhaustive simulations for data assimilation and for the evaluation of service quality under various settings in a new shared transportation system called SAVS. Computational social simulation is a key technology for designing recent social services like SAVS as a new transportation service. One open issue in SAVS was to determine the service scale through social simulation. Using our exhaustive simulation framework, OACIS, we performed data assimilation and evaluated the effects of SAVS based on actual taxi-use data from Tajimi City, Japan. Finally, we obtained the conditions to realize the new service with a reasonable service quality.

Keywords: on-demand bus system, social simulation, data assimilation, exhaustive simulation

Procedia PDF Downloads 302
24853 Hardness Map of Human Tarsals, Metatarsals and Phalanges of Toes

Authors: Irfan Anjum Manarvi, Zahid Ali Kaimkhani

Abstract:

Predicting the location of fractures in human bones has been a keen area of research for the past few decades. A variety of tests for hardness, deformation, and strain field measurement have been conducted in the past but were considered insufficient due to various limitations. Researchers have therefore proposed further studies, citing inaccuracies in measurement methods, testing machines, and experimental errors. Advancement and availability of hardware, measuring instrumentation, and testing machines can now provide remedies to these limitations. The human foot is a critical part of the body, exposed to various forces throughout its life. A number of products have been developed for its protection and care, which often do not provide sufficient protection and may themselves become a source of stress because the delicacy of the bones in the feet is not considered. Continuous strain on or overloading of the feet may occur, resulting in discomfort and even fracture. The mechanical properties of the tarsals, metatarsals, and phalanges are, therefore, the primary consideration for all such design applications. Hardness is one of the mechanical properties considered very important for establishing the mechanical resistance of a material against applied loads. Past researchers have worked on investigating the mechanical properties of these bones. However, their results were based on a limited number of experiments and on average hardness values, owing to limitations of either samples or testing instruments, and they therefore proposed further studies in this area. The present research has been carried out to develop a hardness map of the human foot by measuring microhardness at various locations on these bones. Results are compiled in the form of the distance from a reference point on a bone and the hardness values for each surface. The number of test results is far greater than in previous studies, and the results are spread over a typical bone to give a complete hardness map of these bones. These results could also be used to establish other properties, such as the stress and strain distribution in the bones. Industrial engineers could also use them for the design and development of various accessories for human foot health care and comfort, and for further research in the same areas.

Keywords: tarsals, metatarsals, phalanges, hardness testing, biomechanics of human foot

Procedia PDF Downloads 416
24852 Ecological Effect on Aphid Population in Safflower Crop

Authors: Jan M. Mari

Abstract:

Safflower is a renowned drought-tolerant oilseed crop. Previously, its flowers were used for cooking and herbal medicines in China, and it was cultivated by small growers for their personal oil needs. A field study was conducted at the experimental field of the faculty of crop protection, Sindh Agricultural University Tandojam, during winter 2012-13, to observe ecological effects on the aphid population in the safflower crop. The aphid population gradually increased with the growth of safflower, reached its maximum number of aphids per leaf in the third week of February, and decreased in March as the crop matured. A non-significant interaction with temperature was found for aphid, zigzag, and hoverfly, whereas a highly significant interaction with temperature was found for 7-spotted, lacewing, 9-spotted, and Brumus, respectively. The data revealed that the overall mean population of zigzag was highest, followed by 9-spotted, 7-spotted, lacewing, hoverfly, and Brumus, respectively. Initially, the predator-prey ratio indicated that there was not a large difference between predators and prey. After January 1st, the aphid population increased sharply until February 18th, establishing a significant difference in the predator-prey ratio. After that, the aphid population started decreasing, which affected the ratio between the pest and its predators. It is concluded that the biotic factors 7-spotted, zigzag, 9-spotted, Brumus, and lacewing exhibited a strong and positive correlation with the aphid population. It is suggested that the aphid pest should be monitored regularly and, before it reaches the economic threshold level, augmentation of natural enemies may be managed.

Keywords: aphid, ecology, population, safflower

Procedia PDF Downloads 250
24851 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information—for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data are used. Usually, however, the proportion of complete data is rather small, which leads to most of the information being neglected. The complete data may also be strongly distorted. In addition, the reason that data are missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete data (the baseline). It is also interesting to see how different algorithms affect the result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set—the complete data—vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data. After finding the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
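
A sketch of the two steps under assumptions: a KMeans-based iterative imputation (one of the clustering options mentioned above) fills missing maximum prices in synthetic search subscriptions, and an empirical survival function then gives the share of subscribers willing to pay at least a given price. Field names, thresholds, and data are illustrative, not the study's actual subscriptions or models.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n = 500

# Hypothetical search subscriptions: desired rooms and maximum price (CHF),
# with roughly 30% of the price entries missing.
rooms = rng.integers(1, 6, n).astype(float)
price = 2000 + 900 * rooms + rng.normal(0, 400, n)
mask = rng.random(n) < 0.3
price_obs = price.copy()
price_obs[mask] = np.nan

# KMeans-based iterative imputation: start from the column mean, then repeatedly
# cluster and replace missing values with the mean price of observed cluster members.
X = np.column_stack([rooms, np.where(mask, np.nanmean(price_obs), price_obs)])
for _ in range(10):
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
    for k in range(5):
        in_k = labels == k
        obs = in_k & ~mask
        if obs.any():
            X[mask & in_k, 1] = X[obs, 1].mean()

# Empirical survival function: share of subscribers willing to pay at least p.
grid = np.linspace(2500, 7500, 6)
for p in grid:
    print(f"P(willing to pay >= {p:.0f} CHF) = {(X[:, 1] >= p).mean():.2f}")
```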

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 277