Search results for: data integrity and privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25452

24132 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data

Authors: Nasser A. Al-Azri

Abstract:

The effectiveness of passive cooling techniques is assessed using bioclimatic charts, which require the typical meteorological year (TMY) of a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, an essential component of common TMYs intended for general use. Since solar radiation is not required in the development of the bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes TMY development more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
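The month-selection step behind TMY construction can be illustrated with the Finkelstein-Schafer (FS) statistic, the comparison commonly used to pick a "typical" month for each parameter; the paper's exact procedure may differ, and the temperature samples below are invented for illustration:

```python
# Finkelstein-Schafer statistic: mean absolute difference between a
# candidate month's empirical CDF of a parameter (e.g. dry-bulb
# temperature) and the long-term CDF for that calendar month.

def empirical_cdf(sample, xs):
    n = len(sample)
    return [sum(v <= x for v in sample) / n for x in xs]

def fs_statistic(candidate, long_term):
    xs = sorted(set(candidate) | set(long_term))
    cand = empirical_cdf(candidate, xs)
    lt = empirical_cdf(long_term, xs)
    return sum(abs(a - b) for a, b in zip(cand, lt)) / len(xs)

# Toy long-term June temperature distribution and two candidate Junes:
long_term = [20, 21, 22, 23, 24, 25, 26, 27]
june_1999 = [20, 22, 24, 26]   # tracks the long-term distribution
june_2003 = [30, 31, 32, 33]   # an unusually hot June
best = min((fs_statistic(m, long_term), name)
           for m, name in [(june_1999, "1999"), (june_2003, "2003")])
```

The candidate month with the smallest FS value (here the 1999 June) is the one whose distribution best matches the long-term record, and is selected for the TMY.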

Keywords: bioclimatic charts, passive cooling, TMY, weather data

Procedia PDF Downloads 236
24131 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach

Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon

Abstract:

Defensive Modeling and Simulation (M&S) is a system that enables training which would otherwise be impracticable, reducing constraints of time, space, and financial resources. The necessity of defensive M&S has been increasing not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not been managed and utilized, owing to the nature of military organizations: confidentiality and frequent change of members. Therefore, this study aims to develop a management system for the experience of defensive M&S based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S is both quantitative and qualitative, a data mining approach is appropriate for dealing with it. This research is expected to be helpful for soldiers and military policy makers.

Keywords: data mining, defensive M&S, management system, knowledge management

Procedia PDF Downloads 244
24130 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes are important for maintaining good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The method is effective in representing fault patterns of process data, and it is quite insensitive to measurement noise, so that reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in detection and identification was tested with different simulation data. The use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 232
24129 Ideological and Poetological Tensions: Wu Mi’s Enterprise of Imitating and Translating George Gordon Byron

Authors: Hanjin Yan

Abstract:

The English Romantic George Gordon Byron (1788-1824) was widely celebrated by men of letters in early republican China as a Satanic freedom fighter challenging classical poetics and traditional values. However, Wu Mi (1894-1978), the most persistent critic of contemporary iconoclasm, perceived Byron as a paragon of self-righteous poet-exiles who maintained moral integrity and achieved poetic excellence during times of frustration, just like canonized classical Chinese poets. Wu Mi not only composed lengthy imitations of the third canto of Byron’s Childe Harold’s Pilgrimage (1816) but also patronized a rendering of the canto. Taking André Lefevere’s rewriting theory as a framework, this paper explores the interplay of ideology and poetics by examining Wu Mi’s imitations against Byron’s original and its Chinese translation patronized by Wu Mi. Textual analysis shows that Wu Mi’s approach to Byron’s poetry was informed not only by his endeavor to invigorate classical Chinese poetics, but also by his program to preserve China’s cultural traditions and integrate Western new humanism, a theory proposed by his Harvard mentor Irving Babbitt (1865-1933). This study reveals how Byron was appropriated to serve conflicting poetic and ideological purposes in early republican China and suggests that imitation as a type of rewriting merits further attention.

Keywords: George Gordon Byron, ideology, imitation, poetics, translation

Procedia PDF Downloads 220
24128 Imputation of Urban Movement Patterns Using Big Data

Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson

Abstract:

Big data typically refers to consumer datasets revealing detailed heterogeneity in human behavior which, if harnessed appropriately, could revolutionize our understanding of collective phenomena in the physical world. Inadvertent missing values skew these datasets and compromise the validity of inferences drawn from them. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with available big data, to plug the gaps and to create a rich, comprehensive dataset for subsequent analysis. Specifically, the emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (like a change in timetable or the impact of a new station), and explain local phenomena outside the network (like rail-heading) and other impacts of urban morphology. Our analysis also reveals that our new imputation data model provides for more equitable revenue sharing amongst network operators who manage different parts of the integrated UK railways.

Keywords: big data, micro-simulation, mobility, ticketing data, commuters, transport, synthetic population

Procedia PDF Downloads 224
24127 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the “bundle of rights” theory, it establishes a specific three-level structure of data rights, and it analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Limited, Campbell v. MGN, and Imerman v. Tchenquiz. The paper concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property would help establish the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 31
24126 The Influence of Housing Choice Vouchers on the Private Rental Market

Authors: Randy D. Colon

Abstract:

Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households have been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data collected at community meetings in Chicago's Englewood neighborhood, through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, qualitative data are necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.

Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market

Procedia PDF Downloads 112
24125 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
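As a toy illustration of the idea (the field names and types here are hypothetical, not the schema proposed in the paper), a dynamic metadata schema can be treated as a fixed core vocabulary merged with extensions that an experiment declares at acquisition time:

```python
# Core fields every record must carry; names and types are illustrative only.
CORE_FIELDS = {"sample_id": str, "technique": str, "detector": str}

def validate(record, dynamic_fields):
    """Check a record against the core schema plus a per-experiment
    extension declared at acquisition time."""
    schema = {**CORE_FIELDS, **dynamic_fields}
    missing = [k for k in schema if k not in record]
    wrong_type = [k for k, t in schema.items()
                  if k in record and not isinstance(record[k], t)]
    return missing, wrong_type

# An XPCS run extends the core with correlation-analysis fields:
xpcs_ext = {"exposure_time_s": float, "q_range_nm": list}
record = {"sample_id": "S42", "technique": "XPCS", "detector": "EIGER",
          "exposure_time_s": 0.1, "q_range_nm": [0.01, 0.5]}
missing, wrong_type = validate(record, xpcs_ext)
```

Because the extension travels with the data, downstream consumers can validate and trace records without the schema being frozen in advance, which is the flexibility the dynamic approach aims at.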

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 58
24124 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

Whether data are well parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelism is affected by the allocation scheme, which is directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. It is therefore hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated a number of mixed data patterns and analyzed the results to help make the right choice for better performance. The results show that if the data arrival interval is long enough for prior operations to finish, and in continuous read-intensive data environments, static allocation is more suitable. Dynamic allocation performs best for write performance and random data patterns.
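The trade-off can be seen in a toy channel-level sketch (the channel count and round-robin policy are illustrative; real SSDs also exploit way, plane, and die parallelism):

```python
N_CHANNELS = 4

def static_channel(lpn):
    # Static allocation: a logical page number maps to a fixed channel,
    # so later reads of that page always know where to go in parallel.
    return lpn % N_CHANNELS

class DynamicAllocator:
    # Dynamic allocation: each incoming write goes to the next channel
    # in turn, maximizing write parallelism regardless of address.
    def __init__(self, n=N_CHANNELS):
        self.n = n
        self.next = 0
        self.map = {}   # lpn -> channel, needed to find data again on read

    def write(self, lpn):
        ch = self.next
        self.map[lpn] = ch
        self.next = (self.next + 1) % self.n
        return ch

alloc = DynamicAllocator()
# A skewed write pattern still spreads perfectly over the channels:
channels = [alloc.write(lpn) for lpn in (7, 7, 3, 42)]
```

Note that the dynamic scheme spreads even repeated writes to the same page across channels (good for writes), while reads must consult the mapping and may contend on a channel; the static scheme gives reads a predictable, parallel layout instead.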

Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation

Procedia PDF Downloads 334
24123 Influence of Annealing on the Mechanical Properties of Polyester-Cotton Friction Spun Yarn

Authors: Sujit Kumar Sinha, R. Chattopadhyay

Abstract:

In the course of processing and use, fibres, yarns, or fabrics are subjected to a variety of stresses and strains, which cause the development of internal stresses. Given an opportunity, these inherent stresses try to bring the structure back to its original state. As an example, a twisted yarn always shows a tendency to untwist whenever one of its ends is made free. If the yarn is not held under tension, it may form snarls due to the presence of excessive torque. The running performance of such a yarn or thread may therefore be negatively affected, as a snarl may not pass through a knitting or sewing needle smoothly, leading to an end break. A fabric shows a tendency to form wrinkles whenever squeezed, and it may also shrink when brought to a relaxed state. In order to improve performance (i.e., dimensional stability or appearance), stabilization of the structure is needed. Stabilization can be attained through the release of internal stresses, which can be brought about by annealing and/or other finishing treatments. When a fabric is subjected to heat, a change in the properties of the fibres, yarns, and fabric is expected. The degree to which the properties are affected depends upon the heat treatment conditions and on the properties and structure of the fibres, yarns, and fabric. In the present study, an attempt has been made to investigate the effect of annealing treatment on the properties of polyester-cotton yarns with varying sheath structures.

Keywords: friction spun yarn, annealing, tenacity, structural integrity, decay

Procedia PDF Downloads 54
24122 Social Data Aggregator and Locator of Knowledge (STALK)

Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat

Abstract:

Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for unnecessary manual analysis of large and diverse social media profiles by filtering and combining the useful information from various social media profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of the various profiles; instead, it provides consolidated information derived from the subject’s posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We also aim to provide a query system that returns a natural-language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the use case.

Keywords: social network, analysis, Facebook, LinkedIn, Git, big data

Procedia PDF Downloads 436
24121 A Framework for Incorporating Non-Linear Degradation of Conductive Adhesive in Environmental Testing

Authors: Kedar Hardikar, Joe Varghese

Abstract:

Conductive adhesives have found wide-ranging applications in the electronics industry, from fixing a defective conductor on a printed circuit board (PCB) and attaching an electronic component in an assembly to protecting electronic components by forming a “Faraday cage.” The reliability requirements for a conductive adhesive vary widely depending on the application and expected product lifetime. While the conductive adhesive is required to maintain structural integrity, the electrical performance of the associated sub-assembly can be affected by degradation of the adhesive, which depends on the highly varied use case. The conventional approach to assessing the reliability of the sub-assembly involves subjecting it to standard environmental test conditions such as high temperature and high humidity, thermal cycling, and high-temperature exposure, to name a few. In order to project test data and observed failures to field performance, systematic development of an acceleration factor between the test conditions and field conditions is crucial. Common acceleration factor models such as the Arrhenius model are based on rate kinetics and typically assume linear degradation in time for a given condition and test duration. The application of interest in this work involves a conductive adhesive used in the electronic circuit of a capacitive sensor. The degradation of the conductive adhesive in a high-temperature, high-humidity environment is quantified by capacitance values. Under such conditions, the use of established models such as the Hallberg-Peck model or the Eyring model to predict time to failure in the field typically relies on a linear degradation rate. In this particular case, the degradation is nonlinear in time and exhibits a square-root-of-time (√t) dependence.
It is also shown that, for the mechanism of interest, the presence of moisture is essential, and the dominant mechanism driving the degradation is the diffusion of moisture. In this work, a framework is developed to incorporate nonlinear degradation of the conductive adhesive into the development of an acceleration factor. The method can be extended to applications where nonlinearity in the degradation rate can be adequately characterized in tests. It is shown that, depending on the expected product lifetime, the conventional linear degradation approach can overestimate or underestimate field performance. This work provides guidelines on the suitability of the linear degradation approximation for such varied applications.
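To see why the time exponent matters, consider a Peck-type humidity/temperature rate model combined with a power-law degradation D(t) = k·t^m and a fixed failure threshold; a minimal sketch, where the constants n and Ea are generic textbook values rather than the paper's fitted parameters:

```python
import math

def peck_rate_ratio(rh_test, rh_field, t_test_k, t_field_k,
                    n=2.7, ea_ev=0.79, k_b=8.617e-5):
    """Ratio k_test / k_field of degradation rates for a Peck-type
    humidity/temperature model (illustrative constants)."""
    return ((rh_test / rh_field) ** n *
            math.exp((ea_ev / k_b) * (1.0 / t_field_k - 1.0 / t_test_k)))

def acceleration_factor(rate_ratio, time_exponent):
    """If D(t) = k * t**m fails at a fixed threshold Dc, then
    tf = (Dc / k)**(1/m), so AF = tf_field / tf_test = (k_test/k_field)**(1/m)."""
    return rate_ratio ** (1.0 / time_exponent)

# 85 C / 85 %RH test condition vs. a 25 C / 50 %RH field condition:
r = peck_rate_ratio(85.0, 50.0, 358.15, 298.15)
af_sqrt = acceleration_factor(r, 0.5)    # sqrt-t degradation: AF = r**2
af_linear = acceleration_factor(r, 1.0)  # linear assumption:  AF = r
```

Because AF = r^(1/m), moving from the linear assumption (m = 1) to the observed √t behavior (m = 0.5) changes the acceleration factor from r to r², illustrating how the linear approximation can substantially misestimate field life.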

Keywords: conductive adhesives, nonlinear degradation, physics of failure, acceleration factor model

Procedia PDF Downloads 127
24120 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm

Authors: Kamel Belammi, Houria Fatrim

Abstract:

Imbalanced data sets, a problem often found in real-world applications, can seriously degrade the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, with not enough samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, along with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN), and multilayer neural networks (MNN)) to the balanced data set. We also compare the results obtained before and after the balancing method.
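The weighting idea can be sketched in a few lines; the inverse-class-frequency costs below are one plausible rule of thumb, not necessarily the paper's exact weighting rule, and the toy data are invented for illustration:

```python
import numpy as np

def cost_sensitive_lms(X, y, costs, lr=0.01, epochs=200, seed=0):
    """LMS with a per-sample cost scaling the squared-error gradient."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        for xi, yi, ci in zip(X, y, costs):
            err = yi - xi @ w
            w += lr * ci * err * xi   # cost-weighted LMS update
    return w

# Imbalanced toy set: 8 majority (-1) samples, 2 minority (+1) samples.
X = np.array([[1.0, x] for x in
              [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 2.0, 2.2]])
y = np.array([-1.0] * 8 + [1.0] * 2)
freq = {c: np.mean(y == c) for c in (-1.0, 1.0)}
costs = np.array([1.0 / freq[c] for c in y])  # rare class penalized more
w = cost_sensitive_lms(X, y, costs)
pred = np.sign(X @ w)
```

With the inverse-frequency costs, the two classes contribute equal total weight to the fit, so the decision boundary is not dragged toward the majority class the way an unweighted LMS fit would be.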

Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes

Procedia PDF Downloads 527
24119 An Historical Revision of Change and Configuration Management Process

Authors: Expedito Pinto De Paula Junior

Abstract:

Current systems such as artificial satellites, airplanes, automobiles, turbines, power systems, and air traffic control are becoming increasingly complex and/or highly integrated, as defined in SAE-ARP-4754A (the Society of Automotive Engineers' standard, Certification Considerations for Highly-Integrated or Complex Aircraft Systems). Among other processes, the development of such systems requires careful Change and Configuration Management (CCM) to establish and maintain product integrity. Understanding the maturity of the CCM process through a historical approach is crucial for better implementation across the hardware and software lifecycle. The sense of work organization, in all fields of development, is directly related to the order and interrelation of the parties, changes in time, and the recording of these changes. It is generally observed that engineers, administrators, and managers invest more time in technical activities than in the organization of work; moreover, these professionals tend to focus on solving complex problems with a purely technical bias. The CCM process is fundamental for the development, production, and operation of new products, especially in safety-critical systems. The objective of this paper is to open a discussion of the historical revision of CCM standards around the world, in order to understand and reflect on its importance across the years, its contribution to technology evolution, the maturity of organizations across the system lifecycle, and the benefits of CCM in avoiding errors and mistakes during the product lifecycle.

Keywords: changes, configuration management, historical, revision

Procedia PDF Downloads 195
24118 Review of the Road Crash Data Availability in Iraq

Authors: Abeer K. Jameel, Harry Evdorides

Abstract:

Iraq is a middle-income country where road crashes are considered one of the leading causes of death. To control road risk, the Iraqi Ministry of Planning's General Statistical Organization started to organize a collection system for traffic accident data, with details on causes and severity. These data are published as an annual report. In this paper, a review of the available crash data in Iraq is presented. The available data represent the rate of accidents at an aggregated level, classified according to accident type, road user details, crash severity, vehicle type, causes, and number of casualties. The review considers the types of models used in road safety studies and research, and the road safety data required for road construction tasks. The available data are also compared with the road safety dataset published in the United Kingdom, as an example of a developed country. It is concluded that the data in Iraq are suitable for descriptive and exploratory models, aggregated-level comparison analysis, and evaluation and monitoring of overall traffic safety performance. However, important traffic safety studies require disaggregated data and details related to the factors affecting the likelihood of traffic crashes. Some studies require spatial geographic details, such as the location of accidents, which are essential for ranking roads according to their level of safety and identifying the most dangerous roads in Iraq, which requires a tactical plan to control this issue. Global road safety agencies interested in solving this problem in low- and middle-income countries have designed road safety assessment methodologies based on road attribute data only. Therefore, this research recommends using one of these methodologies.

Keywords: road safety, Iraq, crash data, road risk assessment, The International Road Assessment Program (iRAP)

Procedia PDF Downloads 250
24117 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The Wicked Method

Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens

Abstract:

Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process: the WICKED method. WICKED is an anagram of the words: eliciting and confirming data, information, knowledge, wisdom. But it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations concerned the time required and the fact that the data set produced represents only the DIKW known during the research period. Future work is underway to address these limitations.

Keywords: healthcare, knowledge acquisition, maximal data sets, action design science

Procedia PDF Downloads 333
24116 Tool for Metadata Extraction and Content Packaging as Endorsed in OAIS Framework

Authors: Payal Abichandani, Rishi Prakash, Paras Nath Barwal, B. K. Murthy

Abstract:

Information generated by various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which can prove authenticity and create trust in the archived data. A subsequent challenge is technology obsolescence; metadata extraction and standardization can be used effectively to tackle this problem. Metadata can broadly be categorized at two levels: technical and domain. Technical metadata provide the information needed to understand and interpret a data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool which extracts and standardizes both technical and domain-level metadata. This paper describes the features of the tool and how it was developed.
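The technical-metadata half of such a tool can be sketched as below: file-level facts plus a fixity checksum, serialized to XML. The element names and layout are illustrative only, not an OAIS/PDI standard serialization:

```python
import hashlib
import os
import tempfile
import xml.etree.ElementTree as ET

def technical_metadata(path):
    """Extract file-level technical metadata plus a fixity checksum."""
    stat = os.stat(path)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {"filename": os.path.basename(path),
            "size_bytes": str(stat.st_size),
            "sha256": digest}

def to_xml(meta):
    """Serialize the metadata dict to a simple XML record."""
    root = ET.Element("technicalMetadata")
    for key, value in meta.items():
        ET.SubElement(root, key).text = value
    return ET.tostring(root, encoding="unicode")

# Demonstrate on a throwaway file:
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
meta = technical_metadata(f.name)
xml_text = to_xml(meta)
os.unlink(f.name)
```

The checksum is what lets a future consumer verify that the archived bit stream is unmodified; the domain-level fields that create fuller trustworthiness would be layered on top of a record like this.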

Keywords: digital preservation, metadata, OAIS, PDI, XML

Procedia PDF Downloads 388
24115 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour-violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve on the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus to a software filter algorithm that selects events, which are finally combined with the data flux from the cosmic ray veto system (CRV).

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 150
24114 An Improved Parallel Algorithm of Decision Tree

Authors: Jiameng Wang, Yunfei Yin, Xiyu Deng

Abstract:

Parallel optimization is one of the important research topics in data mining at this stage. Taking Classification and Regression Tree (CART) parallelization as an example, this paper proposes a parallel data mining algorithm based on SSP-OGini-PCCP. Aiming at the problem of choosing the best CART segmentation point, this paper designs an S-SP model without data association; and in order to calculate the Gini index efficiently, a parallel OGini calculation method is designed. In addition, in order to improve the efficiency of the pruning algorithm, a synchronous PCCP pruning strategy is proposed. The optimal segmentation calculation, Gini index calculation, and pruning algorithm, all important components of parallel data mining, are studied in depth. By constructing a distributed cluster simulation system based on Spark, data mining methods based on SSP-OGini-PCCP are tested. Experimental results show that this method can increase the search efficiency for the best segmentation point by an average of 89%, increase the efficiency of Gini index calculation by 3853%, and increase pruning efficiency by 146% on average; and as the size of the data set increases, the performance of the algorithm remains stable, which meets the requirements of contemporary massive data processing.
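The quantity the OGini step parallelizes is the standard weighted Gini index of a candidate split point; a sequential reference version (toy data, not the paper's distributed implementation) looks like:

```python
def gini(labels):
    """Gini impurity of a list of class labels: 1 - sum(p_c**2)."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def split_gini(xs, ys, threshold):
    """Weighted Gini impurity of splitting feature xs at threshold."""
    left = [y for x, y in zip(xs, ys) if x <= threshold]
    right = [y for x, y in zip(xs, ys) if x > threshold]
    n = len(ys)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# CART picks the threshold minimizing the weighted impurity:
xs = [1, 2, 3, 4, 5, 6]
ys = ["a", "a", "a", "b", "b", "b"]
best = min((split_gini(xs, ys, t), t) for t in xs[:-1])
```

Each candidate threshold's score depends only on class counts to its left and right, which is what makes the tallies easy to distribute across workers in the parallel version.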

Keywords: classification, Gini index, parallel data mining, pruning ahead

Procedia PDF Downloads 116
24113 Raman Spectral Fingerprints of Healthy and Cancerous Human Colorectal Tissues

Authors: Maria Karnachoriti, Ellas Spyratou, Dimitrios Lykidis, Maria Lambropoulou, Yiannis S. Raptis, Ioannis Seimenis, Efstathios P. Efstathopoulos, Athanassios G. Kontos

Abstract:

Colorectal cancer is the third most common cancer diagnosed in Europe, according to the latest incidence data provided by the World Health Organization (WHO), and early diagnosis has proved to be the key to reducing cancer-related mortality. In cases where surgical intervention is required for cancer treatment, the accurate discrimination between healthy and cancerous tissues is critical for the postoperative care of the patient. The current study focuses on the ex vivo handling of surgically excised colorectal specimens and the acquisition of their spectral fingerprints using Raman spectroscopy. Acquired data were analyzed in an effort to discriminate, on a microscopic scale, between healthy and malignant margins. Raman spectroscopy is a spectroscopic technique with high detection sensitivity and a spatial resolution of a few micrometers. The spectral fingerprint produced during laser-tissue interaction is unique and characterizes the biostructure and its inflammatory or cancerous state. Numerous published studies have demonstrated the potential of the technique as a tool for discriminating between healthy and malignant tissues/cells, either ex vivo or in vivo. However, the handling of excised human specimens and the Raman measurement conditions remain challenging, unavoidably affecting measurement reliability and repeatability, as well as the technique's overall accuracy and sensitivity. Therefore, tissue handling has to be optimized and standardized to ensure preservation of cell integrity and hydration level. Various strategies have been implemented in the past, including the use of balanced salt solutions, small humidifiers, or pump-reservoir-pipette systems. In the current study, human colorectal specimens of 10 × 5 mm have so far been collected from 5 patients who underwent open surgery for colorectal cancer. A novel, non-toxic zinc-based fixative (Z7) was used for tissue preservation.
Z7 demonstrates excellent protein preservation and protection against tissue autolysis. Micro-Raman spectra were recorded with a Renishaw inVia spectrometer from successive random 2-micrometer spots upon excitation at 785 nm to decrease the fluorescence background and avoid tissue photodegradation. A temperature-controlled approach was adopted to stabilize the tissue at 2 °C, thus minimizing dehydration effects and consequent focus drift during measurement. A broad spectral range, 500-3200 cm-1, was covered with five consecutive full scans that lasted 20 minutes in total. The averaged spectra were used for least-squares fitting analysis of the Raman modes. Subtle Raman differences were observed between normal and cancerous colorectal tissues, mainly in the intensities of the 1556 cm-1 and 1628 cm-1 Raman modes, which correspond to v(C=C) vibrations in porphyrins, as well as in the range of 2800-3000 cm-1 due to CH2 stretching of lipids and CH3 stretching of proteins. The Raman spectra evaluation was supported by histological findings from twin specimens. This study demonstrates that Raman spectroscopy may constitute a promising tool for real-time verification of clear margins in colorectal cancer open surgery.

Keywords: colorectal cancer, Raman spectroscopy, malignant margins, spectral fingerprints

Procedia PDF Downloads 90
24112 Child Trafficking for Adoption Purposes: A Study into the Criminogenic Factors of the German Intercountry Adoption System

Authors: Elvira Loibl

Abstract:

In Western countries, the demand for adoptable children, especially healthy babies, has been considerably high for several years. Rising infertility rates, liberal abortion politics, the widespread use of contraception, and the increasing acceptance of unmarried motherhood are factors that have decreased the number of infants available for domestic adoption in the U.S. and Europe. As a consequence, many involuntarily childless couples turn to intercountry adoption as a viable alternative for having a child of their own. However, the demand for children far outpaces the supply of orphans with the desired characteristics. The imbalance between the number of prospective adopters and the children available for intercountry adoption results in long waiting lists and high prices. The inordinate sums of money involved in the international adoption system have created a commercial ‘underbelly’ in which unethical and illicit practices are employed to provide the adoption market with adoptable children. Children are purchased or abducted from their families, hospitals, or child care institutions and then trafficked to receiving countries as ‘orphans’. This paper aims to uncover and explain the factors of the German adoption system that are conducive to child trafficking for adoption purposes. It explains how the tension between money and integrity experienced by German adoption agencies, blind trust in the authorities of the sending countries, and a lenient control system encourage and facilitate the trafficking of children to Germany.

Keywords: child trafficking, intercountry adoption, market in adoptable babies, German adoption system

Procedia PDF Downloads 283
24111 Addressing Supply Chain Data Risk with Data Security Assurance

Authors: Anna Fowler

Abstract:

When considering assets that may need protection, the mind begins to contemplate homes, cars, and investment funds. In most cases, the protection of those assets can be covered through security systems and insurance. Data is not the first thought that comes to mind as needing protection, even though data is at the core of most supply chain operations. It includes trade secrets, management of personally identifiable information (PII), and consumer data that can be used to enhance the overall experience. Data is considered a critical element of success for supply chains and should be one of the most critical areas to protect. In the supply chain industry, there are two major misconceptions about protecting data: (i) "We do not manage or store confidential/personally identifiable information (PII)." (ii) Reliance on third-party vendor security. These misconceptions can significantly derail organizational efforts to adequately protect data across environments. The first misconception is dangerous because it implies the organization does not have proper data literacy. Enterprise employees will zero in on the aspect of PII while neglecting trade secret theft and the complete breakdown of information sharing. To sidestep the first misconception, the second forges the ideology that reliance on third-party vendor security will absolve the company from security risk. Instead, third-party risk has grown over the last two years and is one of the major causes of data security breaches. It is important to understand that a holistic approach should be taken to protecting data, one that does not reduce to purchasing a Data Loss Prevention (DLP) tool. A tool is not a solution.
To protect supply chain data, start by providing data literacy training to all employees and by negotiating the security component of vendor contracts to highlight data literacy training for individuals and teams that may access company data. It is also important to understand the origin of the data and its movement, including risk identification, and to ensure processes effectively incorporate data security principles. Evaluate and select DLP solutions to address specific concerns/use cases in conjunction with data visibility. These approaches are part of a broader solutions framework called Data Security Assurance (DSA). The DSA framework looks at all of the processes across the supply chain, including their corresponding architecture and workflows, employee data literacy, governance and controls, integration between third- and fourth-party vendors, DLP as a solution concept, and policies related to data residency. Within cloud environments, this framework is crucial for the supply chain industry to avoid regulatory implications and third-/fourth-party risk.

Keywords: security by design, data security architecture, cybersecurity framework, data security assurance

Procedia PDF Downloads 83
24110 A Novel Hybrid Lubri-Coolant for Machining Difficult-to-Cut Ti-6Al-4V Alloy

Authors: Muhammad Jamil, Ning He, Wei Zhao

Abstract:

It is roughly estimated that aerospace companies have received orders for 37,000 new aircraft, including air ambulances, through 2037. Titanium alloys contribute 15% of a modern aircraft's manufacturing owing to their high strength-to-weight ratio. Despite their application in the aerospace and medical equipment manufacturing industries, their high-speed machining still poses a challenge in terms of tool wear, heat generation, and poor surface quality. Among titanium alloys, Ti-6Al-4V is the major contributor to aerospace applications. However, its poor thermal conductivity (6.7 W/mK) accumulates shear and friction heat at the tool-chip interface zone. To dissipate the generated heat and the friction effect, cryogenic cooling, minimum quantity lubrication (MQL), nanofluids, hybrid cryogenic-MQL, solid lubricants, etc., are applied frequently, and their significant effect on improving the machinability of Ti-6Al-4V has been underscored. Nowadays, hybrid lubri-cooling is getting attention from researchers exploring its effect on the hard-to-cut Ti-6Al-4V. Therefore, this study is devoted to exploring the effect of a hybrid ethanol-ester oil MQL on cutting temperature, surface integrity, and tool life. The ethanol provides an -OH group, and the long-chain molecules of the ester oil provide a tribo-film on the tool-workpiece interface. This could be a green manufacturing alternative for the manufacturing industry.

Keywords: hybrid lubri-cooling, surface roughness, tool wear, MQL

Procedia PDF Downloads 79
24109 Semi-Supervised Outlier Detection Using a Generative and Adversary Framework

Authors: Jindong Gu, Matthias Schubert, Volker Tresp

Abstract:

In many outlier detection tasks, only training data belonging to one class, i.e., the positive class, is available. The task is then to predict a new data point as belonging either to the positive class or to the negative class, in which case the data point is considered an outlier. For this task, we propose a novel corrupted Generative Adversarial Network (CorGAN). In the adversarial process of training CorGAN, the Generator generates outlier samples for the negative class, and the Discriminator is trained to distinguish the positive training data from the generated negative data. The proposed framework is evaluated using an image dataset and a real-world network intrusion dataset. Our outlier-detection method achieves state-of-the-art performance on both tasks.
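The core idea of the abstract — synthesizing negative samples so that one-class detection becomes binary classification — can be sketched without a neural network. Below, a crude "generator" corrupts positive samples with heavy noise and a nearest-neighbour "discriminator" stands in for the trained network; this is an illustrative simplification of the framework, not CorGAN itself, and all names and parameters are ours.

```python
import math
import random

def generate_pseudo_outliers(positives, n, noise=3.0, seed=0):
    """Corrupt positive samples with large Gaussian noise to synthesize
    negative-class training points (a crude stand-in for a GAN generator)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        base = rng.choice(positives)
        out.append([x + rng.gauss(0.0, noise) for x in base])
    return out

def make_discriminator(positives, negatives):
    """1-nearest-neighbour classifier over positives (+1) and generated
    negatives (-1), standing in for the trained Discriminator."""
    labelled = [(p, +1) for p in positives] + [(q, -1) for q in negatives]
    def predict(x):
        return min(labelled, key=lambda pair: math.dist(x, pair[0]))[1]
    return predict

# Positive class: a tight grid near the origin; negatives are generated.
positives = [[0.2 * i, 0.2 * j] for i in range(5) for j in range(5)]
negatives = generate_pseudo_outliers(positives, 200)
predict = make_discriminator(positives, negatives)
```

In CorGAN, both the corruption and the decision boundary are learned adversarially rather than fixed in advance as here.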

Keywords: one-class classification, outlier detection, generative adversary networks, semi-supervised learning

Procedia PDF Downloads 144
24108 Crops Cold Stress Alleviation by Silicon: Application on Turfgrass

Authors: Taoufik Bettaieb, Sihem Soufi

Abstract:

As a bioactive metalloid, silicon (Si) is an essential element for plant growth and development. It also plays a crucial role in enhancing plants' resilience to different abiotic and biotic stresses, and the morpho-physiological, biochemical, and molecular background of Si-mediated stress tolerance in plants has been unraveled. Cold stress is a severe abiotic stress that decreases plant growth and yield by affecting various physiological activities. Several approaches have been used to alleviate the adverse effects of cold stress exposure, but a cost-effective, environmentally friendly, and defensible approach is the supply of silicon, which has the ability to neutralize the harmful impacts of cold stress. Based on these hypotheses, this study was designed to investigate the morphological and physiological background of the effects of silicon, applied at different concentrations, on cold stress mitigation during the early growth of a turfgrass, namely Paspalum vaginatum Sw. Results show that silicon applied at different concentrations improved the morphological development of Paspalum subjected to cold stress. It is also effective on the photosynthetic apparatus, maintaining the stability of the photochemical efficiency. As the primary component of cellular membranes, lipids play a critical function in maintaining the structural integrity of plant cells; silicon application decreased membrane lipid peroxidation and kept the membrane frontline barrier relatively stable under cold stress.

Keywords: crops, cold stress, silicon, abiotic stress

Procedia PDF Downloads 119
24107 Testing the Change in Correlation Structure across Markets: High-Dimensional Data

Authors: Malay Bhattacharyya, Saparya Suresh

Abstract:

The correlation structure associated with a portfolio is subject to variation across time. Studying the structural breaks in the time-dependent correlation matrix associated with a collection of assets has been a subject of interest for a better understanding of market movements, portfolio selection, etc. The current paper proposes a methodology for testing the change in the time-dependent correlation structure of a portfolio in high-dimensional data using the techniques of the generalized inverse, singular value decomposition, and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived. Also, the performance and the validity of the method are tested on a real data set. The proposed test performs well for detecting the change in the dependence of global markets in the context of high-dimensional data.
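As a point of reference, the simplest version of such a test can be sketched as a permutation test on the Frobenius distance between the sample correlation matrices of two time windows. This is only an illustrative baseline, not the authors' SVD/generalized-inverse construction, and it assumes non-degenerate (non-constant) columns.

```python
import math
import random

def corr_matrix(X):
    """Pearson correlation matrix of samples X (rows = observations);
    assumes no column is constant."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in X) / n)
           for j in range(p)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in X)
             / (n * sds[i] * sds[j]) for j in range(p)] for i in range(p)]

def frobenius_dist(A, B):
    """Frobenius norm of the difference of two matrices."""
    return math.sqrt(sum((a - b) ** 2
                         for ra, rb in zip(A, B) for a, b in zip(ra, rb)))

def permutation_test(X1, X2, n_perm=200, seed=0):
    """p-value for H0: the two windows share one correlation structure,
    using the Frobenius distance between sample correlation matrices."""
    rng = random.Random(seed)
    observed = frobenius_dist(corr_matrix(X1), corr_matrix(X2))
    pooled, n1 = X1 + X2, len(X1)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if frobenius_dist(corr_matrix(pooled[:n1]),
                          corr_matrix(pooled[n1:])) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

Permutation tests scale poorly as the dimension grows, which is precisely why the asymptotic high-dimensional test proposed in the paper is of interest.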

Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition

Procedia PDF Downloads 119
24106 Development and Evaluation of a Portable Ammonia Gas Detector

Authors: Jaheon Gu, Wooyong Chung, Mijung Koo, Seonbok Lee, Gyoutae Park, Sangguk Ahn, Hiesik Kim, Jungil Park

Abstract:

In this paper, we present a portable ammonia gas detector for performing gas safety management efficiently. The display of the detector is separated from its body. The display module receives the data measured by the detector over ZigBee. The detector has a rechargeable Li-ion battery that can be used for 11-12 hours, and a Bluetooth module for sending the data to a PC or smart devices. The data are sent to the server and can be accessed using a web browser or a mobile application. The detection range is 0-100 ppm.

Keywords: ammonia, detector, gas, portable

Procedia PDF Downloads 413
24105 The Experience of Community-based Tourism in Yunguilla, Ecuador and Its Social-Cultural Impact

Authors: York Neudel

Abstract:

The phenomenon of tourism has been considered a tool to overcome cultural frontiers, to comprehend the other, and to cope with mutual mistrust and suspicion. That, however, has been a myth, at least when it comes to mass tourism. Other approaches, like community-based tourism, are still based on the idea of embracing the other in order to help or to understand the cultural difference. In 1997, two American NGOs incentivized a tourism project in a community in the highlands of Ecuador in order to protect the cloud forest from destructive exploitation by its own inhabitants. Nineteen years later, this investigation analyzes the interactions between the Ecuadorian hosts in the mestizo community of Yunguilla and the foreign tourists in their quest for "authentic life" in the Ecuadorian cloud forest. As a sort of "contemporary pilgrim", the traveller tries to find authenticity in other times and places, far away from everyday life in Europe or North America. Tourists are therefore guided by stereotypes and expectations produced by the tourist industry. The host, on the other hand, has to negotiate this pre-established imaginary. That generates a kind of theatre play, with front- and backstage, in organic gardens, small factories, and even private housing, since this alternative project offers to share the private space of the host with the tourist in the setting of community-based tourism. In order to protect their privacy, the community creates new hybrid spaces that oscillate between front- and backstage, culminating in a game of hide-and-seek, a phenomenon that promises interesting frictions for an anthropological case study.

Keywords: tourism, authenticity, community-based tourism, Ecuador, Yunguilla

Procedia PDF Downloads 279
24104 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape changes using terrestrial laser scanning. The study proceeds with the development of data reduction and shape change estimation algorithms for large-capacity scan data. The point cloud of scan data was converted to voxels and sampled. A shape estimation technique is studied to detect changes in structural patterns, such as skyscrapers, bridges, and tunnels, based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The point cloud data serve as relative representative values of shape information and are used as a model for detecting point cloud changes in a data structure. The shape estimation model aims to develop a technology that can detect not only normal but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
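The voxel conversion and sampling step mentioned above is commonly done by snapping points to a cubic grid and keeping one centroid per occupied cell. A minimal sketch of that reduction follows (names ours; the paper's pipeline additionally builds an octree over the reduced cloud for change detection):

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Reduce a point cloud by replacing all points that fall into the
    same cubic voxel with their centroid."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    # One representative (centroid) per occupied voxel.
    return [tuple(sum(coord) / len(pts) for coord in zip(*pts))
            for pts in buckets.values()]
```

Because each surviving point represents its whole voxel, comparing two scans reduces to comparing which voxels are occupied and where their centroids moved, which is what makes large-scale change detection tractable.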

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 230
24103 A Non-Invasive Blood Glucose Monitoring System Using Near-Infrared Spectroscopy with Remote Data Logging

Authors: Bodhayan Nandi, Shubhajit Roy Chowdhury

Abstract:

This paper presents the development of a portable blood glucose monitoring device based on near-infrared spectroscopy. The system supports Internet connectivity through WiFi and uploads the time series of a patient's glucose concentration to a server. In addition, the server is given sufficient intelligence to predict the future pathophysiological state of a patient given the current and past pathophysiological data. This makes it possible to prognosticate an approaching critical condition well before it actually occurs. The server hosts web applications that allow authorized users to monitor the data remotely.
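The abstract does not specify the server's prediction model; as a hedged illustration, even a linear trend extrapolated from the recent readings can flag an approaching excursion out of a safe glucose band. The band limits and horizon below are hypothetical, not values from the paper.

```python
def linear_trend(times, values):
    """Ordinary least-squares slope and intercept of a glucose time series."""
    n = len(times)
    mt, mv = sum(times) / n, sum(values) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(times, values))
             / sum((t - mt) ** 2 for t in times))
    return slope, mv - slope * mt

def predict_ahead(times, values, horizon):
    """Extrapolate the fitted trend `horizon` minutes past the last reading."""
    slope, intercept = linear_trend(times, values)
    return slope * (times[-1] + horizon) + intercept

def critical_alert(times, values, horizon, low=70.0, high=180.0):
    """Flag when the extrapolated glucose level (mg/dL) leaves the safe band."""
    g = predict_ahead(times, values, horizon)
    return g < low or g > high
```

A production server would replace this trend fit with a model trained on past pathophysiological data, but the logging/predict/alert loop is the same.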

Keywords: non-invasive, blood glucose concentration, microcontroller, IoT, application server, database server

Procedia PDF Downloads 208