Search results for: real time data processing
38593 Identification of Lipo-Alkaloids and Fatty Acids in Aconitum carmichaelii Using Liquid Chromatography–Mass Spectrometry and Gas Chromatography–Mass Spectrometry
Authors: Ying Liang, Na Li
Abstract:
Lipo-alkaloids are a class of C19-norditerpenoid alkaloids found in Aconitum species, usually containing an aconitane skeleton and one or two fatty acid residues. Their structures are very similar to those of the diester-type alkaloids considered the main bioactive components in Aconitum carmichaelii, and they have anti-inflammatory, anti-nociceptive, and anti-proliferative activities. So far, more than 200 lipo-alkaloids have been reported from plants, semisynthesis, and biotransformation. In our research, by combining ultra-high performance liquid chromatography-quadrupole-time of flight mass spectrometry (UHPLC-Q-TOF-MS) with an in-house database, 148 lipo-alkaloids were identified from A. carmichaelii, including 93 potential new compounds and 38 compounds with oxygenated fatty acid moieties. To our knowledge, this is the first report of oxygenated fatty acids as side chains in naturally occurring lipo-alkaloids. Since the fatty acid residues in lipo-alkaloids should come from the free acids in the plant, the fatty acids and their relationship with the lipo-alkaloids were further investigated by GC-MS and LC-MS. Among the 17 fatty acids identified by GC-MS, 12 were detected as side chains of lipo-alkaloids, accounting for about one third of the total lipo-alkaloids, while these fatty acid residues made up less than one quarter of the total fatty acid residues. In addition, a total of 37 fatty acids were determined by UHPLC-Q-TOF-MS, including 18 oxidized fatty acids identified from A. carmichaelii for the first time; these fatty acids were also observed as side chains of lipo-alkaloids. Although over 140 lipo-alkaloids were identified, six of them, 8-O-linoleoyl-14-benzoylmesaconine (1), 8-O-linoleoyl-14-benzoylaconine (2), 8-O-palmitoyl-14-benzoylmesaconine (3), 8-O-oleoyl-14-benzoylmesaconine (4), 8-O-pal-benzoylaconine (5), and 8-O-ole-benzoylaconine (6), were found to be the main components, accounting for over 90% of the total lipo-alkaloid content. Therefore, using these six components as standards, a UHPLC-Triple Quadrupole-MS (UHPLC-QQQ-MS) approach was established to investigate the influence of processing on the contents of lipo-alkaloids. Although it is commonly supposed that the contents of lipo-alkaloids increase after processing, our research showed no significant change before and after processing. Using the same methods, the lipo-alkaloids in the lateral roots of A. carmichaelii and the roots of A. kusnezoffii were determined and quantified. The contents of lipo-alkaloids in A. kusnezoffii were close to those of the parent roots of A. carmichaelii, while the lateral roots had fewer lipo-alkaloids than the parent roots. This work was supported by the Macao Science and Technology Development Fund (086/2013/A3 and 003/2016/A1).
Keywords: Aconitum carmichaelii, fatty acids, GC-MS, LC-MS, lipo-alkaloids
Procedia PDF Downloads 299
38592 Time to CT in Major Trauma in Coffs Harbour Health Campus - The Australian Rural Centre Experience
Authors: Thampi Rawther, Jack Cecire, Andrew Sutherland
Abstract:
Introduction: CT facilitates the diagnosis of potentially life-threatening injuries and enables early management. There is evidence that reduced CT acquisition time reduces mortality and length of hospital stay, but current recommendations for ideal timing vary. The NHS standard contract for a major trauma service and STAG both recommend immediate access to CT within a maximum of 60 min and appropriate reporting within 60 min of the scan. At Coffs Harbour Health Campus (CHHC), a CT radiographer is on site between 8am and 11pm. Aim: To investigate the average time to CT at CHHC and assess for any significant relationship between time to CT and injury severity score (ISS) or time of triage. Method: All major trauma calls between Jan 2021 and Oct 2021 were audited (N=87). Patients were excluded if they went directly from the ED to theatre. Time to CT is defined as the time between triage and the timestamp on the first CT image. Median and interquartile range were used as measures of central tendency, as the data were not normally distributed, and the chi-square test was used to determine association. Results: The median time to CT was 51.5 min (IQR 40-74). We found no relationship between time to CT and ISS (P=0.18) or between time of triage and time to CT (P=0.35). We compared this to other centres such as John Hunter Hospital and Gold Coast Hospital, whose median CT acquisition times were 76 min (IQR 52-115) and 43 min, respectively. Conclusion: This shows an avenue for improvement, given that 35% of CTs took longer than 30 min. Furthermore, being proactive and aware of time to CT as an important factor in trauma management is another avenue for improvement. Based on this, we will re-audit in 12-24 months to assess whether any improvement has been made.
Keywords: imaging, rural surgery, trauma surgery, improvement
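A minimal sketch of the style of analysis described above, on hypothetical data rather than the CHHC audit records: median/IQR as the central-tendency measure for skewed times, and a chi-square test of association between triage time and CT acquisition time.

```python
# Hypothetical audit data (not the CHHC records): median/IQR for time to CT
# and a chi-square test of association between triage time and slow scans.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
time_to_ct = rng.gamma(shape=4.0, scale=15.0, size=87)   # minutes, skewed
triage_hour = rng.integers(0, 24, size=87)               # hour of triage

q1, med, q3 = np.percentile(time_to_ct, [25, 50, 75])
print(f"Median time to CT: {med:.1f} min (IQR {q1:.0f}-{q3:.0f})")

# Cross-tabulate out-of-hours triage (radiographer on site 8am-11pm)
# against fast vs. slow CT acquisition.
night = (triage_hour >= 23) | (triage_hour < 8)
slow = time_to_ct > 60
table = np.array([[np.sum(~night & ~slow), np.sum(~night & slow)],
                  [np.sum(night & ~slow), np.sum(night & slow)]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"Chi-square = {chi2:.2f}, p = {p:.2f}")
```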
Procedia PDF Downloads 103
38591 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency
Authors: Fanqiang Kong, Chending Bian
Abstract:
In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint-sparsity is the first property, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank-deficiency: the number of endmembers present in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with better spectral unmixing accuracy.
Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation
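A minimal sketch of the two proximal operators such a variable-splitting/augmented-Lagrangian (ADMM) solver would alternate between, taking p = 1 so the joint-sparsity term becomes an l2,1 norm; the matrix sizes are illustrative assumptions, not the paper's datasets.

```python
# Building blocks of the unmixing model: row-wise soft-thresholding for the
# l2,1 joint-sparsity term and singular-value thresholding for the nuclear
# norm, as they would appear inside an ADMM loop.
import numpy as np

def prox_l21(X, tau):
    """Row-wise soft-thresholding: promotes joint sparsity across pixels."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return X * scale

def prox_nuclear(X, tau):
    """Singular-value thresholding: promotes a low-rank abundance matrix."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Toy abundance matrix: m library signatures x n pixels.
A = np.random.default_rng(1).normal(size=(40, 25))
A_sparse_rows = prox_l21(A, tau=2.0)
A_low_rank = prox_nuclear(A, tau=2.0)
print(np.sum(np.linalg.norm(A_sparse_rows, axis=1) > 0),   # surviving rows
      np.linalg.matrix_rank(A_low_rank))                   # reduced rank
```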
Procedia PDF Downloads 261
38590 Artificial Intelligence for Traffic Signal Control and Data Collection
Authors: Reggie Chandra
Abstract:
Traffic accidents and traffic signal optimization are correlated. However, 70-90% of the traffic signals across the USA are not synchronized, largely because of insufficient resources to create and implement timing plans. In this work, we discuss the use of breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect accurate traffic data 24/7/365 using a vehicle detection system. We discuss recent advances in AI technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and the best workflow for doing so. This paper also showcases how AI makes signal timing affordable. We introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans, and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain and consists of millions of densely connected processing nodes; it is a form of machine learning in which the neural net learns to recognize vehicles through training, which is called deep learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans, and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but, in cases such as classifying objects into fine-grained categories, also outperform humans. Safety is of primary importance to traffic professionals, but they often lack the studies or data to support their decisions; currently, one third of transportation agencies do not collect pedestrian and bike data. We discuss how the use of AI for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, snapshots of limited handpicked data, and multiple systems requiring additional adaptation work. The methodologies used and proposed in the research contain a camera model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets, acquired under a variety of daily real-world road conditions, and compared with the performance of commonly used methods that require collecting data by counting, evaluating, and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by AI can benefit a community and how to translate the complex and often overwhelming benefits into language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that AI brings to traffic signal control and data collection are unsurpassed.
Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal
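A minimal PyTorch sketch of the kind of CNN classifier described above, distinguishing road users in camera crops; the architecture, crop size, and class list are illustrative assumptions, not the commercial system discussed in the paper.

```python
# Toy CNN that classifies road-user image crops (e.g. vehicle / pedestrian /
# cyclist); two conv+pool stages feed a linear classification head.
import torch
import torch.nn as nn

class RoadUserCNN(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = RoadUserCNN()
dummy_batch = torch.randn(8, 3, 64, 64)            # 8 RGB crops, 64x64 pixels
logits = model(dummy_batch)
print(logits.shape)                                # torch.Size([8, 3])
```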
Procedia PDF Downloads 169
38589 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources
Authors: Jolly Puri, Shiv Prasad Yadav
Abstract:
Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form. In real-life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using an α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
Keywords: multi-component DEA, fuzzy multi-component DEA, fuzzy resources, decision making units (DMUs)
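A minimal sketch of the α-cut step under the common assumption of triangular fuzzy data: each fuzzy quantity (l, m, u) is reduced to a crisp interval whose bounds feed the pair of crisp (optimistic/pessimistic) DEA models; the triangular form and the sample numbers are illustrative assumptions, not the paper's model.

```python
# Alpha-cut of triangular fuzzy numbers: returns the crisp interval
# [l + alpha(m - l), u - alpha(u - m)] that the crisp model pair uses.
import numpy as np

def alpha_cut_triangular(l, m, u, alpha):
    l, m, u = map(np.asarray, (l, m, u))
    return l + alpha * (m - l), u - alpha * (u - m)

# One fuzzy shared input for three DMUs (hypothetical values).
low, mode, high = [8, 10, 6], [10, 12, 7], [13, 15, 9]
for alpha in (0.0, 0.5, 1.0):
    lo_, hi_ = alpha_cut_triangular(low, mode, high, alpha)
    print(f"alpha={alpha}: lower bounds {lo_}, upper bounds {hi_}")
# At alpha = 1 the interval collapses to the modal (crisp) value.
```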
Procedia PDF Downloads 407
38588 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and evaluated experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis
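A minimal sketch of a k-means variant built on the Manhattan (City Block) metric, using the component-wise median as the centre update (the L1-optimal choice) and a simple centre-shift stop rule; the paper's own stop criterion is not specified here, so this rule is an illustrative stand-in.

```python
# Manhattan-metric k-means: L1 assignment, median update, shift-based stop.
import numpy as np

def manhattan_kmeans(X, k, tol=1e-4, max_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(max_iter):
        # City Block distance from every point to every centre.
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([np.median(X[labels == j], axis=0)
                                for j in range(k)])
        if np.abs(new_centers - centers).sum() < tol:   # convergence check
            break
        centers = new_centers
    return labels, centers

X = np.random.default_rng(1).normal(size=(300, 4))      # toy data set
labels, centers = manhattan_kmeans(X, k=3)
print(np.bincount(labels))                               # cluster sizes
```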
Procedia PDF Downloads 386
38587 Interpreting Privacy Harms from a Non-Economic Perspective
Authors: Christopher Muhawe, Masooda Bashir
Abstract:
With increased Internet Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone hand in hand with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical harm perspective ignores the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of a free society. There is no economic value that can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.
Keywords: data breach and misuse, economic harms, privacy harms, psychological harms
Procedia PDF Downloads 195
38586 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network
Authors: Jia Xin Low, Keng Wah Choo
Abstract:
This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted into a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features for the supervised machine learning algorithm; instead, the features are determined automatically through training, from the time series provided. The results show that the prediction model provides reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g., remote monitoring applications of the PCG signal.
Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification
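A minimal sketch of the input preparation described above: each per-beat PCG sub-segment is padded or truncated to a perfect square length and reshaped into a square intensity matrix suitable for a 2-D CNN; the target size (64x64) and the synthetic beat are illustrative assumptions.

```python
# Convert one per-beat PCG sub-segment into a square intensity matrix.
import numpy as np

def beat_to_square(segment, side=64):
    seg = np.asarray(segment, dtype=np.float32)
    n = side * side
    if len(seg) >= n:
        seg = seg[:n]                          # truncate long beats
    else:
        seg = np.pad(seg, (0, n - len(seg)))   # zero-pad short beats
    # Normalise to [0, 1] so the matrix behaves like an image intensity map.
    seg = (seg - seg.min()) / (seg.max() - seg.min() + 1e-12)
    return seg.reshape(side, side)

beat = np.sin(np.linspace(0, 40 * np.pi, 3500))  # stand-in for one heartbeat
img = beat_to_square(beat)
print(img.shape, img.min(), img.max())           # (64, 64), 0.0, ~1.0
```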
Procedia PDF Downloads 349
38585 A Comparative Assessment of Industrial Composites Using Thermography and Ultrasound
Authors: Mosab Alrashed, Wei Xu, Stephen Abineri, Yifan Zhao, Jörn Mehnen
Abstract:
Thermographic inspection is a relatively new technique for Non-Destructive Testing (NDT) that has been gathering increasing interest due to its relatively low-cost hardware and extremely fast data acquisition. This technique is especially promising in the area of rapid automated damage detection and quantification. In collaboration with a major industry partner from the aerospace sector, advanced thermography-based NDT software for impact-damaged composites is introduced. The software is based on correlation analysis of time-temperature profiles in combination with an image enhancement process. The prototype software aims to a) better visualise the damage in a relatively easy-to-use way and b) automatically and quantitatively measure the properties of the degradation. Since degradation properties play an important role in the identification of degradation types, tests on artificially damaged specimens have been performed and the results analyzed.
Keywords: NDT, correlation analysis, image processing, damage, inspection
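A minimal sketch of correlation analysis on time-temperature profiles: each pixel's cooling curve in a thermographic sequence (frames x height x width) is scored by its Pearson correlation with a reference "sound material" curve, so damaged regions decorrelate; the synthetic data and reference curve are illustrative assumptions.

```python
# Pixelwise Pearson correlation of cooling curves against a reference curve.
import numpy as np

def correlation_map(sequence, reference):
    t, h, w = sequence.shape
    profiles = sequence.reshape(t, -1)                  # one column per pixel
    p = profiles - profiles.mean(axis=0)
    r = reference - reference.mean()
    num = (p * r[:, None]).sum(axis=0)
    den = np.sqrt((p ** 2).sum(axis=0) * (r ** 2).sum())
    return (num / np.maximum(den, 1e-12)).reshape(h, w)

frames = np.exp(-np.linspace(0, 3, 50))                 # reference cooling
seq = frames[:, None, None] * np.ones((50, 32, 32))
# Inject a noisy "damaged" patch whose cooling curve deviates.
seq[:, 10:15, 10:15] += 0.3 * np.random.default_rng(2).normal(size=(50, 5, 5))
cmap = correlation_map(seq, frames)
print(cmap[0, 0], cmap[12, 12])                         # ~1.0 vs. lower
```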
Procedia PDF Downloads 547
38584 Use of Life Cycle Data for State-Oriented Maintenance
Authors: Maximilian Winkens, Matthias Goerke
Abstract:
State-oriented maintenance enables preventive intervention before the failure of a component and guarantees the avoidance of expensive breakdowns. Because the timing of the maintenance is defined by the component’s state, the remaining service life can be exhausted to the limit. The basic requirement for state-oriented maintenance is the ability to determine the component’s state. New potential for this is offered by gentelligent components, developed at the Collaborative Research Centre 653 of the German Research Foundation (DFG). Because of their sensory ability, they enable the registration of stresses during the component’s use. The data is gathered and evaluated, and the methodology developed determines the current state of the gentelligent component based on the gathered data. This article presents this methodology as well as current research. The main focus of the current scientific work is to improve the quality of the state determination based on life-cycle data analysis. The methodology developed so far evaluates the data of the usage phase and, based on it, predicts the timing of the gentelligent component’s failure. The real failure timing, though, deviates from the predicted one because the effects of the production phase are not considered. The goal of the current research is therefore to develop a methodology for state determination that considers both production and usage data.
Keywords: state-oriented maintenance, life-cycle data, gentelligent component, preventive intervention
Procedia PDF Downloads 495
38583 Impact of Social Media on Content of Saudi Television News Networks
Authors: Majed Alshaibani
Abstract:
Social media has emerged as a serious contender to TV news networks in Saudi Arabia. The growing use of social media as a source of news and information has had a significant impact on the content presented by news networks in Saudi Arabia. This study explored the various ways in which social media has influenced the content aired on Saudi news networks. Data were collected using semi-structured interviews with 13 journalists and content editors working for four Saudi TV news networks and with six senior academic experts teaching TV and media in Saudi universities. The findings of the study revealed that social media has affected four aspects of the content on Saudi TV news networks. As a result, the content aired on Saudi news networks is more neutral, closer to real time, more diverse in terms of sources, and covers broader subjects from different parts of the world. This research concludes that social media has contributed positively and significantly to improving the content on Saudi TV news networks.
Keywords: TV news networks, Saudi Arabia, social media, media content
Procedia PDF Downloads 238
38582 Meanings and Concepts of Standardization in Systems Medicine
Authors: Imme Petersen, Wiebke Sick, Regine Kollek
Abstract:
In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expression, metabolic pathways, and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges for data handling and processing. Tools based on bioinformatics have been developed to resolve the resulting problems of systematizing, standardizing, and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data on disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and integrated forms and structures. However, this data-centred concept of standardization in systems medicine runs contrary to the debate on standardization in science and technology studies (STS), which rather emphasizes the dynamics, contexts, and negotiations of standard operating procedures. Based on empirical work on research consortia in Germany that explore the molecular profiles of diseases to establish systems medical approaches in the clinic, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and what consequences for knowledge production (e.g., modeling) arise from them. Hence, different concepts and meanings of standardization are explored to gain deeper insight into standard operating procedures not only in systems medicine, but also beyond.
Keywords: data, science and technology studies (STS), standardization, systems medicine
Procedia PDF Downloads 341
38581 Intelligent Grading System of Apple Using Neural Network Arbitration
Authors: Ebenezer Obaloluwa Olaniyi
Abstract:
In this paper, an intelligent system has been designed to grade apples as either defective or healthy for production in food processing. The work is divided into two phases. In the first phase, image processing techniques were employed to extract the necessary features of the apple. These techniques include grayscale conversion and segmentation, where a threshold value is chosen to separate the foreground of the images from the background; edge detection was then employed to bring out the features in the images. The extracted features were fed into the neural network in the second phase, a classification phase in which a neural network is employed to distinguish defective apples from healthy ones. In this phase, the network was trained with backpropagation and tested as a feed-forward network. The recognition rate obtained shows that our system is more accurate and faster compared with previous work.
Keywords: image processing, neural network, apple, intelligent system
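A minimal OpenCV sketch of the first (feature extraction) phase described above: grayscale conversion, threshold-based segmentation, and edge detection; the synthetic test image, threshold settings, and feature choices are illustrative assumptions, and the resulting vector would then feed the neural network phase.

```python
# Grayscale -> threshold segmentation -> edge detection -> feature vector.
import cv2
import numpy as np

# Synthetic stand-in for an apple photo (use cv2.imread("apple.jpg") on real data).
img = np.zeros((128, 128, 3), dtype=np.uint8)
cv2.circle(img, (64, 64), 40, (40, 40, 200), -1)    # red "apple" on black
cv2.circle(img, (50, 50), 6, (30, 60, 80), -1)      # dark "bruise" spot

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Otsu's method picks the threshold separating foreground from background.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

edges = cv2.Canny(gray, 100, 200)                   # bring out defect contours

# A crude fixed-length feature vector for the classification phase.
features = np.array([mask.mean(), edges.mean(), gray[mask > 0].std()])
print(features)
```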
Procedia PDF Downloads 398
38580 Real-Time Nonintrusive Heart Rate Measurement: Comparative Case Study of LED Sensorics' Accuracy and Benefits in Heart Monitoring
Authors: Goran Begović
Abstract:
In recent years, many researchers have been focusing on non-intrusive measuring methods for human biosignals. These methods provide solutions for everyday use, whether for health monitoring or for fine-tuning a workout routine. One of the biggest issues with these solutions is that sensor accuracy is highly variable due to many factors, such as ambient light and skin color diversity. That is why we wanted to explore the outcomes under those kinds of circumstances in order to find the optimal algorithm(s) for extracting heart rate (HR) information. The optimization of such algorithms can enable wider, cheaper, and safer application of home health monitoring, without having to visit medical professionals as often to observe heart irregularities. In this study, we explored the accuracy of infrared (IR), red, and green LED sensors in a controlled environment and compared the results with a medically accurate ECG monitoring device.
Keywords: data science, ECG, heart rate, holter monitor, LED sensors
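A minimal sketch of one HR-extraction algorithm of the kind compared in the study: detect pulse peaks in an LED (photoplethysmography) signal and convert the average beat-to-beat interval into beats per minute; the synthetic signal and the peak-detection settings are illustrative assumptions.

```python
# Peak-based heart-rate estimation from a (synthetic) PPG trace.
import numpy as np
from scipy.signal import find_peaks

fs = 100                                            # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ppg = (np.sin(2 * np.pi * 1.2 * t)                  # 1.2 Hz pulse wave
       + 0.1 * np.random.default_rng(3).normal(size=t.size))

# Enforce a refractory period (~0.4 s) so noise cannot double-count beats.
peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=0.5)
hr_bpm = 60.0 / np.mean(np.diff(peaks) / fs)
print(f"Estimated heart rate: {hr_bpm:.0f} bpm")    # ~72 bpm for 1.2 Hz
```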
Procedia PDF Downloads 127
38579 Designing and Implementation of MPLS Based VPN
Authors: Muhammad Kamran Asif
Abstract:
MPLS stands for Multi-Protocol Label Switching. It is the technology that replaces ATM (Asynchronous Transfer Mode) and frame relay. In this paper, we have designed a full-fledged, small-scale MPLS-based service provider core network model that provides communication services (e.g., voice, video, and data) to customers more efficiently using the label switching technique. Using MPLS VPN provides security to customers whether they are on a LAN or a WAN: it protects each customer's sites from attack by outside intruders while providing the extension of a private network over the Internet. We implemented the service provider network using minimal resources, i.e., five Cisco 3800-series routers comprising the service provider core, provider edge routers, and customer edge routers. Customers at one end of the network can send any kind of data to customers at the other end through the MPLS-VPN-enabled service provider cloud. We have also carried out simulation and emulation of the model using GNS3 (Graphical Network Simulator-3) and reproduced real-time scenarios. We have also deployed an NMS that monitors our service provider cloud and generates alarms in case of any intrusion or malfunction in the network. Moreover, we have provided a video help desk facility between customers and the service provider cloud to resolve network issues more effectively.
Keywords: MPLS, VPN, NMS, ATM, asynchronous transfer mode
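A minimal sketch, not the authors' exact configuration, of provisioning one customer VRF on a Cisco IOS provider edge router with the Netmiko library; the host, credentials, VRF name, RD/route-target values, and interface are all illustrative assumptions, and the classic "ip vrf" syntax is assumed to match 3800-series-era IOS.

```python
# Push a hypothetical per-customer VRF configuration to a PE router.
from netmiko import ConnectHandler

pe_router = {
    "device_type": "cisco_ios",
    "host": "192.0.2.1",        # hypothetical PE management address
    "username": "admin",
    "password": "secret",
}

vrf_config = [
    "ip vrf CUST_A",
    " rd 65000:1",
    " route-target export 65000:1",
    " route-target import 65000:1",
    "interface GigabitEthernet0/1",
    " ip vrf forwarding CUST_A",
    " ip address 10.0.0.1 255.255.255.252",
]

with ConnectHandler(**pe_router) as conn:
    output = conn.send_config_set(vrf_config)
    print(output)
```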
Procedia PDF Downloads 331
38578 A Simulation Tool for Projection Mapping Based on Mapbox and Unity
Authors: Noriko Hanakawa, Masaki Obana
Abstract:
A simulation tool has been proposed for large-scale projection mapping events. The tool has four main functions based on Mapbox and Unity utilities. The first function is building a 3D model of real cities with Mapbox. The second function is projecting movies onto buildings in the real cities with Unity. The third function sends movies from a PC to a virtual projector, and the fourth maps the movies to fit the buildings. The simulation tool was applied to a real projection mapping event held in 2019. That event had a serious problem with the movie projection onto the target building: extra tents were set up in front of the building and became obstacles to the projection. The simulation tool can reproduce the problems of the event; therefore, if the simulation tool had been developed before the 2019 projection mapping event, the problem of the tents' obstruction could have been avoided. In addition, we confirmed that the simulation tool is useful for planning future projection mapping events so as to avoid obstacles from various equipment such as utility poles, planted trees, and monument towers.
Keywords: projection mapping, projector position, real 3D map, avoiding obstacles
Procedia PDF Downloads 203
38577 Robotic Arm Control with Neural Networks Using Genetic Algorithm Optimization Approach
Authors: Arbnor Pajaziti, Hasan Cana
Abstract:
In this paper, a structural genetic algorithm is used to optimize the neural network that controls the joint movements of a robotic arm. The robotic arm has also been modeled in 3D and simulated in real time in MATLAB. It is found that neural networks provide a simple and effective way to control the robot's tasks, and computer simulation examples are given to illustrate the significance of this method. By combining the genetic algorithm optimization method with neural networks for the given robotic arm with 5 D.O.F., the results obtained show that the base joint movement overshoot time without a controller was about 0.5 seconds, while with the neural network controller (optimized with the genetic algorithm) it was about 0.2 seconds; a population size of 150 gave the best results.
Keywords: robotic arm, neural network, genetic algorithm, optimization
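A minimal sketch of the idea of evolving neural-network controller weights with a genetic algorithm: a population of weight vectors is scored by a fitness function and refined by selection and mutation; the toy fitness surrogate, GA settings, and weight-vector size are illustrative assumptions, not the paper's 5-D.O.F. arm model.

```python
# Genetic algorithm evolving a population of controller weight vectors.
import numpy as np

rng = np.random.default_rng(4)
POP, GENS, DIM = 150, 60, 10          # population size 150, as in the paper

def fitness(w):
    # Toy surrogate: penalise distance from an (unknown) ideal weight vector;
    # a real run would score tracking error from the simulated arm response.
    target = np.linspace(-1, 1, DIM)
    return -np.sum((w - target) ** 2)

pop = rng.normal(size=(POP, DIM))
for _ in range(GENS):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]        # truncation selection
    children = parents[rng.integers(0, len(parents), POP - len(parents))]
    children = children + 0.1 * rng.normal(size=children.shape)   # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
print(f"Best fitness after evolution: {fitness(best):.4f}")
```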
Procedia PDF Downloads 523
38576 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field
Authors: Nastaran Moosavi, Mohammad Mokhtari
Abstract:
Seismic inversion is a technique that has been in use for years, and its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it combines seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock physics properties such as P-impedance, S-impedance, and density, while post-stack seismic inversion can estimate only P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.
Keywords: density, p-impedance, s-impedance, post-stack seismic inversion, pre-stack seismic inversion
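A minimal sketch of the post-stack case: recursive inversion recovers a P-impedance trace from reflection coefficients via Z[i+1] = Z[i] * (1 + r[i]) / (1 - r[i]), given a starting impedance; the synthetic reflectivity series and starting value are illustrative assumptions.

```python
# Recursive post-stack impedance inversion from a reflectivity trace.
import numpy as np

def recursive_impedance(reflectivity, z0):
    z = np.empty(len(reflectivity) + 1)
    z[0] = z0
    for i, r in enumerate(reflectivity):
        z[i + 1] = z[i] * (1 + r) / (1 - r)   # standard recursion
    return z

r = np.array([0.05, -0.02, 0.10, 0.0, -0.07])  # toy reflection coefficients
z = recursive_impedance(r, z0=4.5e6)           # starting P-impedance (rayl)
print(np.round(z / 1e6, 3))                    # impedance profile, x1e6
```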
Procedia PDF Downloads 324
38575 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than a function of the non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated in the current version of scikit-learn.org, the most popular open-source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
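A minimal sketch of the use case described above: a tree-based ensemble trained directly on a sparsely representable text dataset, where scikit-learn's tree estimators accept SciPy CSR input so the cost tracks non-zero elements rather than the full input-space size; the toy corpus and labels are illustrative assumptions.

```python
# Train a random forest on hashed (sparse CSR) text features.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import HashingVectorizer

docs = ["cheap flights to rome", "buy cheap meds now",
        "flight schedule update", "limited offer buy now"]
labels = [0, 1, 0, 1]                      # 0 = ham, 1 = spam (toy labels)

# 2**18 hashed features, but each row stores only a handful of non-zeros.
vec = HashingVectorizer(n_features=2 ** 18)
X = vec.fit_transform(docs)                # scipy.sparse CSR matrix
print(type(X), X.shape, X.nnz)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(clf.predict(vec.transform(["cheap offer now"])))
```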
Procedia PDF Downloads 263
38574 Nutritional Potential and Functionality of Whey Powder Influenced by Different Processing Temperature and Storage
Authors: Zarmina Gillani, Nuzhat Huma, Aysha Sameen, Mulazim Hussain Bukhari
Abstract:
Whey is an excellent food ingredient owing to its high nutritive value and its functional properties. However, the composition of whey varies depending on the composition of the milk, the processing conditions, the processing method, and its whey protein content. The aim of this study was to prepare whey powder from raw whey and to determine the influence of different processing temperatures (160 and 180 °C) on its physicochemical and functional properties during 180 days of storage, and on whey protein denaturation. Results have shown that temperature significantly (P < 0.05) affects the pH, acidity, non-protein nitrogen (NPN), protein, total soluble solids, fat, and lactose contents. Significantly (P < 0.05) higher foaming capacity (FC), foam stability (FS), and whey protein nitrogen index (WPNI), and a lower turbidity and solubility index (SI), were observed in whey powder processed at 160 °C compared to whey powder processed at 180 °C. During the 180 days of storage, slow but progressive changes were noticed in the physicochemical and functional properties of the whey powder. Reverse-phase HPLC analysis revealed a significant (P < 0.05) effect of temperature on whey protein contents. Denaturation of β-lactoglobulin is followed by α-lactalbumin, casein glycomacropeptide (CMP/GMP), and bovine serum albumin (BSA).
Keywords: whey powder, temperature, denaturation, reverse phase, HPLC
Procedia PDF Downloads 299
38573 Select Communicative Approaches and Speaking Skills of Junior High School Students
Authors: Sonia Arradaza-Pajaron
Abstract:
Speaking English as a medium of instruction poses a real challenge for students who are non-native English speakers to achieve proficiency, especially when it is a requirement in most communicative classroom instruction. It becomes a real burden for students whose English language orientation is not well facilitated and encouraged by teachers in national high schools. This study, which utilized a descriptive-correlational research design, examined the relationship between select communicative approaches commonly utilized in classroom instruction and the level of speaking skills among the identified high school students. Survey questionnaires, interviews, and observation sheets were the research instruments used to generate salient information. Data were analyzed and treated statistically utilizing weighted means for speaking skill levels and Pearson r to determine the relationship between the two identified variables of the study. Findings revealed that the level of English speaking skills of the high school students is just average. Further, among the identified speaking sub-skills, namely grammar, pronunciation, and fluency, the students were at an above-average level. There was also a clear relationship between some communicative approaches and the respondents' speaking skills. Most notable among the select approaches is role-playing, compared to storytelling, informal debate, brainstorming, oral reporting, and others, perhaps because role-playing is the most commonly used approach in the classroom. This implies that when these high school students are given enough time and autonomy in how they express their ideas or comprehension of lessons, they show a spontaneous manner of expression through maximization of the second language. It can further be concluded that high school students have the capacity to express ideas even in the second language if they are encouraged and well facilitated by teachers, and that when a better communicative approach is identified and better implemented, students' classroom engagement rises.
Keywords: communicative approaches, comprehension, role playing, speaking skills
Procedia PDF Downloads 178
38572 Constructing the Density of States from the Parallel Wang Landau Algorithm Overlapping Data
Authors: Arman S. Kussainov, Altynbek K. Beisekov
Abstract:
This work focuses on building an efficient, universal procedure to construct a single density of states from the multiple pieces of data provided by a parallel implementation of the Wang-Landau Monte Carlo algorithm. The Ising and Potts models were used as examples of two-dimensional spin lattices for which the densities of states were constructed. The sampled energy space was distributed between the individual walkers with certain overlaps. This was done to accommodate the latest development of the algorithm, the density-of-states replica exchange technique. Several factors of immediate importance for the seamless stitching process have been considered. These include, but are not limited to, the speed and universality of the initial parallel algorithm implementation as well as the data post-processing required to produce the expected smooth density of states.
Keywords: density of states, Monte Carlo, parallel algorithm, Wang Landau algorithm
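A minimal sketch of the stitching step: Wang-Landau walkers return ln g(E) on overlapping energy windows, each defined only up to an additive constant, so successive pieces are shifted to agree on the mean of their overlap before merging; the synthetic windows and the mean-offset rule are illustrative assumptions.

```python
# Stitch overlapping log-density-of-states windows into one curve.
import numpy as np

def stitch_log_dos(pieces):
    """pieces: list of (energies, ln_g) arrays on overlapping windows."""
    energies, ln_g = [np.array(a) for a in pieces[0]]
    for e_next, g_next in pieces[1:]:
        e_next, g_next = np.array(e_next), np.array(g_next)
        overlap = np.intersect1d(energies, e_next)
        shift = (ln_g[np.isin(energies, overlap)].mean()
                 - g_next[np.isin(e_next, overlap)].mean())
        g_next = g_next + shift                    # align additive constants
        keep = ~np.isin(e_next, energies)
        energies = np.concatenate([energies, e_next[keep]])
        ln_g = np.concatenate([ln_g, g_next[keep]])
    order = np.argsort(energies)
    return energies[order], ln_g[order]

e1, e2 = np.arange(0, 11), np.arange(8, 21)
pieces = [(e1, 0.5 * e1), (e2, 0.5 * e2 + 7.0)]    # same curve, offset window
E, lnG = stitch_log_dos(pieces)
print(np.allclose(lnG, 0.5 * E))                   # True: offset removed
```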
Procedia PDF Downloads 412
38571 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics
Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir
Abstract:
Due to their favourable material characteristics, fiber-reinforced plastics are amongst the main topics of all current lightweight construction megatrends. Especially in transportation, in trends ranging from aeronautics through the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, lighthouses, and pool areas represent cheap and robust solutions. Commercially available laser tools such as carbon dioxide (CO₂) gas lasers, frequency-tripled solid-state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used; these tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is, first of all, the generation of a parameter matrix for laser processing of each material used, for a Tm-fiber laser system (wavelength 2 µm). These parameters include the heat-affected zone, process gas pressure, workpiece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm laser offers essential advantages for future laser processes like cutting, welding, ablating for repair, and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused-silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature. There is one more feature that is economically important for boat, automotive, and military manufacturing projects: the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables selective removal of it for repair procedures.
Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone
Procedia PDF Downloads 193
38570 An Approach to Practical Determination of Fair Premium Rates in Crop Hail Insurance Using Short-Term Insurance Data
Authors: Necati Içer
Abstract:
Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage to crop production. Predicting insurance premium rates from short-term data is a major difficulty in numerous nations because of the unique characteristics of hailstorms. This study aims to suggest a feasible approach for establishing equitable premium rates in crop-hail insurance for nations with only short-term insurance data. The primary goal of the rate-making process is to determine premium rates for villages with high and with zero loss costs and to enhance their credibility. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article aims to show how to incorporate the temporal and spatial elements into determining fair premium rates using short-term insurance data. The article ends with a suggestion on the ultimate premium rates for insurance contracts.
Keywords: crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters
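A minimal sketch of one standard actuarial way to stabilise rates computed from short series, shown here only as an illustration and not as the author's method: a credibility weight Z blends each village's own loss cost with that of a larger surrounding region, so villages with zero or extreme short-term losses are pulled toward the regional experience; the weighting rule and numbers are illustrative assumptions.

```python
# Credibility-weighted premium rate: blend village and regional loss costs.
def credibility_rate(village_loss_cost, regional_loss_cost, n_years, k=10.0):
    z = n_years / (n_years + k)      # more years of data -> more own weight
    return z * village_loss_cost + (1 - z) * regional_loss_cost

villages = {"A (zero losses)": 0.0, "B (one extreme hail year)": 0.25}
for name, lc in villages.items():
    rate = credibility_rate(lc, regional_loss_cost=0.04, n_years=5)
    print(f"Village {name}: raw {lc:.2%} -> credibility-weighted {rate:.2%}")
```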
Procedia PDF Downloads 55
38569 For Post-traumatic Stress Disorder Counselors in China, the United States, and around the Globe, Cultural Beliefs Offer Challenges and Opportunities
Authors: Anne Giles
Abstract:
Trauma is generally defined as an experience, or multiple experiences, overwhelming a person's ability to cope. Over time, many people recover from the neurobiological, physical, and emotional effects of trauma on their own. For some people, however, troubling symptoms develop over time that can result in distress and disability. This cluster of symptoms is classified as Post-traumatic Stress Disorder (PTSD). People who meet the criteria for PTSD and other trauma-related disorder diagnoses often hold a set of understandable but unfounded beliefs about traumatic events that cause undue suffering. Becoming aware of such unhelpful beliefs, termed "cognitive distortions", and challenging them is the realm of Cognitive Behavior Therapy (CBT). A form of CBT found by researchers to be especially effective for PTSD is Cognitive Processing Therapy (CPT). Through the compassionate use of CPT, people identify, examine, challenge, and relinquish unhelpful beliefs, thereby reducing symptoms and suffering. Widely held cultural beliefs can interfere with the progress of recovery from trauma-related disorders. Although highly revered, largely unquestioned, and often stabilizing, cultural beliefs can be founded in simplistic, dichotomous thinking, i.e., things are all right or all wrong, all good or all bad. Reality, however, is nuanced and complex. After studying examples of cultural beliefs from China and the United States and how these might interfere with trauma recovery, trauma counselors can help clients derive criteria for preserving helpful beliefs; discover, examine, and jettison unhelpful beliefs; reduce trauma symptoms; and live their lives more freely and fully.
Keywords: cognitive processing therapy (CPT), cultural beliefs, post-traumatic stress disorder (PTSD), trauma recovery
Procedia PDF Downloads 250
38568 Numerical Resolving of Net Faradaic Current in Fast-Scan Cyclic Voltammetry Considering Induced Charging Currents
Authors: Gabriel Wosiak, Dyovani Coelho, Evaldo B. Carneiro-Neto, Ernesto C. Pereira, Mauro C. Lopes
Abstract:
In this work, the theoretical and experimental effects of induced charging currents on fast-scan cyclic voltammetry (FSCV) are investigated. Induced charging currents arise from the effect of ohmic drop in electrochemical systems, which depends on the presence of an uncompensated resistance. They cause the capacitive contribution to the total current to differ from the capacitive current measured in the absence of electroactive species. The paper shows that the induced charging current is relevant when the capacitive current magnitude is close to the total current, even for systems with a low time constant. In these situations, the conventional background subtraction method may be inaccurate. A method is developed that separates the faradaic and capacitive currents by using a combination of voltammetric experimental data and finite element simulation to obtain a potential-dependent capacitance. The method was tested in a standard electrochemical cell with platinum ultramicroelectrodes under different experimental conditions, as well as on data previously reported in the literature. The proposed method allows the real capacitive current to be separated even in situations where the conventional background subtraction method is clearly inappropriate.
Keywords: capacitive current, fast-scan cyclic voltammetry, finite-element method, electroanalysis
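A minimal numerical sketch of the separation idea: once a potential-dependent capacitance C(E) is available (here a made-up quadratic, standing in for the one the paper obtains from finite element simulation), the capacitive contribution i_c = C(E) * dE/dt is evaluated along the potential program and subtracted from the total current to leave the net faradaic part; all waveform and capacitance values are illustrative assumptions.

```python
# Subtract a potential-dependent capacitive current from the total FSCV signal.
import numpy as np

scan_rate = 100.0                                   # V/s, fast-scan regime
t = np.linspace(0, 0.02, 2000)                      # one triangular cycle
E = 0.5 - np.abs(scan_rate * t - 1.0)               # -0.5 V -> +0.5 V -> back
dEdt = np.gradient(E, t)

C_of_E = 2e-6 * (1 + 0.3 * E ** 2)                  # F, potential-dependent
i_capacitive = C_of_E * dEdt
i_faradaic_true = 1e-5 * np.exp(-((E - 0.2) / 0.05) ** 2) * np.sign(dEdt)
i_total = i_faradaic_true + i_capacitive

i_faradaic = i_total - i_capacitive                 # net faradaic current
# At fast scan rates the capacitive part dominates the raw measurement:
print(f"peak |i_c| = {np.max(np.abs(i_capacitive)):.2e} A, "
      f"peak |i_f| = {np.max(np.abs(i_faradaic)):.2e} A")
```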
Procedia PDF Downloads 75
38567 High Efficient Biohydrogen Production from Cassava Starch Processing Wastewater by Two Stage Thermophilic Fermentation and Electrohydrogenesis
Authors: Peerawat Khongkliang, Prawit Kongjan, Tsuyoshi Imai, Poonsuk Prasertsan, Sompong O-Thong
Abstract:
A two-stage thermophilic fermentation and electrohydrogenesis process was used to convert cassava starch processing wastewater into hydrogen gas. The maximum hydrogen yield from the fermentation stage by Thermoanaerobacterium thermosaccharolyticum PSU-2 was 248 mL H2/g-COD at an optimal pH of 6.5. An optimum hydrogen production rate of 820 mL/L/d and a yield of 200 mL/g-COD were obtained at an HRT of 2 days in the fermentation stage. The fermentation effluent from the cassava starch processing wastewater consisted of acetic acid, butyric acid, and propionic acid. This effluent was used as feedstock for hydrogen production by microbial electrolysis cells (MECs) at an applied voltage of 0.6 V in the second stage, where an additional 657 mL H2/g-COD was produced. Energy efficiency based on the electricity needed for the MEC was 330%, with a COD removal of 95%. The overall hydrogen yield was 800-900 mL H2/g-COD. Microbial community analysis of the electrohydrogenesis stage by DGGE shows that exoelectrogens belonging to Acidiphilium sp., Geobacter sulfurreducens, and Thermincola sp. dominated at the anode. These results show that the two-stage thermophilic fermentation and electrohydrogenesis process improved hydrogen production performance, with high hydrogen yields, high gas production rates, and high COD removal efficiency.
Keywords: cassava starch processing wastewater, biohydrogen, thermophilic fermentation, microbial electrolysis cell
Procedia PDF Downloads 343
38566 The Use of Authentic Materials in the Chinese Language Classroom
Authors: Yiwen Jin, Jing Xiao, Pinfang Su
Abstract:
The idea of adapting authentic materials in language teaching comes from the communicative method of the 1970s. Unlike the language in textbooks, authentic material is not deliberately written; it comes from native speakers' real lives and contains real information that can meet social needs. It can improve learners' interest, create authentic context, and improve learners' communicative competence. Authentic materials play an important role in the CFL (Chinese as a foreign language) classroom, and different types of authentic materials can be used in different ways during learning and teaching. Because of the COVID-19 pandemic, many Chinese learners are learning Chinese without a real language environment. Although there are some well-written textbooks, there is a certain distance between textbook language materials and daily life, and learners cannot automatically fill this gap. That is why it is necessary to apply authentic materials as a supplement to the language textbook to create a real context. Chinese teachers around the world are working together, trying to integrate resources and apply authentic materials through different approaches. They apply authentic materials in the form of new textbooks, manuals, apps, and short videos that they collect and create to help Chinese learning and teaching. A review of previous research on authentic materials and of Chinese teachers' attempts to adapt them in the classroom is offered in this manuscript.
Keywords: authentic materials, Chinese as a second language, developmental use of digital resources, materials development for language teaching
Procedia PDF Downloads 174
38565 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System
Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek
Abstract:
This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a 'GIS dominant' approach, with the GIS tools used to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When the problem is so complex that it integrates a spatial dimension, it makes sense to combine the MCDM process with other approaches like data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology of SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with concepts from data mining, provides powerful tools to highlight inductions and information not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution: GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information, but its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
Keywords: data warehouse, GIS, MCDM, SOLAP
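A minimal sketch of one common way MCDM is coupled to raster GIS layers, shown as an illustration rather than the paper's methodology: a weighted-sum overlay of normalised criterion grids produces a suitability surface; the criteria, weights, and grid values are illustrative assumptions.

```python
# Weighted-sum MCDM overlay on normalised raster criterion layers.
import numpy as np

def normalise(layer, benefit=True):
    lo, hi = layer.min(), layer.max()
    scaled = (layer - lo) / (hi - lo + 1e-12)
    return scaled if benefit else 1.0 - scaled    # cost criteria are inverted

rng = np.random.default_rng(5)
slope = rng.uniform(0, 30, size=(50, 50))         # cost criterion (degrees)
road_dist = rng.uniform(0, 5000, size=(50, 50))   # cost criterion (metres)
land_score = rng.uniform(0, 1, size=(50, 50))     # benefit criterion

weights = {"slope": 0.5, "road_dist": 0.3, "land": 0.2}   # sum to 1
suitability = (weights["slope"] * normalise(slope, benefit=False)
               + weights["road_dist"] * normalise(road_dist, benefit=False)
               + weights["land"] * normalise(land_score))
print(suitability.shape, suitability.min(), suitability.max())
```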
Procedia PDF Downloads 178
38564 Inversion of Electrical Resistivity Data: A Review
Authors: Shrey Sharma, Gunjan Kumar Verma
Abstract:
High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveying. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for the acquisition, processing, and inversion of electrical resistivity data, based on a compilation of academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
Keywords: inversion, limitations, optimization, resistivity
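A minimal sketch of the linearized least-squares update at the core of such 2-D inversions: given a Jacobian J of the forward response and a roughness operator L, each iteration solves (JᵀJ + λLᵀL) dm = Jᵀ(d - F(m)) for the model update; the toy linear forward model here stands in for a real finite-difference/finite-element resistivity solver, and all sizes and values are illustrative assumptions.

```python
# Smoothness-regularized Gauss-Newton update for a linearized inverse problem.
import numpy as np

def gauss_newton_step(J, residual, L, lam):
    lhs = J.T @ J + lam * (L.T @ L)
    rhs = J.T @ residual
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(6)
n_data, n_model = 60, 40
J = rng.normal(size=(n_data, n_model))            # stand-in sensitivity matrix
m_true = np.sin(np.linspace(0, 3 * np.pi, n_model))
d = J @ m_true + 0.01 * rng.normal(size=n_data)   # noisy "apparent resistivity"

L = np.eye(n_model) - np.eye(n_model, k=1)        # first-difference roughness
m = np.zeros(n_model)
for _ in range(5):                                # a few linearized iterations
    m = m + gauss_newton_step(J, d - J @ m, L[:-1], lam=1.0)
print(np.linalg.norm(m - m_true) / np.linalg.norm(m_true))  # relative misfit
```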
Procedia PDF Downloads 365