Search results for: computer processing of large databases
10795 Kinematic Behavior of Geogrid Reinforcements during Earthquakes
Authors: Ahmed Hosny Abdel-Rahman, Mohamed Abdel-Moneim
Abstract:
Reinforced earth structures are generally subjected to cyclic loading generated from earthquakes. This paper presents a summary of the results and analyses of a testing program carried out in a large-scale multi-function geosynthetic testing apparatus that accommodates soil samples up to 1.0 m³. This apparatus performs different shear and pullout tests under both static and cyclic loading. The testing program was carried out to investigate the controlling factors affecting soil/geogrid interaction under cyclic loading. The extensibility of the geogrids, the applied normal stresses, the characteristics of the cyclic loading (frequency and amplitude), and the initial static load within the geogrid sheet were considered in the testing program. Based on the findings of the testing program, the effects of these parameters on the pullout resistance of geogrids, as well as on displacement mobility under cyclic loading, were evaluated. Conclusions and recommendations for the design of reinforced earth walls under cyclic loading are presented.
Keywords: geogrid, soil, interface, cyclic loading, pullout, large scale testing
Procedia PDF Downloads 622
10794 Enhancing Seawater Desalination Efficiency with Combined Reverse Osmosis and Vibratory Shear-Enhanced Processing for Higher Conversion Rates and Reduced Energy Consumption
Authors: Reda Askouri, Mohamed Moussetad, Rhma Adhiri
Abstract:
Reverse osmosis (RO) is one of the most widely used techniques for seawater desalination. However, the conversion rate of this method is generally limited to 35-45% due to the high-pressure capacity of the membranes. Additionally, the specific energy consumption (SEC) for seawater desalination is high, necessitating energy recovery systems to minimise energy consumption. This study aims to enhance the performance of seawater desalination by combining RO with a vibratory shear-enhanced processing (VSEP) technique. The RO unit in this study comprises two stages, each powered by a hydraulic turbocharger that increases the pressure in both stages. The concentrate from the second stage is then directly processed by VSEP technology. The results demonstrate that the permeate water obtained exhibits high quality and that the conversion rate is significantly increased, reaching high percentages with low SEC. Furthermore, the high concentration of total solids in the concentrate allows for potential exploitation within the environmental protection framework. By valorising the concentrated waste, it is possible to reduce the environmental impact while increasing the overall efficiency of the desalination process.
Keywords: specific energy consumption, vibratory shear enhanced process, environmental challenge, water recovery
Procedia PDF Downloads 12
10793 A Low Phase Noise CMOS LC Oscillator with Tail Current-Shaping
Authors: Amir Mahdavi
Abstract:
In this paper, a circuit topology of voltage-controlled oscillators (VCOs) suitable for ultra-low-phase-noise operation is introduced. To this end, a new low-phase-noise cross-coupled oscillator is designed by taking the general cross-coupled topology and adding a differential stage for tail current shaping. In addition, a tail current-shaping technique to improve phase noise in differential LC VCOs is presented. The tail current becomes large when the oscillator output voltage arrives at its maximum or minimum value, which is when the sensitivity of the output phase to noise is smallest. Conversely, the tail current becomes small when the phase noise sensitivity is large. The proposed circuit does not use extra power or extra noisy active devices. Furthermore, this topology occupies a small area. Simulation results show an improvement in phase noise of 2.5 dB under the same conditions at a carrier frequency of 1 GHz for GSM applications. The power consumption of the proposed circuit is 2.44 mW, and a figure of merit (FOM) of -192.2 dBc/Hz is achieved for the new oscillator.
Keywords: LC oscillator, low phase noise, current shaping, diff mode
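For reference, the oscillator figure of merit quoted above is conventionally computed from the phase noise L(Δf) at offset Δf, the carrier frequency f₀, and the dissipated power (a standard definition; the offset frequency used is not stated in the abstract):

```latex
\mathrm{FOM} = L(\Delta f) - 20\log_{10}\!\left(\frac{f_0}{\Delta f}\right) + 10\log_{10}\!\left(\frac{P_{\mathrm{diss}}}{1\,\mathrm{mW}}\right)
```

With f₀ = 1 GHz and P_diss = 2.44 mW, the reported FOM of -192.2 dBc/Hz would correspond to a phase noise of roughly -136 dBc/Hz if evaluated at a 1 MHz offset.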
Procedia PDF Downloads 600
10792 Efficacy of Computer Mediated Power Point Presentations on Students' Learning Outcomes in Basic Science in Oyo State, Nigeria
Authors: Sunmaila Oyetunji Raimi, Olufemi Akinloye Bolaji, Abiodun Ezekiel Adesina
Abstract:
The lingering poor performance of students in basic science spells doom for the vibrant scientific and technological development that pivots the economic, social and physical upliftment of any nation. This calls for identifying appropriate strategies for imparting basic science knowledge and attitudes to the teeming youths in secondary schools. This study, therefore, determined the impact of computer-mediated PowerPoint presentations on students' achievement in basic science in Oyo State, Nigeria. A pre-test, post-test, control-group quasi-experimental design was adopted for the study. Two hundred and five junior secondary two students, selected using a stratified random sampling technique, participated in the study. Three research questions and three hypotheses guided the study. Two evaluative instruments – Students' Basic Science Attitudes Scale (SBSAS, r = 0.91) and Students' Knowledge of Basic Science Test (SKBST, r = 0.82) – were used for data collection. Descriptive statistics of mean and standard deviation and inferential statistics of ANCOVA and the Scheffé post-hoc test were used to analyse the data. The results indicated a significant main effect of treatment on students' cognitive (F(1,200) = 171.680; p < 0.05) and attitudinal (F(1,200) = 34.466; p < 0.05) achievement in basic science, with the experimental group having a higher mean gain than the control group. Gender had a significant main effect (F(1,200) = 23.382; p < 0.05) on students' cognitive outcomes but not on attitudinal achievement in basic science. The study therefore recommended, among others, that computer-mediated PowerPoint presentations should be incorporated into the curriculum methodology of basic science in secondary schools.
Keywords: basic science, computer mediated power point presentations, gender, students' achievement
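A minimal sketch of the kind of ANCOVA used here (treatment group as factor, pre-test score as covariate); the variable names and toy data are hypothetical, not taken from the study:

```python
# Hypothetical illustration of a one-way ANCOVA with a pre-test covariate,
# analogous to the analysis described above (requires statsmodels, pandas).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# toy data: post-test scores by group, adjusted for pre-test scores
df = pd.DataFrame({
    "posttest": [14, 16, 18, 11, 10, 13, 17, 19, 12, 11],
    "pretest":  [ 8,  9, 10,  8,  7,  9, 10, 11,  8,  7],
    "group":    ["exp"] * 5 + ["ctrl"] * 5,
})

model = ols("posttest ~ C(group) + pretest", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # covariate-adjusted F and p for treatment
```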
Procedia PDF Downloads 429
10791 Effect of Curing Temperature on the Textural and Rheological Properties of Gelatine-SDS Hydrogels
Authors: Virginia Martin Torrejon, Binjie Wu
Abstract:
Gelatine is a protein biopolymer obtained from the partial hydrolysis of animal tissues which contain collagen, the primary structural component in connective tissue. Gelatine hydrogels have attracted considerable research in recent years as an alternative to synthetic materials due to their outstanding gelling properties, biocompatibility and compostability. Surfactants, such as sodium dodecyl sulfate (SDS), are often used in hydrogel solutions as surface modifiers or solubility enhancers, and their incorporation can influence the hydrogel's viscoelastic properties and, in turn, its processing and applications. The literature usually focuses on studying the impact of formulation parameters (e.g., gelatine content, gelatine strength, additives incorporation) on gelatine hydrogel properties, but processing parameters, such as curing temperature, are commonly overlooked. For example, some authors have reported a decrease in gel strength at lower curing temperatures, but there is a lack of research on the systematic viscoelastic characterisation of high-strength gelatine and gelatine-SDS systems over a wide range of curing temperatures. This knowledge is essential to meet and adjust the technological requirements for different applications (e.g., viscosity, setting time, gel strength or melting/gelling temperature). This work investigated the effect of curing temperature (10, 15, 20, 23, 25 and 30°C) on the elastic modulus (G') and melting temperature of high-strength gelatine-SDS hydrogels, at 10 wt% and 20 wt% gelatine contents, by small-amplitude oscillatory shear rheology coupled with Fourier Transform Infrared Spectroscopy. It also correlates the gel strength obtained by rheological measurements with the gel strength measured by texture analysis. The rheological behaviour of gelatine and gelatine-SDS hydrogels strongly depended on the curing temperature, and their gel strength and melting temperature can be slightly modified to adjust them to given processing and application needs. Lower curing temperatures led to gelatine and gelatine-SDS hydrogels with considerably higher storage modulus; however, their melting temperature was lower than that of gels cured at higher temperatures, which had lower gel strength. This effect was more considerable at longer timescales. This behaviour is attributed to the development of thermal-resistant structures in the lower-strength gels cured at higher temperatures.
Keywords: gelatine gelation kinetics, gelatine-SDS interactions, gelatine-surfactant hydrogels, melting and gelling temperature of gelatine gels, rheology of gelatine hydrogels
Procedia PDF Downloads 101
10790 Formal Verification of Cache System Using a Novel Cache Memory Model
Authors: Guowei Hou, Lixin Yu, Wei Zhuang, Hui Qin, Xue Yang
Abstract:
Formal verification is proposed to ensure the correctness of the design and make functional verification more efficient. Since cache plays a vital role in the design of a System on Chip (SoC), and a cache with a Memory Management Unit (MMU) and cache memory unit makes the state space too large to verify by simulation, a formal verification approach is presented for such a system design. In this paper, a formal model-checking verification flow is suggested, and a new cache memory model called the "exhaustive search model" is proposed. Instead of using a large RAM to represent the whole cache memory, the exhaustive search model employs just two cache blocks. For a cache system containing a data cache (Dcache) and an instruction cache (Icache), the Dcache memory model and Icache memory model are established separately using the same mechanism. Finally, the novel model is employed in the verification of a cache which is a module of a custom-built SoC system that has been applied in practice, and the result shows that the cache system is verified correctly using the exhaustive search model, which makes the verification much more manageable and flexible.
Keywords: cache system, formal verification, novel model, system on chip (SoC)
Procedia PDF Downloads 496
10789 Empirical Investigation of Gender Differences in Information Processing Style, Tinkering, and Self-Efficacy for Robot Tele-Operation
Authors: Dilruba Showkat, Cindy Grimm
Abstract:
As robots become more ubiquitous, it is important for us to understand how different groups of people respond to possible ways of interacting with a robot. In this study, we focused on gender differences while users were tele-operating a humanoid robot that was physically co-located with them. We investigated three factors during the human-robot interaction: (1) information processing strategy, (2) self-efficacy, and (3) tinkering or exploratory behavior. The experimental results show that information on how to use the robot was processed comprehensively by the female participants, whereas males processed it selectively (p < 0.001). Males were more confident when using the robot than females (p = 0.0002). Males tinkered more with the robot than females (p = 0.0021). We found that tinkering was positively correlated (p = 0.0068) with task success and negatively correlated (p = 0.0032) with task completion time. Tinkering might have resulted in greater task success and lower task completion time for males. Findings from this research can be used to make design decisions for robots and open new research directions. Our results show the importance of accounting for gender differences when developing interfaces for interacting with robots.
Keywords: humanoid robots, tele-operation, gender differences, human-robot interaction
Procedia PDF Downloads 167
10788 Electronic and Computer-Assisted Refreshable Braille Display Developed for Visually Impaired Individuals
Authors: Ayşe Eldem, Fatih Başçiftçi
Abstract:
The Braille alphabet is an important tool that enables visually impaired individuals to lead a comfortable life like those who have normal vision. For this reason, new applications related to the Braille alphabet are being developed. In this study, a new refreshable Braille display was developed to help visually impaired individuals learn the Braille alphabet more easily. By means of this system, any text downloaded to a computer can be read at that moment by a visually impaired individual, who feels it with his/her hands. Through this electronic device, the aim was to make learning the Braille alphabet easier for visually impaired individuals, with whom the necessary tests were conducted.
Keywords: visually impaired individual, Braille, Braille display, refreshable Braille display, USB
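As an aside, the core text-to-Braille mapping behind such a display can be sketched in a few lines. The dot patterns follow standard six-dot literary Braille, and the Unicode Braille Patterns block (U+2800) encodes dots 1-6 as bits 0-5; this is an illustrative sketch, not the authors' firmware:

```python
# Illustrative sketch: map lowercase text to six-dot Braille cells.
# Dots 1-6 map to bits 0-5 of a Unicode Braille Patterns code point (U+2800 + mask).
DOTS = {  # standard literary Braille dot numbers for a few letters
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5), "i": (2, 4), "j": (2, 4, 5),
}

def to_braille(text: str) -> str:
    cells = []
    for ch in text.lower():
        if ch == " ":
            cells.append(chr(0x2800))  # empty cell
            continue
        mask = sum(1 << (d - 1) for d in DOTS.get(ch, ()))
        cells.append(chr(0x2800 + mask))
    return "".join(cells)

print(to_braille("abc hi"))  # in hardware, each bit would drive one pin actuator
```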
Procedia PDF Downloads 345
10787 A Novel Computer-Generated Hologram (CGH) Achieved Scheme Generated from Point Cloud by Using a Lens Array
Authors: Wei-Na Li, Mei-Lan Piao, Nam Kim
Abstract:
We propose a novel computer-generated hologram (CGH) scheme, wherein the CGH is generated from a point cloud obtained through a mapping relationship from a series of elemental images captured from a real three-dimensional (3D) object by using a lens array. This scheme is composed of three procedures: mapping from elemental images to a point cloud, hologram generation, and hologram display. A mapping method is devised to obtain virtual volume data (a point cloud) from a series of elemental images. This mapping method consists of two steps. Firstly, the coordinate (x, y) pairs and their numbers of occurrences are calculated from the series of sub-images, which are generated from the elemental images. Secondly, a series of corresponding coordinates (x, y, z) are calculated from the elemental images. A hologram is then generated from the volume data calculated in the previous two steps. Eventually, a spatial light modulator (SLM) and a green laser beam are utilized to display this hologram and reconstruct the original 3D object. In this paper, in order to present a more autostereoscopic display of a real 3D object, we successfully obtained the actual depth data of every discrete point of the real 3D object and overcame the inherent drawbacks of the depth camera by obtaining the point cloud from the elemental images.
Keywords: elemental image, point cloud, computer-generated hologram (CGH), autostereoscopic display
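The hologram-generation step from a point cloud is typically a point-source superposition; a minimal numerical sketch of that idea (with an assumed wavelength, pixel pitch and geometry, not the authors' exact parameters) might look like this:

```python
# Minimal point-cloud CGH sketch: sum spherical wavefronts from each 3D point
# onto the hologram plane (assumed parameters; not the authors' setup).
import numpy as np

wavelength = 532e-9          # green laser, as in the abstract
k = 2 * np.pi / wavelength
pitch = 8e-6                 # assumed SLM pixel pitch
N = 512                      # hologram resolution (N x N)

x = (np.arange(N) - N // 2) * pitch
X, Y = np.meshgrid(x, x)

# toy point cloud: (x, y, z, amplitude); z is distance from the hologram plane
points = [(0.0, 0.0, 0.10, 1.0), (1e-3, -5e-4, 0.12, 0.8)]

H = np.zeros((N, N), dtype=complex)
for px, py, pz, a in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    H += a * np.exp(1j * k * r) / r   # spherical wave from the point source

phase_hologram = np.angle(H)          # phase-only pattern for an SLM
```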
Procedia PDF Downloads 584
10786 Hindi Speech Synthesis by Concatenation of Recognized Hand Written Devnagri Script Using Support Vector Machines Classifier
Authors: Saurabh Farkya, Govinda Surampudi
Abstract:
Optical Character Recognition is one of the current major research areas. This paper is focused on the recognition of Devanagari script and its sound generation. The paper consists of two parts: first, Optical Character Recognition of handwritten Devanagari script; second, speech synthesis of the recognized text. The paper presents an implementation of support vector machines for the purpose of Devanagari script recognition. The support vector machine was trained with multi-domain features: transform-domain and spatial-domain (structural) features. The transform domain includes the wavelet features of the character. The structural domain consists of distance-profile features and gradient features. Segmentation of the text document was done at three levels: line segmentation, word segmentation, and character segmentation. Pre-processing of the characters was done with the help of various pre-processing and morphological operations: Otsu's algorithm, erosion, dilation, filtering and thinning techniques. The algorithm was tested on a self-prepared database, a collection of various handwriting samples. Further, Unicode was used to convert the recognized Devanagari text into an understandable computer document. The document so obtained is an array of codes, which was used to generate digitized text and to synthesize Hindi speech. Phonemes from the self-prepared database were used to generate speech for the scanned document using a concatenation technique.
Keywords: Optical Character Recognition (OCR), Text to Speech (TTS), Support Vector Machines (SVM), Library of Support Vector Machines (LIBSVM)
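To make the classification stage concrete, here is a hedged sketch of training an SVM on pre-extracted character feature vectors; scikit-learn's SVC (which wraps LIBSVM) stands in for the authors' setup, feature extraction is omitted, and the data is synthetic:

```python
# Sketch of the SVM classification stage: feature vectors (e.g., wavelet,
# distance-profile and gradient features concatenated) -> character label.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))          # 300 characters, 64-dim feature vectors
y = rng.integers(0, 10, size=300)       # 10 hypothetical Devanagari classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_tr)

clf = SVC(kernel="rbf", C=10.0, gamma="scale")  # typical LIBSVM-style settings
clf.fit(scaler.transform(X_tr), y_tr)
print("accuracy:", clf.score(scaler.transform(X_te), y_te))
```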
Procedia PDF Downloads 499
10785 A Study of Problems and Needs Compared across Garment Industries in the Nonthaburi and Bangkok Area
Authors: Thepnarintra Praphanphat
Abstract:
The purposes of this study were to investigate the garment industry's conditions, problems, and needs for assistance. The population of the study was 504 managers or managing directors of finished-apparel garment establishments holding permission from the Department of Industrial Works 28, Ministry of Industry, as of January 1, 2012. The sample size, determined using the Taro Yamane formula at a 95% confidence level with ±5% deviation, was 224 managers. Questionnaires were used to collect the data. Percentage, frequency, arithmetic mean, standard deviation, t-test, ANOVA, and LSD were used to analyze the data. It was found that most establishments were of a large size, had operated in the form of a limited company for more than 15 years, and mostly produced garments for working women. All investment was made by Thai people. The products were made to order and distributed domestically and internationally. Total sales in 2010, 2011, and 2012 were almost the same. With respect to the problems of operating the business, the study indicated that, as a whole, by aspects, and by items, they were at a high level. The comparison of the level of problems of operating a garment business as classified by general condition showed that problems occurring in businesses of different sizes were, as a whole, not different. Taking aspects into consideration, it was found that the level of problems in relation to production was different; medium establishments had more problems in production than those of small and large sizes. According to the by-item analysis, five problems were found to be different, namely problems concerning employees, machine maintenance, number of designers, and price competition. Such problems in the medium establishments were at a higher level than those in the small and large establishments. Regarding business age, the examination yielded no differences as a whole, by aspects, or by items. The statistical significance level of this study was set at .05.
Keywords: garment industry, garment, fashion, competitive enhancement project
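For clarity, the Taro Yamane sample-size formula referenced above gives, with the study's population of N = 504 and precision e = 0.05:

```latex
n = \frac{N}{1 + N e^{2}} = \frac{504}{1 + 504 \times 0.05^{2}} \approx 223 \;\rightarrow\; 224 \text{ managers}
```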
Procedia PDF Downloads 187
10784 Economic Analysis of Coffee Cultivation in Kodagu District of Karnataka State, India
Authors: P. S. Dhananjaya Swamy, B. Chinnappa, G. B. Ramesh, Naveen P. Kumar
Abstract:
Kodagu district is one of the most densely forested districts in India, with around sixty-five per cent of its geographical area under tree cover. Nearly 53 per cent of the flora of Kodagu is endemic. The district is also a hotspot of endemic orchids, found mainly in the Thadiandamol. Shade-grown, eco-friendly coffee farms are perhaps among the select few places on this planet where nature runs wild. Kodagu accounts for more than 8.8 per cent of the floral diversity of Karnataka state. Estimation of the unit cost of cultivation plays a vital role in determining governmental programmes and their market intervention policies. On average, planters incurred around Rs. 17041 per acre. The extent of production risk was highest among the small category of planters (66%) compared to the other two, exhibiting production instability. The results show that coffee productivity in medium plantations was 1051.2 kg per acre, as against 758.5 and 789.2 kg in the case of small and large plantations. The annual net return per acre was highest in the case of medium planters (Rs. 26109.3), as against Rs. 20566.7 and Rs. 18572.7 in the case of small and large planters. The cost of production was lowest in the case of small planters (Rs. 18.9 per kg of output), followed by medium planters (Rs. 21.2 per kg of output) and large planters (Rs. 22.5 per kg of output). The productivity of coffee is lower whenever it is grown under high shade and native tree cover: around 6 quintals per acre, compared with around 8.9 quintals per acre under low-shade conditions, without a significant difference in the amount invested in growing coffee. Net gain was lower by Rs. 15.5 per kg for planters growing under high shade and native tree cover when compared with low shade and exotic tree cover.
Keywords: coffee, cultivation, economics, Kodagu
Procedia PDF Downloads 196
10783 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework
Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim
Abstract:
Background modeling and subtraction in video analysis has been widely proved to be an effective method for moving object detection in many computer vision applications. Over the past years, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are two of the most frequently occurring issues in practical situations. This paper presents a new two-layer model based on the codebook algorithm incorporated with a local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is designed as a block-based codebook combining an LBP histogram with the mean values of the RGB color channels. Because of the invariance of LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to reinforce the precision of the outputs of the first layer, in order to further eliminate false positives. As a result, the proposed approach can greatly improve accuracy under circumstances of dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
Keywords: background subtraction, codebook model, local binary pattern, dynamic background, illumination change
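A minimal sketch of the LBP texture measure that makes the first layer tolerant to monotonic illumination changes (basic 8-neighbour LBP; the paper's exact variant is not specified in the abstract):

```python
# Basic 8-neighbour LBP sketch: each pixel becomes an 8-bit code describing
# which neighbours are at least as bright as the centre. Monotonic gray-scale
# changes leave these codes (and hence block histograms) unchanged.
import numpy as np

def lbp8(img: np.ndarray) -> np.ndarray:
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code

def block_histogram(codes: np.ndarray) -> np.ndarray:
    h, _ = np.histogram(codes, bins=256, range=(0, 256))
    return h / max(codes.size, 1)   # normalised LBP histogram for one block
```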
Procedia PDF Downloads 217
10782 Effect of Different Processing Methods on the Proximate, Functional, Sensory, and Nutritional Properties of Weaning Foods Formulated from Maize (Zea mays) and Soybean (Glycine max) Flour Blends
Authors: C. O. Agu, C. C. Okafor
Abstract:
Maize and soybean flours were produced using different processing methods, which include fermentation (FWF), roasting (RWF) and malting (MWF). Products from the different methods were mixed in the ratio 60:40 maize/soybean, respectively. These composites, mixed with other ingredients such as sugar, vegetable oil, vanilla flavour and vitamin mix, were analyzed for proximate composition and physical/functional, sensory and nutritional properties. The results for protein content ranged between 6.25% and 16.65%, with sample RWF having the highest value. Crude fibre values ranged from 3.72 to 10.0%, carbohydrate from 58.98% to 64.2%, and ash from 1.27 to 2.45%. Physical and functional properties such as bulk density, wettability and gelation capacity had values between 0.74 and 0.76 g/ml, 20.33 and 46.33 min, and 0.73 to 0.93 g/ml, respectively. For sensory quality, colour, flavour, taste, texture and general acceptability were determined. In terms of colour and flavour, there was no significant difference (P < 0.05), while the values for taste ranged between 4.89 and 7.11, texture 5.50 to 8.38, and general acceptability 6.09 and 7.89. Nutritionally, there was no significant difference (P < 0.05) between sample RWF and the control in all parameters considered. Samples FWF and MWF showed significantly (P < 0.05) lower values in all parameters determined. In the light of the above findings, the roasting method is highly recommended in the production of weaning foods.
Keywords: fermentation, malting, ratio, roasting, wettability
Procedia PDF Downloads 304
10781 Processing of Input Material as a Way to Improve the Efficiency of the Glass Production Process
Authors: Joanna Rybicka-Łada, Magda Kosmal, Anna Kuśnierz
Abstract:
One of the main problems of the glass industry is the still-high consumption of energy needed to produce glass mass, as well as the increase in prices of fuels and raw materials. Therefore, comprehensive actions are being taken to improve the entire production process. The key element of these activities, from filling the set to receiving the finished product, is the melting process, whose task is, among others, to dissolve the components of the set, remove bubbles from the resulting melt, and obtain a chemically homogeneous glass melt. This process consumes over 90% of the total energy needed in the production process. The processes occurring in the set during its conversion have a significant impact on the further stages and speed of the melting process and, thus, on its overall effectiveness. The speed and course of the reactions depend on the chemical nature of the raw materials, the degree of their fragmentation, thermal treatment, as well as the form of the introduced set. An opportunity to minimize segregation and accelerate the conversion of glass sets may be the development of new technologies for preparing and dosing sets. The previously preferred traditional method of melting the set, based on mixing all glass raw materials together in loose form, can be replaced with a set in a thickened form; this solution avoids dust formation during filling and is already available on the market. The aim of the project was to develop a glass set in a selectively or completely densified form and to examine the influence of set processing on the melting process and the properties of the glass.
Keywords: glass, melting process, glass set, raw materials
Procedia PDF Downloads 60
10780 Large Neural Networks Learning From Scratch With Very Few Data and Without Explicit Regularization
Authors: Christoph Linse, Thomas Martinetz
Abstract:
Recent findings have shown that Neural Networks generalize even in over-parametrized regimes with zero training error. This is surprising, since it runs completely against traditional machine learning wisdom. In our empirical study, we fortify these findings in the domain of fine-grained image classification. We show that very large Convolutional Neural Networks with millions of weights do learn with only a handful of training samples and without image augmentation, explicit regularization or pretraining. We train the architectures ResNet018, ResNet101 and VGG19 on subsets of the difficult benchmark datasets Caltech101, CUB_200_2011, FGVCAircraft, Flowers102 and StanfordCars with 100 classes and more, perform a comprehensive comparative study, and draw implications for the practical application of CNNs. Finally, we show that VGG19 with 140 million weights learns to distinguish airplanes and motorbikes with up to 95% accuracy using only 20 training samples per class.
Keywords: convolutional neural networks, fine-grained image classification, generalization, image recognition, over-parameterized, small data sets
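A hedged sketch of the training setup described: a standard torchvision ResNet trained from scratch (no pretraining, no augmentation, no explicit regularization) on a tiny per-class subset. The dataset path and hyperparameters are placeholders, not the authors' exact configuration:

```python
# Sketch: train ResNet18 from scratch on ~20 samples per class, with no
# augmentation, weight decay or pretraining (hyperparameters are assumed).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
full = datasets.ImageFolder("path/to/dataset", transform=tfm)  # placeholder path

# keep only the first 20 images of each class
per_class, keep = {}, []
for idx, (_, label) in enumerate(full.samples):
    if per_class.get(label, 0) < 20:
        per_class[label] = per_class.get(label, 0) + 1
        keep.append(idx)
subset = torch.utils.data.Subset(full, keep)
loader = torch.utils.data.DataLoader(subset, batch_size=16, shuffle=True)

model = models.resnet18(weights=None, num_classes=len(full.classes))
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)  # no weight decay
loss_fn = nn.CrossEntropyLoss()

for epoch in range(100):
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
```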
Procedia PDF Downloads 88
10779 Importance of Developing a Decision Support System for Diagnosis of Glaucoma
Authors: Murat Durucu
Abstract:
Glaucoma is a condition of irreversible blindness; early diagnosis and appropriate interventions can enable patients to retain vision for a longer time. This study addresses the importance of developing a decision support system for glaucoma diagnosis. Glaucoma occurs when pressure builds up around the eye, causing damage to the optic nerve and deterioration of vision. The disease has different levels of severity, up to blindness. Diagnosis at an early stage allows a chance for therapies that slow the progression of the disease. In recent years, imaging technologies such as Heidelberg Retinal Tomography (HRT), Stereoscopic Disc Photography (SDP) and Optical Coherence Tomography (OCT) have been used for the diagnosis of glaucoma. Owing to its better accuracy and faster imaging, OCT has become the most common method used by experts. Despite the precision and quickness of OCT and HRT imaging, difficulties remain and mistakes still occur in the diagnosis of glaucoma, especially in the early stages. It is difficult for doctors to obtain objective results during diagnosis and staging. It therefore seems very important to develop an objective decision support system for diagnosing and grading glaucoma disease in patients. By using OCT images and pattern recognition systems, it is possible to develop a support system to help doctors make their decisions on glaucoma. Thus, in this study, we develop an evaluation and support system for the use of doctors. Pattern-recognition-based computer software would help doctors make an objective evaluation of their patients. It is intended that, after the development and evaluation processes of the software, the system will serve doctors in different hospitals.
Keywords: decision support system, glaucoma, image processing, pattern recognition
Procedia PDF Downloads 302
10778 Ways for University to Conduct Research Evaluation: Based on National Research University Higher School of Economics Example
Authors: Svetlana Petrikova, Alexander Yu Kostinskiy
Abstract:
Management of research evaluation at the Higher School of Economics (HSE) originates from the HSE Academic Fund, created in 2004 to facilitate and support academic research and present its results to the international academic community. As a means to inspire applicants, science projects went through a competitive selection process evaluated by a group of experts. The drastic development of HSE, the quantity of projects applying for each Academic Fund competition, and the need to coordinate the conduct of expert evaluation resulted in the founding of the Office for Research Evaluation in 2013. The Office's primary objective is the management of the research evaluation of science projects. The standards for conducting the evaluation are defined as follows: the exercise of the process approach and the unification of the department's functioning; the uniformity of the regulatory, organizational and methodological framework; the development of a proper online evaluation system; the broad involvement of external Russian and international experts and the renouncement of the use of the university's own employees; the development of an algorithm to match experts to science projects; the methodical use of open/closed international and Russian databases to extend the expert database; and the transparency of evaluation results, i.e. free access to assessments while keeping experts' confidentiality. The management of the research evaluation of projects is based on a single standard, organization and financing. The standard way of conducting research evaluation at HSE is based upon the Regulations on Basic Principles for Research Evaluation at HSE. These Regulations have been developed since the establishment of the Office for Research Evaluation and are based on conventional corporate standards for regulatory document management. The management system of research evaluation is implemented on the basis of the process approach. The process approach means deploying work as a process, that is, an aggregation of interrelated and interacting activities processing inputs into outputs. The inputs are, firstly, the client asking for the assessment to be conducted and defining the conditions for organizing and carrying out the assessment and, secondly, the applicant with a proper application for the competition; the output is the assessment given to the client. In exercising the process approach, the main parties or subjects of the assessment are determined and the way they interact takes shape. The parties to expert assessment are: the Ordering Party, the department of the university taking the decision to subject a project to expert assessment; the Providing Party, the department of the university authorized by the Ordering Party to provide such assessment; and the Performing Party, the legal and natural entities that have expertise in the area of research evaluation. Experts assess projects in accordance with criteria and forms of expert opinion approved by the Ordering Party. The objects of assessment are generally applications or HSE competition project reports. Assessments are mainly deployed for internal needs, i.e. most ordering parties are HSE branches and departments, but assessment can also be conducted for external clients. The financing of research evaluation at HSE is based on the established corporate culture and traditions of HSE.
Keywords: expert assessment, management of research evaluation, process approach, research evaluation
Procedia PDF Downloads 253
10777 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement
Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova
Abstract:
One of the essential steps of avian song processing is signal filtering. Currently, the standard methods of filtering are the Mel filter bank or a linear filter distribution. In this article, a new type of filter bank called the Bird-Adapted Filter is introduced, whereby the signal filtering is modifiable based upon a new mathematical description of audiograms for a particular bird species or order, which was named the Avian Audiogram Unified Equation. With this method, filters may be deliberately distributed by frequency: the filters are more concentrated in bands of higher sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) value for the linear filter bank was 16.23%, for the Mel filter bank 18.71%, while the Bird-Adapted Filter gave 14.29%, and the Bird-Adapted Filter with 1/3 modification gave 12.95%. This approach would be useful in practical automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtering is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank
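One plausible reading of the Bird-Adapted Filter construction, placing filter centre frequencies so that each band covers an equal share of the integrated audiogram sensitivity, can be sketched as follows. The Avian Audiogram Unified Equation itself is not given in the abstract, so a toy sensitivity curve stands in:

```python
# Sketch: distribute filter-bank centre frequencies according to an audiogram,
# so filters concentrate where hearing sensitivity is highest. The sensitivity
# curve below is a toy stand-in for the Avian Audiogram Unified Equation.
import numpy as np

def audiogram_warped_centres(freqs, sensitivity, n_filters):
    """Place centres at equal quantiles of cumulative sensitivity."""
    cdf = np.cumsum(sensitivity)
    cdf = cdf / cdf[-1]
    quantiles = (np.arange(n_filters) + 0.5) / n_filters
    return np.interp(quantiles, cdf, freqs)

freqs = np.linspace(100, 10_000, 1000)              # Hz
sens = np.exp(-((freqs - 4000) / 2000) ** 2)        # toy peak near 4 kHz
print(audiogram_warped_centres(freqs, sens, 12))    # centres cluster near the peak
```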
Procedia PDF Downloads 387
10776 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection
Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye
Abstract:
Text line segmentation is an important step in document image processing. It represents a labeling process that assigns the same label, using a distance-metric probability, to spatially aligned units. Text line detection techniques have been implemented successfully mainly in printed documents. However, processing handwritten text, especially in unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed, the spaces between text lines may not be obvious, and matters are complicated by the nature of handwriting and the overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation represents a leading challenge in handwritten document image processing. Text line detection methods that rely on the traditional global projection profile of the text document cannot efficiently confront the problem of variable skew angles between different text lines; hence, the formulation of a horizontal line as a separator is often not efficient. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image across its width into vertical strips of about 5% each. At each 5% vertical strip, the histogram of horizontal runs is projected. We have worked on the assumption that text lines appearing in a single strip are almost parallel to each other. The algorithm provides a sliding window through the first vertical strip on the left side of the page and runs through it to identify the new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, and the ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing handwritten connected component by associating it with either the line above or below. The decision to associate such a connected component is made based on the probability obtained from a distance metric. The technique outperforms the global projection profile for text line segmentation and is robust enough to handle skewed documents and those with lines running into each other.
Keywords: connected-component, projection-profile, segmentation, text-line
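The strip-wise projection-profile idea described above can be sketched compactly; this is an illustrative reconstruction with assumed thresholds, not the authors' code:

```python
# Sketch of strip-wise projection-profile line detection: split the page into
# ~5%-wide vertical strips, project ink counts horizontally in each strip, and
# take profile valleys as candidate text-line separators.
import numpy as np

def strip_valleys(binary_img: np.ndarray, n_strips: int = 20):
    h, w = binary_img.shape            # binary_img: 1 = ink, 0 = background
    strip_w = max(w // n_strips, 1)
    separators = []                    # per-strip lists of valley row indices
    for s in range(n_strips):
        strip = binary_img[:, s * strip_w:(s + 1) * strip_w]
        profile = strip.sum(axis=1)    # horizontal projection of this strip
        thresh = 0.05 * profile.max() if profile.max() > 0 else 0
        valleys = [y for y in range(1, h - 1)
                   if profile[y] <= thresh
                   and profile[y] <= profile[y - 1]
                   and profile[y] <= profile[y + 1]]
        separators.append(valleys)
    return separators  # valleys are then chained strip-to-strip into skewed lines
```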
Procedia PDF Downloads 124
10775 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. In order to achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled with a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI for employment in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. Results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, ensemble empirical mode decomposition, multi-station modeling
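A hedged sketch of the pipeline's feature-selection step: decompose, then keep only sub-series that share high mutual information with the target discharge. PyEMD and scikit-learn stand in for the authors' implementations, and an RBF-kernel SVR stands in for LSSVM:

```python
# Sketch of the hybrid pipeline: EEMD decomposition -> mutual-information
# screening of sub-series -> support-vector regression. The data is a toy series.
import numpy as np
from PyEMD import EEMD
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(1)
discharge = np.sin(np.linspace(0, 20, 500)) + 0.3 * rng.normal(size=500)

imfs = EEMD().eemd(discharge)                 # sub-series (IMFs) from EEMD
X = imfs.T                                    # one feature column per IMF
y = np.roll(discharge, -1)                    # predict next-step discharge

mi = mutual_info_regression(X[:-1], y[:-1])   # relevance of each sub-series
selected = X[:, mi > np.median(mi)]           # drop low-information sub-series

model = SVR(kernel="rbf", C=10.0).fit(selected[:-1], y[:-1])
```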
Procedia PDF Downloads 175
10774 Cyclic Response of Reinforced Concrete Beam-Column Joint Strengthening by FRP
Authors: N. Attari, S. Amziane, M. Chemrouk
Abstract:
A large number of old buildings have been identified as having potentially critical detailing for resisting earthquakes. The main reinforcement of lap-spliced columns just above the joint region, discontinuous bottom beam reinforcement, and little or no joint transverse reinforcement are the most critical details of interior beam-column joints in such buildings. This structural type constitutes a large share of the building stock, both in developed and developing countries, and hence represents a substantial exposure. Direct observation of damaged structures following the Algiers 2003 earthquake has shown that damage usually occurs at the beam-column joints, with failure in bending or shear, depending on geometry and reinforcement distribution and type. While substantial literature exists on the design of concrete frame joints to withstand this type of failure, many structures were classified as slightly damaged after the earthquake and, since it is uneconomic to replace them, at least in the short term, suitable means of repairing the beam-column joint area are being studied. Furthermore, there exists a large number of buildings that need retrofitting of the joints before the next earthquake. The paper reports the results of an experimental programme consisting of three reinforced concrete beam-column joints at a scale of one to three (1/3), tested under the effect of a pre-stressing axial load acting on the column. The beams were subjected at their ends to alternate cyclic loading under displacement control to simulate a seismic action. Strain and cracking fields were monitored with the help of a digital recording camera. Following the analysis of the results, a comparison can be made between the performances of the different strengthening solutions considered, in terms of ductility, strength and mode of failure.
Keywords: fibre reinforced polymers, joints, reinforced concrete, beam columns
Procedia PDF Downloads 417
10773 Supplier Relationship Management and Selection Strategies: A Literature Review
Authors: Priyesh Kumar Singh, S. K. Sharma, Sanjay Verma, C. Samuel
Abstract:
Supplier Relationship Management (SRM) is the strategic planning and managing of all interactions with suppliers to maximize their value. Its applications vary from construction industries to healthcare systems, and from investment banks to aviation industries. Several buyer-supplier relationship models, as well as supplier selection and evaluation strategies, have been documented by many academicians and researchers. In this paper, through a comprehensive literature review of over 30 published papers, different theoretical models, empirical data and conclusions relating to SRM were analysed to find its role in establishing better supplier relationships. The journal articles were searched using the keyword "supplier relationship management" in the Mendeley Library, ProQuest, EBSCO and Google Scholar databases. This paper reviews the academic literature on different relationship models and supplier evaluation and selection strategies to discuss their implications in different situations. It also describes the dominant factors responsible for buyer-supplier relationships, such as trust and power. Finally, conclusions have been drawn which can be validated by various researchers and can help practitioners in industry.
Keywords: supplier relationship management, supplier performance, supplier evaluation, supplier selection strategies
Procedia PDF Downloads 280
10772 Roasting Degree of Cocoa Beans by Artificial Neural Network (ANN) Based Electronic Nose System and Gas Chromatography (GC)
Authors: Juzhong Tan, William Kerr
Abstract:
Roasting is one critical procedure in chocolate processing, in which special flavors are developed, moisture content is decreased, and better processing properties are developed. Therefore, determination of the roasting degree of cocoa beans is important for chocolate manufacturers to ensure the quality of chocolate products, and it also decides the commercial value of cocoa beans collected from cocoa farmers. Determining the roasting degree of cocoa beans currently relies on human specialists, who are sometimes biased, and on chemical analysis, which takes a long time and is inaccessible to many manufacturers and farmers. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was used to detect the gas generated by cocoa beans of different roasting degrees (0 min, 20 min, 30 min, and 40 min), and the signals collected by the gas sensors were used to train a three-layer ANN. Chemical analysis of the graded beans was performed with a traditional GC-MS system, and the contents of volatile chemical compounds were used to train another ANN as a reference to the ANN trained on electronic nose signals. Both trained ANNs were used to predict the roasting degree of cocoa beans for validation. The best grading accuracy achieved by the ANN trained on electronic nose signals (using signals from TGS 813, 826, 820, 880, 830, 2620, 2602 and 2610) turned out to be 96.7%, whereas the GC-trained ANN achieved an accuracy of 83.8%.
Keywords: artificial neural network, cocoa bean, electronic nose, roasting
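The three-layer (input-hidden-output) ANN classifier described can be sketched with a single-hidden-layer perceptron; the sensor readings, layer size and class labels here are hypothetical:

```python
# Sketch of a three-layer ANN mapping gas-sensor readings to a roasting degree.
# Data dimensions and hidden-layer size are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))        # 8 TGS sensor channels per sample
y = rng.integers(0, 4, size=200)     # roast classes: 0, 20, 30, 40 min

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("grading accuracy:", ann.score(X_te, y_te))
```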
Procedia PDF Downloads 234
10771 V0 Physics at LHCb. RIVET Analysis Module for Z Boson Decay to Di-Electron
Authors: A. E. Dumitriu
Abstract:
The LHCb experiment is situated at one of the four points around CERN's Large Hadron Collider. It is a single-arm forward spectrometer covering 10 mrad to 300 (250) mrad in the bending (non-bending) plane, designed primarily to study particles containing b and c quarks. Each of LHCb's sub-detectors specializes in measuring a different characteristic of the particles produced by colliding protons; its significant detection characteristics include a high-precision tracking system and two ring-imaging Cherenkov detectors for particle identification. The two major topics I am currently concerned with are: the RIVET project (Robust Independent Validation of Experiment and Theory), an efficient and portable toolkit of C++ class libraries useful for the validation and tuning of Monte Carlo (MC) event generator models, providing a large collection of standard experimental analyses useful for High Energy Physics MC generator development, validation, tuning and regression testing; and V0 analysis of 2013 LHCb NoBias-type data (trigger on bunch + bunch crossing) at √s = 2.76 TeV.
Keywords: LHCb physics, RIVET plug-in, RIVET, CERN
Procedia PDF Downloads 428
10770 Comparison of Bone Mineral Density of Lumbar Spines between High Level Cyclists and Sedentary
Authors: Mohammad Shabani
Abstract:
Physical activities, depending on the nature of the mechanical stresses they induce on bone, have sometimes brought about different results. The purpose of this study was to compare the bone mineral density (BMD) of the lumbar spine between high-level cyclists and sedentary subjects. Materials and Methods: In the present study, 73 senior cyclists (age: 25.81 ± 4.35 years; height: 179.66 ± 6.31 cm; weight: 71.55 ± 6.31 kg) and 32 sedentary subjects (age: 28.28 ± 4.52 years; height: 176.56 ± 6.2 cm; weight: 74.47 ± 8.35 kg) participated voluntarily. All cyclists belonged to different teams from the International Cycling Union and had trained competitively for 10 years. The BMD of the lumbar spine of the subjects was measured using DXA X-ray (Lunar). Descriptive statistics calculations were performed using data processing software (Statview 5, SAS Institute Inc., USA). The comparison of the two independent distributions (BMD of high-level cyclists and sedentary subjects) was made using the standard Student t-test. A probability of 0.05 (p ≤ 0.05) was adopted as the significance level. Results: The results of this study showed that the BMD values of the lumbar spine of sedentary subjects were significantly higher for all measured segments. Conclusion and Discussion: Cycling is, firstly, a common sport and, secondly, an endurance sport. It is now accepted that weight-bearing exercises have an osteogenic effect compared to non-weight-bearing exercises. Thus, endurance sports such as cycling, compared to activities imposing intense force in a short time, seem not to be truly osteogenic. Therefore, it can be concluded that cycling provides a low osteogenic stimulus because of the specific biomechanical forces of the sport and its lack of impact.
Keywords: BMD, lumbar spine, high level cyclist, cycling
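The group comparison reduces to an independent two-sample t-test; a minimal sketch with placeholder BMD values, not the study's data:

```python
# Minimal sketch of the independent-samples Student t-test used to compare
# lumbar-spine BMD between cyclists and sedentary subjects (values are fake).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
bmd_cyclists = rng.normal(1.05, 0.12, size=73)   # g/cm^2, hypothetical
bmd_sedentary = rng.normal(1.15, 0.12, size=32)

t, p = stats.ttest_ind(bmd_cyclists, bmd_sedentary)
print(f"t = {t:.2f}, p = {p:.4f}, significant at 0.05: {p <= 0.05}")
```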
Procedia PDF Downloads 269
10769 Using Water Erosion Prediction Project Simulation Model for Studying Some Soil Properties in Egypt
Authors: H. A. Mansour
Abstract:
The objective of this research work is to study water use prediction and prediction technology for water use by action agencies and others involved in conservation, planning, and environmental assessment, using the Water Erosion Prediction Project (WEPP) simulation model. The model captures the important physical processes governing erosion in Egypt (climate, infiltration, runoff, evapotranspiration, detachment by raindrops, detachment by flowing water, deposition, etc.) and supports simulation of non-uniform slopes, soils, and cropping/management, together with Egyptian databases for climate, soils, and crops. The study included parameters important under Egyptian conditions, as follows: water balance and percolation, the soil component (tillage impacts), plant growth and residue decomposition, and overland flow hydraulics. It could be concluded that the WEPP simulation model can be adapted to determine the aforementioned important parameters under Egyptian conditions.
Keywords: WEPP, adaptation, soil properties, tillage impacts, water balance, soil percolation
Procedia PDF Downloads 297
10768 Enterprise Information Portal Features: Results of Content Analysis Literature Review
Authors: Michal Krčál
Abstract:
Since their introduction in the 1990s, Enterprise Information Portals (EIPs) have been investigated from different perspectives (e.g. project management, technology acceptance, IS success). However, no systematic literature review has been produced to systematize both the research efforts and the technology itself. This paper reports the first results of an extensive systematic literature review study focused on research into EIPs and its categorization; specifically, it reports a conceptual model of EIP features. The previous attempt to categorize EIP features was published in 2002. For the purpose of the literature review, the content of 89 articles was analyzed in order to identify and categorize features of EIPs. The methodology of the literature review was as follows. Firstly, search queries in major indexing databases (Web of Science and SCOPUS) were used. The results of the queries were analyzed according to their usability for the goal of the study. Then, full texts were coded in Atlas.ti according to a previously established coding scheme. The codes were categorized, and the conceptual model of EIP features was created.
Keywords: enterprise information portal, content analysis, features, systematic literature review
Procedia PDF Downloads 298
10767 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept, and it processes continuous GBSAR images unit by unit. Images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected as it keeps temporarily-coherent pixels which are present only in certain units but not in the whole observation period. The chain supports real-time processing of continuous data, and the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporally-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
Procedia PDF Downloads 161
10766 Hidden Stones When Implementing Artificial Intelligence Solutions in the Engineering, Procurement, and Construction Industry
Authors: Rimma Dzhusupova, Jan Bosch, Helena Holmström Olsson
Abstract:
Artificial Intelligence (AI) in the Engineering, Procurement, and Construction (EPC) industry does not yet have a proven track record in large-scale projects. Since AI solutions for industrial applications became available only recently, deployment experience and lessons learned are still to be built up. Nevertheless, AI has become an attractive technology for organizations looking to automate repetitive tasks to reduce manual work. Meanwhile, the current AI market has started offering various solutions and services. The contribution of this research is that we explore in detail the challenges and obstacles faced in developing and deploying AI in large-scale projects in the EPC industry, based on real-life use cases performed in an EPC company. The identified challenges are not linked to a specific technology or a company's know-how and are, therefore, universal. The findings in this paper aim to provide feedback to academia to reduce the gap between research and practical experience. They also help reveal the hidden stones when implementing AI solutions in the industry.
Keywords: artificial intelligence, machine learning, deep learning, innovation, engineering, procurement and construction industry, AI in the EPC industry
Procedia PDF Downloads 119