Search results for: features engineering methods for forecasting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6280

1420 Creating a Profound Sense of Comfort to Stimulate Workers’ Innovation and Productivity: Exploring Research and Case Study Applications

Authors: Rana Bazaid, Debajyoti Pati

Abstract:

Purpose: The aim of this research is to explore and discuss innovation-workspaces and how the design of the workspace has the potential to boost the work process and encourage employees’ satisfaction, leading to inventive and creative results. Background: The relationship between workers and the work environment has a strong potential to enhance work outcomes when optimized for work goals. An innovation-oriented work environment can benefit employees’ satisfaction, health, and performance. To understand this complex relationship, this research explores innovation-work environments. Methods: A review of 26 peer-reviewed articles, seven books, and 23 companies’ websites was conducted; in addition, five case studies were analyzed to deduce appropriate examples for the study. Results: The research found that all five successful innovation environments focused on two aspects: first, workers’ satisfaction and comfort, including physical, functional, and psychological comfort; second, all five centers were diverse work environments that addressed workers’ needs and were designed for individuals and teamwork, for workers’ freedom, and for increased interaction. Conclusion: Understanding individuals’ needs and creating work environments that enhance interaction between workers and with the space are key aspects of successful innovation-work environments.

Keywords: Innovation-workspace, productivity, work environment, workers’ satisfaction.

1419 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — In the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induct if-then rules from the decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for future development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducting true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity of rule induction from datasets with contaminated attribute values created by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical size of dataset derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
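
One simple reading of the statistical-test idea behind rule induction is to ask whether a candidate if-then rule covers its decision class significantly more often than the base rate. The Python sketch below illustrates that check with a binomial test on a toy decision table; the table, the choice of test and the function names are illustrative assumptions, not the authors' exact STRIM procedure.

    # A minimal, hedged sketch of testing one candidate rule for statistical
    # significance; not the authors' exact STRIM procedure.
    from scipy.stats import binomtest

    # Toy decision table: (condition attribute value, decision class).
    table = [("a1", "d1"), ("a1", "d1"), ("a1", "d2"), ("a2", "d2"),
             ("a2", "d2"), ("a1", "d1"), ("a2", "d1"), ("a1", "d1")]

    def rule_p_value(table, cond, decision):
        """P-value that 'if C = cond then D = decision' beats the base rate of the decision class."""
        base = sum(d == decision for _, d in table) / len(table)
        covered = [d for c, d in table if c == cond]
        hits = sum(d == decision for d in covered)
        return binomtest(hits, len(covered), base, alternative="greater").pvalue

    # A rule would be adopted only if this p-value is small; this toy table is
    # far below any realistic critical dataset size.
    print(rule_p_value(table, "a1", "d1"))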

Keywords: Rule induction, decision table, missing data, noise.

1418 Assessment of the Efficiency of Virtual Orthodontic Consultations during COVID-19

Authors: R. Litt, A. Brown

Abstract:

Aims: We aimed to assess the efficiency of ‘Attend Anywhere’ orthodontic clinics within a district general hospital during COVID-19. Our secondary aim was to pilot a questionnaire to assess patient satisfaction with virtual orthodontic appointments. Design: The study design is a service evaluation including a pilot questionnaire. Methods: The average number of patients seen per virtual clinic and the number of patients failing to attend were compared to face-to-face clinics. The capability of virtual appointments to prevent the need for a face-to-face appointment was assessed. Patients were invited to complete a telephone pilot questionnaire focusing on patient satisfaction and accessibility. Results: There was a small increase in the number of patients failing to attend virtual appointments, with a third of the patients who did not attend failing to receive the appointment link. 81.9% of virtual clinic appointments were successful and prevented the need for a face-to-face appointment. Overall, patients were very satisfied with their virtual orthodontic appointment and the majority required no assistance to access the service. Conclusions: The use of ‘Attend Anywhere’ clinics in orthodontics offers patients and clinicians an effective and efficient alternative to face-to-face appointments that patients on average find easy to use and completely satisfactory.

Keywords: Clinics, COVID-19, orthodontics, patient satisfaction, virtual.

1417 A Test Methodology to Measure the Open-Loop Voltage Gain of an Operational Amplifier

Authors: Maninder Kaur Gill, Alpana Agarwal

Abstract:

It is practically not feasible to measure the open-loop voltage gain of an operational amplifier in the open-loop configuration, because the open-loop voltage gain of the operational amplifier is very large. In order to avoid saturation of the output voltage, a very small input must be given to the operational amplifier, which cannot be measured practically with a digital multimeter. A test circuit for measurement of the open-loop voltage gain of an operational amplifier has been proposed and verified using simulation tools as well as by experimental methods on a breadboard. The main advantage of this test circuit is that it is simple, fast, accurate, cost effective, and easy to handle even on a breadboard. The test circuit requires only the device under test (DUT) along with resistors. This circuit has been tested for measurement of the open-loop voltage gain of different operational amplifiers. The underlying goal is to design testable circuits for various analog devices that are simple to realize in VLSI systems, give accurate results, and do not change the characteristics of the original system. The DUTs used are the LM741CN and UA741CP. For the LM741CN, the simulated gain and experimentally measured gain (average) are 89.71 dB and 87.71 dB, respectively. For the UA741CP, the simulated gain and experimentally measured gain (average) are 101.15 dB and 105.15 dB, respectively. These values are found to be close to the datasheet values.
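
As a quick illustration of the decibel figures quoted above, the sketch below converts a measured output/input voltage ratio into open-loop gain in dB. The voltage values are illustrative placeholders, not measurements from the paper.

    # Minimal sketch: open-loop gain in dB from an output/input voltage pair.
    import math

    def open_loop_gain_db(v_out, v_in):
        """Return the voltage gain 20*log10(Vout/Vin) in decibels."""
        return 20.0 * math.log10(v_out / v_in)

    # Illustrative values: a 2.43 V output swing for a 100 uV differential input
    # corresponds to about 87.7 dB, the order of magnitude reported for the LM741CN.
    print(round(open_loop_gain_db(2.43, 100e-6), 2))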

Keywords: Device under test, open-loop voltage gain, operational amplifier, test circuit.

1416 Comparison of Different Advanced Oxidation Processes for Degrading 4-Chlorophenol

Authors: M.D. Murcia, M. Gomez, E. Gomez, J.L. Gomez, N. Christofi

Abstract:

The removal efficiency of 4-chlorophenol with different advanced oxidation processes has been studied. Oxidation experiments were carried out using two 4-chlorophenol concentrations, 100 mg L-1 and 250 mg L-1, with UV generated from a KrCl excilamp with (molar ratio H2O2:4-chlorophenol = 25:1) and without H2O2, and with the Fenton process (molar ratio H2O2:4-chlorophenol of 25:1 and Fe2+ concentration of 5 mg L-1). The results show that there is no significant difference in the 4-chlorophenol conversion when using any of the three assayed methods. However, significant concentrations of the photoproducts still remained in the media when the chosen treatment involved UV without hydrogen peroxide. The Fenton process removed all the intermediate photoproducts except for hydroquinone and 1,2,4-trihydroxybenzene. In the case of UV and hydrogen peroxide, all the intermediate photoproducts were removed. Microbial bioassays were carried out utilising the naturally luminescent bacterium Vibrio fischeri and a genetically modified Pseudomonas putida isolated from a waste treatment plant receiving phenolic waste. The results using V. fischeri show that, for samples after degradation, only the UV treatment showed toxicity (IC50 = 38), whereas with H2O2 and Fenton reactions the samples exhibited no toxicity after treatment in the range of concentrations studied. Using the Pseudomonas putida biosensor, no toxicity could be detected for any of the samples following treatment, due to the higher tolerance of the organism to the phenol concentrations encountered.

Keywords: 4-chlorophenol, Fenton, photodegradation, UV, excilamp.

1415 Effect of Cooling Rate on Base Metals Recovery from Copper Matte Smelting Slags

Authors: N. Tshiongo, R. K. K. Mbaya, K. Maweja, L. C. Tshabalala

Abstract:

A slag sample from a copper smelting operation in a water jacket furnace at a DRC plant was used. The study intends to determine the effect of cooling on the extraction of base metals. The cooling methods investigated were water quenching, air cooling and furnace cooling, and they were compared to the original as-received slag. It was observed that the cooling rate of the slag affected the leaching of base metals, as it changed the phase distribution in the slag and the distribution of base metals within the phases. It was also found that fast cooling of the slag prevented crystallization and produced an amorphous phase that encloses the base metals. The amorphous slags from the slag dumps were more leachable in acidic medium (HNO3), which leached 46% Cu, 95% Co, 85% Zn, 92% Pb and 79% Fe with no selectivity at pH 0, than in basic medium (NH4OH). The behaviour was reversed for the slags modified by quenching in water, which leached 89% Cu with high selectivity, as metal extractions were less than 1% for Co, Zn, Pb and Fe at ambient temperature and pH 12. For the crystallized slags, leaching of base metals increased with temperature from ambient temperature to 60°C and decreased at the higher temperature of 80°C due to evaporation of the ammonia solution used for basic leaching; the total amounts of base metals leached from the slow-cooled slags were very low compared to the quenched slag samples.

Keywords: Copper slag, leaching, amorphous, cooling rate.

1414 The Optimal Placement of Capacitor in Order to Reduce Losses and the Profile of Distribution Network Voltage with GA, SA

Authors: Limouzade E., Joorabian M.

Abstract:

Most of the losses in a power system relate to the distribution sector, which has always been a concern. One of the important factors contributing to increased losses in the distribution system is the existence of reactive power flows. The most common way to compensate the reactive power in the system is to use parallel (shunt) capacitors. In addition to reducing losses, the advantages of capacitor placement are the release of network capacity at peak load and improvement of the voltage profile. The point which should be considered in capacitor placement is the optimal location and sizing of the capacitors in order to maximize the advantages of capacitor placement. In this paper, a new technique is offered for the placement and sizing of fixed capacitors in the radial distribution network on the basis of the Genetic Algorithm (GA). The existing optimal methods for capacitor placement are mostly those which reduce the losses and improve the voltage profile simultaneously, but the capacitor cost and load changes have not been considered as influences on the objective function. In this article, a holistic approach is taken for the optimal solution to this problem which includes all the parameters in the distribution network: the price of the phase voltage and load changes. A vast search over all possible solutions is therefore required, and in this article we use the Genetic Algorithm (GA) as a powerful method for this optimal search.
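
To make the GA idea above concrete, here is a minimal sketch of how capacitor sizes at candidate buses might be encoded as a chromosome and evolved against a simplified objective. The loss proxy, bus data, cost weight and all parameter values are illustrative assumptions, not the formulation used by the authors.

    # Hedged sketch: a toy genetic algorithm for choosing capacitor sizes at the
    # buses of a radial feeder. The loss model and all numbers are placeholders.
    import random

    BUSES = 8                        # candidate buses
    SIZES = [0, 150, 300, 450]       # available capacitor sizes in kvar (0 = none)
    Q_DEMAND = [120, 90, 200, 60, 140, 80, 160, 110]   # reactive load per bus (kvar)
    KVAR_COST = 0.5                  # cost weight per kvar installed

    def fitness(chromosome):
        """Lower is better: residual reactive flow squared plus capacitor cost."""
        loss_proxy = sum((q - c) ** 2 for q, c in zip(Q_DEMAND, chromosome))
        return loss_proxy + KVAR_COST * sum(chromosome)

    def evolve(pop_size=40, generations=200, mutation=0.1):
        pop = [[random.choice(SIZES) for _ in range(BUSES)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            parents = pop[: pop_size // 2]               # keep the fitter half
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, BUSES)
                child = a[:cut] + b[cut:]                # one-point crossover
                if random.random() < mutation:
                    child[random.randrange(BUSES)] = random.choice(SIZES)
                children.append(child)
            pop = parents + children
        return min(pop, key=fitness)

    print("capacitor sizes per bus (kvar):", evolve())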

Keywords: Genetic Algorithm (GA), capacitor placement, voltage profile, network losses, Simulating Annealing (SA), distribution network.

1413 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite in lung CT image analysis. Most of the conventional methods need post-processing to deal with abnormal lung CT scans, such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of the two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor transformations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on patch-based similarity measurement instead of the simple intensity measurement in the standard method. The weights between each pixel and its neighboring pixels are based on the obtained new term. The graph is then created using these weights between its nodes. Finally, the segmentation is completed with the minimum-cut/max-flow algorithm. Experimental results show that the proposed method is very accurate and efficient, and can directly provide explicit lung regions without any post-processing operations, compared to the standard method.
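
The sketch below illustrates the kind of patch-based boundary weight the abstract describes: instead of comparing two single pixel intensities, it compares the patches around two neighboring pixels and maps the patch distance to a weight in [0, 1]. Patch radius, sigma and the synthetic test image are illustrative choices, not values from the paper.

    # Hedged sketch: patch-based similarity weight between neighbouring pixels.
    import numpy as np

    def patch(img, y, x, r=2):
        """Return the (2r+1)x(2r+1) patch centred on (y, x), clipped at borders."""
        return img[max(y - r, 0): y + r + 1, max(x - r, 0): x + r + 1]

    def boundary_weight(img, p, q, sigma=10.0, r=2):
        """Weight for the edge between pixels p=(y,x) and q=(y,x) from patch distance."""
        a, b = patch(img, *p, r), patch(img, *q, r)
        h, w = min(a.shape[0], b.shape[0]), min(a.shape[1], b.shape[1])
        d = np.mean((a[:h, :w].astype(float) - b[:h, :w].astype(float)) ** 2)
        return np.exp(-d / (2.0 * sigma ** 2))

    # Synthetic example: neighbours inside a homogeneous region get a weight near 1,
    # neighbours across an intensity edge get a weight near 0.
    img = np.zeros((20, 20), dtype=np.uint8)
    img[:, 10:] = 200
    print(boundary_weight(img, (5, 4), (5, 5)))    # same region  -> close to 1
    print(boundary_weight(img, (5, 9), (5, 10)))   # across edge  -> close to 0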

Keywords: Graph cuts, lung CT scan, lung parenchyma segmentation, patch based similarity metric.

1412 Armed Groups and Intra State Conflict: A Study on the Egyptian Case

Authors: Ghzlan Mahmoud Abdel Aziz

Abstract:

This case study aims to identify the intrastate conflicts between the nation state and armed groups. Nowadays, most wars weaken states against armed groups. Thus, it is very important to negotiate with such groups in order to reinforce the law for the protection of victims. These armed groups are the cause of conflicts and are related to many of the humanitarian issues that result from conflicts. In this age of rivalry, terrorists, insurgents, or transnational criminal parties have surfaced as a reaction to these armed groups in an effort to set up a new world order. Moreover, intrastate conflicts have become more treacherous than interstate conflicts, particularly when nation state systems deal with armed groups which try to influence the state. The unexpected uprising of the Arab Spring during 2011 in parts of the Middle East and North Africa formed various patterns of conflicts. The events of the Arab Spring resulted in current and long-term change across the region. Significant modifications in the level, strength and duration of armed conflict around the world have occurred. Egypt was at the center of these events; it has fought back against the armed groups under the name of terrorism, which spread disorder and violence among civilians. On this note, this study focuses on the problem of the transformation in the methods of organized violence within one state rather than between two states or more, and analyzes the objectives, strategies, and internal composition of armed groups and the environments that foster them, with a focus on the Egyptian case.

Keywords: Armed groups, conflicts, Egyptian armed forces, intrastate conflicts.

1411 Fabrication of Nanoporous Template of Aluminum Oxide with High Regularity Using Hard Anodization Method

Authors: Hamed Rezazadeh, Majid Ebrahimzadeh, Mohammad Reza Zeidi Yam

Abstract:

Anodizing is an electrochemical process that converts a metal surface into a decorative, durable, corrosion-resistant, anodic oxide finish. Aluminum is ideally suited to anodizing, although other nonferrous metals, such as magnesium and titanium, can also be anodized. The anodic oxide structure originates from the aluminum substrate and is composed entirely of aluminum oxide. This aluminum oxide is not applied to the surface like paint or plating, but is fully integrated with the underlying aluminum substrate, so it cannot chip or peel. It has a highly ordered, porous structure that allows for secondary processes such as coloring and sealing. In this experimental paper, we focus on a reliable method for fabricating nanoporous alumina with high regularity, starting from a study of nanostructure material synthesis methods. Porous alumina was then fabricated in the laboratory by anodization of aluminum. Hard anodization processes were employed to fabricate the nanoporous alumina using 0.3 M oxalic acid and anodizing voltages of 90, 120 and 140 V. The nanoporous templates were characterized by SEM and FFT. The templates anodized at 140 V showed the highest pore order. The pore formation, the influence of the experimental conditions on the pore formation, the structural characteristics of the pores and the oxide chemical reactions involved in the pore growth are discussed.

Keywords: Alumina, Nanoporous Template, Anodization

1410 Delineation of Oil – Polluted Sites in Ibeno LGA, Nigeria, Using Geophysical Techniques

Authors: Ime R. Udotong, Justina I. R. Udotong, Ofonime U. M. John

Abstract:

Ibeno, Nigeria hosts the operational base of Mobil Producing Nigeria Unlimited (MPNU), a subsidiary of ExxonMobil and currently the highest oil and condensate producer in Nigeria. Besides MPNU, other oil companies operate onshore, on the continental shelf and deep offshore of the Atlantic Ocean in Ibeno, Nigeria. This study was designed to delineate oil-polluted sites in Ibeno, Nigeria using the geophysical methods of electrical resistivity (ER) and ground penetrating radar (GPR). The results obtained revealed that this environment has been contaminated with hydrocarbons by past crude oil spills, as observed from high resistivity values and from GPR profiles, which clearly show the distribution, thickness and lateral extent of hydrocarbon contamination as represented by the radargram reflector tones. Contamination was of varying degrees, ranging from slight to high, indicating substantial attenuation of crude oil contamination over time. Moreover, the relatively lower resistivities of locations outside the impacted areas compared with resistivity values within the impacted areas, together with the 3-D Cartesian images of the oil contaminant plume, depicted by red, light brown and magenta for high, low and very low oil-impacted areas, respectively, confirmed significant recent pollution of the study area with crude oil.

Keywords: Electrical resistivity, geophysical investigations, ground penetrating radar, oil-polluted sites.

1409 Applying Element Free Galerkin Method on Beam and Plate

Authors: Mahdad M’hamed, Belaidi Idir

Abstract:

This paper develops a meshless approach, called the Element Free Galerkin (EFG) method, which is based on the weak form of the partial differential governing equations and employs Moving Least Squares (MLS) interpolation to construct the meshless shape functions. The variational weak form is used in the EFG, where the trial and test functions are approximated by the MLS approximation. Since the shape functions constructed by this discretization have the weight function property based on the randomly distributed points, the essential boundary conditions can be implemented easily. The local weak form of the partial differential governing equations is obtained by the weighted residual method within a simple local quadrature domain. A spline function with high continuity is used as the weight function. The presently developed EFG method is a truly meshless method, as it does not require a mesh, either for the construction of the shape functions or for the integration of the local weak form. Several numerical examples of two-dimensional static structural analysis are presented to illustrate the performance of the present EFG method. They show that the EFG method is highly efficient in implementation and highly accurate in computation. The present method is used to analyze the static deflection of beams and of a plate with a hole.
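
As a small illustration of the MLS building block mentioned above, the sketch below evaluates a one-dimensional moving least squares fit with a cubic spline weight function, the kind of ingredient an EFG code uses to build its meshless shape functions. The node layout, support size and linear basis are illustrative choices, not the settings used in the paper.

    # Hedged sketch: 1D moving least squares (MLS) approximation with a cubic
    # spline weight; nodes, support size and basis are illustrative assumptions.
    import numpy as np

    def cubic_spline_weight(r):
        """Cubic spline weight on normalised distance r = |x - x_i| / d_support."""
        w = np.zeros_like(r)
        m1 = r <= 0.5
        m2 = (r > 0.5) & (r <= 1.0)
        w[m1] = 2.0 / 3.0 - 4.0 * r[m1] ** 2 + 4.0 * r[m1] ** 3
        w[m2] = 4.0 / 3.0 - 4.0 * r[m2] + 4.0 * r[m2] ** 2 - (4.0 / 3.0) * r[m2] ** 3
        return w

    def mls_value(x, nodes, values, support=0.3):
        """Evaluate the linear-basis MLS fit of (nodes, values) at point x."""
        w = cubic_spline_weight(np.abs(x - nodes) / support)
        P = np.column_stack([np.ones_like(nodes), nodes])     # linear basis [1, x]
        A = P.T @ (w[:, None] * P)                            # weighted moment matrix
        b = P.T @ (w * values)
        a = np.linalg.solve(A, b)
        return a[0] + a[1] * x

    nodes = np.linspace(0.0, 1.0, 11)
    values = np.sin(np.pi * nodes)                            # sampled field
    print(mls_value(0.42, nodes, values), np.sin(np.pi * 0.42))   # fit vs. exact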

Keywords: Numerical computation, element-free Galerkin, moving least squares, meshless methods.

1408 Bridging Stress Modeling of Composite Materials Reinforced by Fibers Using Discrete Element Method

Authors: Chong Wang, Kellem M. Soares, Luis E. Kosteski

Abstract:

The problem of toughening in brittle materials reinforced by fibers is complex, involving all of the mechanical properties of the fibers, the matrix and the fiber/matrix interface, as well as the geometry of the fiber. Development of new numerical methods appropriate to toughening simulation and analysis is necessary. In this work, we have performed simulations and analysis of toughening in a brittle matrix reinforced by randomly distributed fibers by means of the discrete element method. First, we put forward a mechanical model of the toughening contributed by random fibers. Then, with a numerical program, we investigated the stress, damage and bridging force in the composite material when a crack appeared in the brittle matrix. From the results obtained, we conclude that: (i) fibers of high strength and low elasticity modulus are beneficial to toughening; (ii) fibers of relatively high elastic modulus compared to the matrix may result in substantial matrix damage due to the spalling effect; (iii) employment of high-strength synthetic fibers is a good option for toughening. We expect that the combination of the discrete element method (DEM) with the finite element method (FEM) can increase the versatility and efficiency of the software developed. The present work can guide the design of high-performance ceramic composites through the optimization of the parameters.

Keywords: Bridging stress, discrete element method, fiber reinforced composites, toughening.

1407 Technology Based Learning Environment and Student Achievement in English as a Foreign Language in Pakistan

Authors: M. Athar Hussain, M. Zafar Iqbal, M. Saeed Akhtar

Abstract:

The fast-growing accessibility and capability of emerging technologies have created enormous possibilities for designing, developing and implementing innovative teaching methods in the classroom. The global technological scenario has paved the way to new pedagogies in the teaching-learning process, focusing on technology-based learning environments and their impact on student achievement. The present experimental study was conducted to determine the effectiveness of a technology-based learning environment on student achievement in English as a foreign language. The sample of the study was 90 students of 10th grade of a public school located in Islamabad. A pretest-posttest equivalent group design was used to compare the achievement of the two groups. A pretest and a posttest containing 50 items each from the English textbook were developed and administered. The collected data were statistically analyzed. The results showed that there was a significant difference between the mean scores of the experimental group and the control group. The performance of the experimental group was better on the posttest, which indicated that teaching through a technology-based learning environment enhanced the achievement level of the students. On the basis of the results, it was recommended that teaching and learning through information and communication technologies may be adopted to enhance the language learning capability of the students.
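
For readers unfamiliar with the analysis behind such a pretest-posttest group comparison, the sketch below runs an independent-samples t-test on two groups' posttest scores. The score lists are made-up placeholders, not the study's data.

    # Hedged sketch: comparing posttest means of an experimental and a control
    # group with an independent-samples t-test; the scores are illustrative.
    from scipy import stats

    experimental = [38, 41, 35, 44, 40, 37, 42, 39, 43, 36]
    control      = [31, 34, 29, 35, 33, 30, 32, 28, 34, 31]

    t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # p < 0.05 -> significant difference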

Keywords: English as a Foreign Language, Student Achievement, Technology Based Learning

1406 Overview of Multi-Chip Alternatives for 2.5D and 3D Integrated Circuit Packagings

Authors: Ching-Feng Chen, Ching-Chih Tsai

Abstract:

With the size of the transistor gradually approaching the physical limit, the persistence of Moore’s Law is challenged by issues such as the short channel effect and the difficulty of developing high numerical aperture (NA) lithography equipment. In the context of the ever-increasing technical requirements of portable devices and high-performance computing (HPC), relying on the continuation of the law to enhance chip density will no longer support the prospects of the electronics industry. Weighing a chip’s power consumption, performance, area, cost and cycle time to market (PPACC) is an updated benchmark driving the evolution of advanced nanometer (nm) wafer technology. The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through Silicon Via (TSV) technology has updated the traditional die assembly methods and provided a solution. This overview investigates up-to-date and cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on updated transistor structures and technology nodes. We conclude that multi-chip solutions for 2.5D and 3D IC packaging can prolong Moore’s Law.

Keywords: Moore’s Law, High Numerical Aperture, Power Consumption-Performance-Area-Cost-Cycle Time to Market, PPACC, 2.5 and 3D-Very-Large-Scale Integration Packaging, Through Silicon Via.

1405 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: J. Madureira, R. Lagido, I. Sousa

Abstract:

Surfing is an increasingly popular sport and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit, to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated with similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, start time and duration. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their duration can be accurately determined using the smartphone GPS and IMU.
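
A minimal sketch of the first, GPS-only approach is shown below: wave rides are flagged from a speed trace using a start threshold and a lower end threshold. The threshold values, sampling interval and the sample trace are illustrative placeholders, not values from the paper.

    # Hedged sketch: wave-ride detection from a GPS speed series via thresholds.
    def detect_rides(speeds, dt=1.0, start_thr=2.5, end_thr=1.0):
        """Return (start_time, duration) pairs from a speed series in m/s."""
        rides, riding, start = [], False, 0.0
        for i, v in enumerate(speeds):
            t = i * dt
            if not riding and v >= start_thr:      # speed rises -> ride starts
                riding, start = True, t
            elif riding and v < end_thr:           # speed drops -> ride ends
                rides.append((start, t - start))
                riding = False
        if riding:                                 # ride still ongoing at end of trace
            rides.append((start, len(speeds) * dt - start))
        return rides

    trace = [0.5, 0.8, 3.1, 4.0, 3.6, 2.8, 0.7, 0.4, 0.6, 3.3, 3.9, 0.9]
    print(detect_rides(trace))   # two detected rides with start times and durations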

Keywords: Inertial Measurement Unit (IMU), Global Positioning System (GPS), smartphone, surfing performance.

1404 Material Analysis for Temple Painting Conservation in Taiwan

Authors: Chen-Fu Wang, Lin-Ya Kung

Abstract:

For traditional painting materials, the artisan used to combine pigments with different binders to create colors. As time goes by, the materials used for painting have evolved from natural to chemical materials. The vast variety of ingredients used in chemical materials has complicated restoration work and makes conservation more difficult. Conservation work also becomes harder when the materials cannot be easily identified; therefore, it is essential to take a more scientific approach to assist in conservation work. Painting materials are high molecular weight polymers, and their analysis is very complicated; in addition, contamination such as smoke and dirt can interfere with the analysis of the material. The current methods for composition analysis of painting materials include Fourier transform infrared spectroscopy (FT-IR), mass spectrometry, Raman spectroscopy and X-ray diffraction spectroscopy (XRD), each of which has its own limitations. In this study, FT-IR was used to analyze the components of the paint coating. We took the most commonly seen materials as samples and artificially aged them. The ageing information was then used to build a database for examining the temple painting materials. By observing the FT-IR changes over time, we can tell that all of the painting materials are deteriorated by UV light, differing only in the speed of degradation. From the deterioration experiment, the acrylic resin resists better than the others. After collecting the ageing information of the painting materials by FT-IR, we performed tests on the paintings in the temples. It was found that most of the artisans used tung oil as the painting material, and some other paintings used chemical materials. This method is now working successfully for identifying the painting materials. However, the method is destructive and costly. In the future, we will work on how to identify the painting materials more efficiently.

Keywords: Temple painting, painting material, conservation, FT-IR.

1403 Methodology of Personalizing Interior Spaces in Public Libraries

Authors: Baharak Mousapour

Abstract:

Creating public spaces which are tailored to the specific demands of individuals is one of the challenges for contemporary interior designers. Improving general knowledge as well as providing a forum for all walks of life to exploit is one of the objectives of a public library. In this regard, interior design consistent with the demands of the individuals is of paramount importance. Study spaces, in particular those in close relation to the personalized sector, seem to have proven challenging, according to the literature. To address this challenge, attributes of individuals, namely the perception of people from public spaces and their interactions with the so-called spaces, should be analyzed to provide interior designers with something to work on. This paper follows the analytic-descriptive research methodology by outlining case-study libraries which have personalized public spaces, with the investigation of the type of personalization as its primary objective, and (I) recognition of the physical schedule and the know-how of the spatial connection in the indoor design of a library and (II) analysis of each personalized space in relation to other spaces of the library as its secondary objectives. The significance of the current research lies in the concept of personalization as one of the most recent methods of attracting people to libraries. Previous research exists in this regard, but the lack of data concerning personalization makes this topic worth investigating. Hence, this study aims to put forward approaches, through real case studies, for designers to deal with this concept.

Keywords: interior design, library, library design, personalization

1402 Performance Management of Tangible Assets within the Balanced Scorecard and Interactive Business Decision Tools

Authors: Raymond K. Jonkers

Abstract:

The present study investigated approaches and techniques to enhance strategic management governance and decision making within the framework of a performance-based balanced scorecard. The review of best practices from strategic, program, process, and systems engineering management provided for a holistic approach toward effective outcome-based capability management. One technique, based on factorial experimental design methods, was used to develop an empirical model. This model predicted the degree of capability effectiveness and is dependent on controlled system input variables and their weightings. These variables represent business performance measures, captured within a strategic balanced scorecard. The weighting of these measures enhances the ability to quantify causal relationships within balanced scorecard strategy maps. The focus in this study was on the performance of tangible assets within the scorecard rather than the traditional approach of assessing performance of intangible assets such as knowledge and technology. Tangible assets are represented in this study as physical systems, which may be thought of as being aboard a ship or within a production facility. The measures assigned to these systems include project funding for upgrades against demand, system certifications achieved against those required, preventive maintenance to corrective maintenance ratios, and material support personnel capacity against that required for supporting respective systems. The resultant scorecard is viewed as complementary to the traditional balanced scorecard for program and performance management. The benefits from these scorecards are realized through the quantified state of operational capabilities or outcomes. These capabilities are also weighted in terms of priority for each distinct system measure and aggregated and visualized in terms of overall state of capabilities achieved. This study proposes the use of interactive controls within the scorecard as a technique to enhance development of alternative solutions in decision making. These interactive controls include those for assigning capability priorities and for adjusting system performance measures, thus providing for what-if scenarios and options in strategic decision-making. In this holistic approach to capability management, several cross-functional processes were highlighted as relevant amongst the different management disciplines. In terms of assessing an organization’s ability to adopt this approach, consideration was given to the P3M3 management maturity model.
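
To illustrate the weighted-measure idea described above, here is a toy aggregation of the four tangible-asset measures named in the abstract into a single capability score. The weights and normalized scores are illustrative placeholders, not values from the study.

    # Hedged sketch: weighted aggregation of scorecard measures into one
    # capability-effectiveness figure; weights and scores are made up.
    measures = {
        "upgrade_funding_vs_demand":           (0.30, 0.70),  # (weight, normalized score 0..1)
        "certifications_achieved_vs_required": (0.25, 0.85),
        "preventive_to_corrective_ratio":      (0.25, 0.60),
        "support_capacity_vs_required":        (0.20, 0.90),
    }

    assert abs(sum(w for w, _ in measures.values()) - 1.0) < 1e-9   # weights sum to 1

    capability = sum(w * s for w, s in measures.values())
    print(f"overall capability effectiveness: {capability:.2f}")    # 0.75 on this toy data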

Keywords: Outcome based management, performance management, lifecycle costs, balanced scorecard.

1401 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses

Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau

Abstract:

Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.
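
For readers unfamiliar with equivalence hypothesis testing, the sketch below runs a paired two one-sided tests (TOST) check on scores from a full-length and a shortened exam. The equivalence margin, the paired design and the score lists are illustrative assumptions, not the study's actual data or procedure.

    # Hedged sketch: TOST equivalence test on paired exam scores; all numbers
    # and the +/- 5 point margin are illustrative placeholders.
    import numpy as np
    from scipy import stats

    full_exam  = np.array([72, 65, 80, 58, 74, 69, 77, 63, 71, 68], dtype=float)
    short_exam = np.array([70, 66, 79, 60, 73, 70, 75, 64, 72, 67], dtype=float)
    margin = 5.0                                   # scores within +/- 5 points count as equivalent

    diff = short_exam - full_exam                  # paired design: same students
    n = len(diff)
    se = diff.std(ddof=1) / np.sqrt(n)
    t_lower = (diff.mean() + margin) / se          # H0: mean diff <= -margin
    t_upper = (diff.mean() - margin) / se          # H0: mean diff >= +margin
    p_lower = 1.0 - stats.t.cdf(t_lower, df=n - 1)
    p_upper = stats.t.cdf(t_upper, df=n - 1)
    print(f"TOST p = {max(p_lower, p_upper):.4f}")  # below 0.05 -> scores are equivalent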

Keywords: Exam length, psychometric criteria, synthetic experimental designs, test length.

1400 Numerical Simulation for a Shallow Braced Excavation of Campus Building

Authors: Sao-Jeng Chao, Wen-Cheng Chen, Wei-Humg Lu

Abstract:

In order to avoid encountering unpredictable factors, geotechnical engineers always conduct numerical analysis for braced excavation design. Simulation work in advance can predict the response of the subsequent excavation, and the design can thus increase the safety factor of the construction. The parameters that are considered include geological conditions, soil properties, soil distributions, loading types, and the analysis and design methods. National Ilan University is located on the LanYang plain, which is mainly deposited with clayey soil and loose sand and is thus vulnerable to displacement under external influences. National Ilan University carried out a braced excavation with a complete excavation monitoring program. This study takes advantage of the one-dimensional finite element program RIDO to simulate the excavation process. The results predicted by the numerical simulation analysis are compared with the monitored results of the construction to explore the differences between them. Numerical simulation analysis of the excavation process can be used to analyze retaining structures for the purpose of understanding the relationship between the displacement and the supporting system. The resulting deformation and stress distribution from the braced excavation can then be understood in advance. Problems can be prevented prior to the construction process, and the important influencing factors can thus be identified during design and construction.

Keywords: Excavation, numerical simulation, RIDO, retaining structure.

1399 Forensic Science in Dr. Jekyll and Mr. Hyde: Trails of Utterson's Quest

Authors: Kyu-Jeoung Lee, Jae-Uk Choo

Abstract:

This paper focuses on investigating The Strange Case of Dr Jekyll and Mr Hyde from the point of view of Gabriel John Utterson, a central character in the book. Utterson is no different from a forensic investigator, as he tries to collect evidence on the mysterious Mr. Hyde’s relationship to Dr. Jekyll. From Utterson's perspective, Jekyll is the 'victim' of a potential scandal and blackmail, and Hyde is the 'suspect' of a possible 'crime'. Utterson intends to figure out Hyde's identity, connect his motive with his actions, and gather witness accounts. During Utterson’s quest, the outside materials available to him along with the social backgrounds of Hyde and Jekyll will be analyzed. The archives left in Jekyll’s chamber will also play a part in providing evidence. Utterson investigates based on what he has known about Jekyll his whole life, and on how Jekyll had acted in his eyes until he was gone, finding possible explanations for Jekyll's actions. The relationship between Jekyll and Hyde becomes the major question, as the social background offers clues pointing in the direction of illegitimacy and prostitution. There is still a possibility that Jekyll and Hyde were, in fact, completely different people. Utterson received a full statement and confession from Jekyll himself at the end of the story, which gives the reader the possible truth of what happened. Stevenson’s Dr. Jekyll and Mr. Hyde led readers, as it did Utterson, to find the connection between Hyde and Jekyll using methods of history, culture, and science. Utterson's quest to uncover Hyde is an example of applying these various fields in his attempt to see whether Hyde's inheritance was legal. All of this taken together could technically be considered forensic investigation.

Keywords: Dr. Jekyll and Mr. Hyde, forensic investigation, illegitimacy, prostitution, Robert Louis Stevenson.

1398 A Bibliometric Assessment on Sustainability and Clustering

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner, David Gabriel F. de Barros

Abstract:

Review studies are useful for the analysis of research problems. Among the types of review documents, we commonly find bibliometric studies. This type of study often helps the global visualization of a research problem and helps academics worldwide to better understand the context of a research area. In this document, a bibliometric view of clustering techniques and sustainability problems is presented. The authors aimed to identify which issues most often use clustering techniques and which sustainability issue is currently the most impactful in research. During the bibliometric analysis, we found 10 different groups of research in clustering applications for sustainability issues: Energy; Environmental; Non-urban Planning; Sustainable Development; Sustainable Supply Chain; Transport; Urban Planning; Water; Waste Disposal; and Others. Moreover, by analyzing the citations of each group, it was discovered that the Environmental group can be classified as the most impactful research cluster in the area mentioned. After the content analysis of each paper classified in the environmental group, it was found that the k-means technique is preferred for solving sustainability problems with clustering methods, since it appeared the most among the documents. The authors conclude that a bibliometric assessment can help indicate a research gap on waste disposal, which was the group with the least amount of publications, as well as the most impactful research on environmental problems.
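
As a small illustration of the clustering technique the review found to be most common, the sketch below applies k-means to a toy matrix of sustainability indicators. The feature values, the choice of k and the column meanings are illustrative placeholders.

    # Hedged sketch: k-means grouping of regions by made-up sustainability indicators.
    import numpy as np
    from sklearn.cluster import KMeans

    # rows: regions; columns: e.g. energy use, emissions, waste per capita
    X = np.array([
        [4.2, 7.1, 1.3],
        [4.0, 6.8, 1.1],
        [9.5, 2.2, 0.4],
        [9.9, 2.0, 0.5],
        [6.1, 4.4, 2.2],
        [5.8, 4.6, 2.0],
    ])

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print(km.labels_)           # cluster index assigned to each region
    print(km.cluster_centers_)  # centroid of each cluster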

Keywords: Bibliometric assessment, clustering, sustainability, territorial partitioning.

1397 A Comparative Study of Global Power Grids and Global Fossil Energy Pipelines Using GIS Technology

Authors: Wenhao Wang, Xinzhi Xu, Limin Feng, Wei Cong

Abstract:

This paper comprehensively investigates the current development status of global power grids and fossil energy pipelines (oil and natural gas) and proposes a standard visual platform for global power and fossil energy based on Geographic Information System (GIS) technology. In this visual platform, a series of systematic visual models is proposed with global spatial data and systematic energy and power parameters. Using this platform, the current Global Power Grids Map and Global Fossil Energy Pipelines Map are plotted, covering more than 140 countries and regions across the world. Using multi-scale fusion data processing and modeling methods, a basic database of the world's fossil energy pipelines and power grids information system is established, which provides important data support for global fossil energy and electricity research. Finally, through a systematic and comparative study of global fossil energy pipelines and global power grids, the general status of global fossil energy and electricity development is reviewed, and the energy transition in key areas is evaluated and analyzed. Through the comparative analysis of fossil energy and clean energy, the direction of relevant research toward clean development and energy transition is pointed out.

Keywords: Energy Transition, geographic information system, fossil energy, power systems.

1396 GC and GCxGC-MS Composition of Volatile Compounds from Carum carvi by Using Techniques Assisted by Microwaves

Authors: F. Benkaci-Ali, R. Mékaoui, G. Scholl, G. Eppe

Abstract:

New methods such as accelerated steam distillation assisted by microwave (ASDAM), a combination of microwave heating and steam distillation, are performed at atmospheric pressure with very short extraction times. Isolation and concentration of volatile compounds are performed in a single stage. ASDAM has been compared with ASDAM combined with cryogrinding of seeds (CG) and with conventional techniques, hydro-distillation assisted by microwave (HDAM) and hydro-distillation (HD), for the extraction of essential oil from aromatic herbs such as caraway and cumin seeds. The essential oils extracted by ASDAM for 1 min were quantitatively (yield) and qualitatively (aromatic profile) not similar to those obtained by ASDAM-CG (1 min) and HD (for 3 h). The accelerated microwave extraction with cryogrinding inhibits numerous enzymatic reactions such as hydrolysis of oils. Microwave radiation constitutes an adequate means for extraction operations from the point of view of yield and high content of the major components, and allows energy consumption to be minimized considerably, and especially the heating time, which is one of the essential parameters in artifact formation. ASDAM and ASDAM-CG are green techniques and yield an essential oil with higher amounts of the more valuable oxygenated compounds, comparable to the biosynthesis compounds, and allow substantial savings of costs in terms of time, energy and plant material.

Keywords: Microwave, steam distillation, caraway, cumin, cryogrinding, GC-MS, GCxGC-MS.

1395 Investigation of Different Stimulation Patterns to Reduce Muscle Fatigue during Functional Electrical Stimulation

Authors: R. Ruslee, H. Gollee

Abstract:

Functional electrical stimulation (FES) is a commonly used technique in rehabilitation and is often associated with rapid muscle fatigue, which becomes the limiting factor in its applications. The objective of this study is to investigate the effects on the onset of fatigue of conventional synchronous stimulation, as well as of asynchronous stimulation that mimics voluntary muscle activation by targeting different motor units, which are activated sequentially or randomly via multiple pairs of stimulation electrodes. We investigate three different approaches with various electrode configurations, as well as different patterns of stimulation applied to the gastrocnemius muscle: Conventional Synchronous Stimulation (CSS), Asynchronous Sequential Stimulation (ASS) and Asynchronous Random Stimulation (ARS). Stimulation was applied repeatedly for 300 ms followed by 700 ms of no stimulation, with a 40 Hz effective frequency for all protocols. Ten able-bodied volunteers (28±3 years old) participated in this study. As fatigue indicators, we focused on the analysis of the Normalized Fatigue Index (NFI), the Fatigue Time Interval (FTI) and the pre-post Twitch-Tetanus Ratio (ΔTTR). The results demonstrated that ASS and ARS give a higher NFI and a longer FTI, confirming less fatigue for asynchronous stimulation. In addition, ASS and ARS resulted in a higher ΔTTR than conventional CSS. In this study, we proposed a randomly distributed stimulation method for the application of FES and investigated its suitability for reducing muscle fatigue compared to previously applied methods. The results confirmed that asynchronous stimulation reduces fatigue, and indicate that random stimulation may improve fatigue resistance in some conditions.
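
The timing sketch below contrasts synchronous and asynchronous sequential pulse patterns over multiple electrode pairs within the 300 ms on / 700 ms off, 40 Hz effective-frequency cycle described above. The four-channel setup and the function itself are illustrative assumptions, not the exact stimulator configuration used in the study.

    # Hedged sketch: pulse timing for one 300 ms burst, synchronous vs. sequential.
    def pulse_times(mode, channels=4, eff_freq=40.0, burst_ms=300.0):
        """Return {channel: [pulse times in ms]} for one stimulation burst."""
        period = 1000.0 / eff_freq                  # 25 ms between effective pulses
        times = {ch: [] for ch in range(channels)}
        n_pulses = int(burst_ms / period)
        for k in range(n_pulses):
            t = k * period
            if mode == "synchronous":               # all channels fire together at 40 Hz
                for ch in range(channels):
                    times[ch].append(t)
            elif mode == "sequential":              # channels take turns, so each
                times[k % channels].append(t)       # channel fires at eff_freq / channels
        return times

    print(pulse_times("synchronous")[0])   # channel 0 fires every 25 ms
    print(pulse_times("sequential")[0])    # channel 0 fires every 100 ms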

Keywords: Asynchronous stimulation, electrode configuration, functional electrical stimulation, muscle fatigue, pattern stimulation, random stimulation, sequential stimulation, synchronous stimulation.

1394 Comparison of Current Chinese and Japanese Design Specification for Bridge Pile in Liquefied Ground

Authors: Baydaa H. Maula, Ling Zhang, Tang Liang, Gao Xia, Xu Peng-Ju, Zhang Yong-Qiang, Kang Jie, Su Lei

Abstract:

Firstly, this study briefly presents the vast gap that currently exists between the Chinese and Japanese seismic design specifications for bridge pile foundations in liquefiable and liquefaction-induced laterally spreading ground. The Chinese and Japanese seismic design methods and technical details for bridge pile foundations in liquefying and laterally spreading ground are then described and compared systematically and comprehensively; in particular, the methods of determining the coefficient of subgrade reaction and its reduction factor, as well as the way the force applied to the pile foundation by liquefaction-induced laterally spreading soil is computed in the Japanese design specification, are introduced. Subsequently, the comparison indicates that the content of the Chinese seismic design specification for bridge pile foundations in liquefiable and liquefaction-induced laterally spreading ground, which presents only some qualitative items, is too general and lacks systematic and operable provisions. Finally, some defects of the seismic design specification in China are summarized, so improvement and revision of the specification in this field turn out to be imperative for China; some key problems of the current Chinese specifications are generalized and corresponding improvement suggestions are proposed.

Keywords: liquefying soil, laterally spreading ground, seismic design specification for bridge pile foundation.

1393 A New Fast Skin Color Detection Technique

Authors: Tarek M. Mahmoud

Abstract:

Skin color can provide a useful and robust cue for human-related image analysis, such as face detection, pornographic image filtering, hand detection and tracking, people retrieval in databases and the Internet, etc. The major problem with such skin color detection algorithms is that they are time consuming and hence cannot be applied to a real-time system. To overcome this problem, we introduce a new fast technique for skin detection which can be applied in a real-time system. In this technique, instead of testing each image pixel to label it as skin or non-skin (as in classic techniques), we skip a set of pixels. The reason for the skipping process is the high probability that neighbors of skin color pixels are also skin pixels, especially in adult images, and vice versa. The proposed method can rapidly detect skin and non-skin color pixels, which in turn dramatically reduces the CPU time required for the protection process. Since many fast detection techniques are based on image resizing, we apply our proposed pixel skipping technique together with image resizing to obtain better results. The performance evaluation of the proposed skipping and hybrid techniques in terms of the measured CPU time is presented. Experimental results demonstrate that the proposed methods achieve better results than the relevant classic method.
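
The sketch below illustrates the general idea of classifying pixels as skin in YCbCr space while skipping every other pixel in each direction, in the spirit of the pixel-skipping approach described above. The Cb/Cr thresholds, the skip step and the input file name are common illustrative choices and assumptions, not the exact values or data used by the author.

    # Hedged sketch: YCbCr skin classification on a skipped pixel grid.
    import numpy as np
    import cv2

    def skin_mask_skipping(bgr_image, step=2):
        """Return a boolean mask marking skin-coloured pixels, sampled every `step` pixels."""
        ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
        cr = ycrcb[::step, ::step, 1].astype(int)
        cb = ycrcb[::step, ::step, 2].astype(int)
        sampled = (cr >= 133) & (cr <= 173) & (cb >= 77) & (cb <= 127)
        # propagate each sampled decision back to its skipped neighbours
        mask = np.repeat(np.repeat(sampled, step, axis=0), step, axis=1)
        return mask[: bgr_image.shape[0], : bgr_image.shape[1]]

    img = cv2.imread("photo.jpg")          # hypothetical input file
    if img is not None:
        print("skin pixel ratio:", skin_mask_skipping(img).mean())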

Keywords: Adult images filtering, image resizing, skin color detection, YcbCr color space.

1392 A Review on Factors Influencing Implementation of Secure Software Development Practices

Authors: Sri Lakshmi Kanniah, Mohd Naz’ri Mahrin

Abstract:

More and more businesses and services are depending on software to run their daily operations and business services. At the same time, cyber-attacks are becoming more covert and sophisticated, posing threats to software. Vulnerabilities exist in software due to the lack of security practices during the phases of software development. Implementation of secure software development practices can improve resistance to attacks. Many methods, models and standards for secure software development have been developed. However, despite these efforts, they still come up against difficulties in their deployment and the processes are not institutionalized. There is a set of factors that influence the successful deployment of secure software development processes. In this study, the methodology and results from a systematic literature review of factors influencing the implementation of secure software development practices are described. A total of 44 primary studies were analysed as a result of the systematic review. As a result of the study, a list of twenty factors has been identified. Some of the factors that affect the implementation of secure software development practices are: involvement of a security expert, integration between the security and development teams, developers’ skill and expertise, development time and communication between stakeholders. The factors were further classified into four categories: institutional context, people and action, project content and system development process. The results obtained show that it is important to take into account organizational, technical and people issues in order to implement secure software development initiatives.

Keywords: Secure software development, software development, software security, systematic literature review.

1391 3D Rendering of American Sign Language Finger-Spelling: A Comparative Study of Two Animation Techniques

Authors: Nicoletta Adamo-Villani

Abstract:

In this paper we report a study aimed at determining the most effective animation technique for representing ASL (American Sign Language) finger-spelling. Specifically, in the study we compare two commonly used 3D computer animation methods (keyframe animation and motion capture) in order to ascertain which technique produces the most 'accurate', 'readable', and 'close to actual signing' (i.e. realistic) rendering of ASL finger-spelling. To accomplish this goal we developed 20 animated clips of finger-spelled words and designed an experiment consisting of a web survey with rating questions. 71 subjects aged 19-45 participated in the study. Results showed that recognition of the words was correlated with the method used to animate the signs. In particular, the keyframe technique produced the most accurate representation of the signs (i.e., participants were more likely to identify the words correctly in keyframed sequences than in motion-captured ones). Further, findings showed that the animation method had an effect on the reported scores for readability and closeness to actual signing; the estimated marginal mean readability and closeness were greater for keyframed signs than for motion-captured signs. To our knowledge, this is the first study aimed at measuring and comparing the accuracy, readability and realism of ASL animations produced with different techniques.

Keywords: 3D Animation, American Sign Language, Deaf Education, Motion Capture.
