|A quiet moment in the Hack Shack. Where is everybody?|
Sunday, December 10, 2017
By Kagan C
Recently, I have heard talk of the decline of the Oyster River Hack Shack. Apparently, in prior years, the Hack Shack was a bustling makerspace, in constant use, with tens of visitors coming in daily. Their enthusiasm for the space was so great that they frequently had to be quieted down. This is in stark contrast to what the Hack Shack is now: a barren, desolate place with few visitors. Though my descriptions are largely exaggerated and my experience with the Hack Shack is limited, I still wonder: what changed?
Between the previous school year and this one, there was a sharp drop in attendance at the Hack Shack. What caused this? It is a question I have found myself thinking about quite often, and there are a variety of possible explanations. Perhaps the current students in the class aren't putting in the same level of work as students of the past did, maybe the students who visited so often last year have graduated, or maybe makerspaces no longer hold the same appeal they once did with students. It is possible that as the types of technology seen in the Hack Shack progressed, they lost the mystique that once drew students in. The decline could easily be attributed to any one of these reasons, and likely to a combination of them. However, since my time here has been so short, I do not have the experience to determine responsibility with any sort of authority.
There is another reason, however, that I do think I have the authority to speak on: the high barrier to entry. I have spoken with numerous students about the Hack Shack, and while a majority express interest in visiting, a common issue is that many have trouble finding the time.
Some of the most exciting tools in the Hack Shack require large time commitments before students can begin seeing results. A perfect example of this is the 3D printer. It is Hack Shack policy that students can only print models that they themselves have designed; therefore, if a student wishes to 3D print something, they have to learn how to use a 3D design tool along with the 3D printer itself. These design tools range in difficulty. While the basics of some, such as Google SketchUp, can be learned in a little under an hour, the more powerful and comprehensive tools, such as Blender, can be studied for weeks while barely scratching the surface. This level of commitment can be incredibly intimidating to students who may only have a passing interest in the Hack Shack's facilities, and I have seen this large investment turn students away. I find this disheartening, as many of the great benefits of this place are being obscured, but I find it hard to blame the students. If it were not for my enrollment in this class, it is unlikely that I would visit the Hack Shack often or take the time to learn about what it has to offer. With so much going on in a student's typical day, it can't be a huge surprise that they don't want to spend their short period of free time mentally exerting themselves.
This is not a problem I can see being solved overnight, and furthermore, it is not one that I see going away anytime soon. If nothing within the Hack Shack changes, I would not be surprised to see this downward trend continue. As things change, the Hack Shack must adapt to change with them, and I hope that if it does so, it will begin to see success in the near future.
Editor's note: While we have anecdotally seen a difference between this semester and last, we do not have the attendance data to support the idea of a big decline.
Tuesday, November 7, 2017
By Stephen Heirtzler
Credit to Pinmaxvr.com
Since the initial release of the Oculus Rift DK1 in March of 2013, virtual reality has experienced something of a revolution. With the ever-improving capabilities of modern PCs, companies can finally explore its true potential. However, new VR headsets have been slow to release, and the high-end headset market is largely controlled by just three companies: HTC, Facebook, and Sony.
This may change soon, though. As of this month, a new player seems ready to enter the headset-making game: Pimax Technology Co.
Pimax had previously released a 4K-resolution headset to relatively minimal fanfare, but their new headset seems to go above and beyond both their own standards and the industry's. The headset is called the Pimax 8K, and it is a game changer.
To understand why, we have to look at the two major limitations of virtual reality right now: the first is field of view, and the second is resolution.
|Credit to Oculus Rift|
A headset's field of view is dictated by the size of its lenses and its display. The current generation of headsets has a field of view of about 110 degrees. Since the field of view of the human eye is about 210 degrees, the experience of wearing a current-generation headset is akin to wearing a ski mask: large portions of your peripheral vision are obscured. This can have a negative impact on your immersion; you don't feel quite as much like you're "in the world" because of your limited field of view.
Arguably the most critical aspect of a VR headset's immersiveness is its resolution. If a headset has fewer pixels to work with, the image will look noticeably jagged and blurry, dampening the illusion of looking into another world. Current-generation headsets have a resolution of 1080 by 1200 pixels per eye (about 2.6 million pixels in total). This means that small text and subtle details in a virtual environment are unreadable without getting closer. These headsets also suffer from something called "the screen door effect," where, because of the separations between pixels, the image appears to be viewed through the mesh of a screen door.
|Credit to YouTube|
What makes the Pimax 8K such a game changer is that it sports both a 200-degree field of view (almost equal to that of the human eye) and a staggering resolution of 3840 by 2160 pixels per eye (about 16.6 million pixels in total). This means that there is no screen door effect and that the virtual world fills your visual field.
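The pixel counts quoted above check out with quick arithmetic, and a rough pixels-per-degree estimate (a simplification, since lens distortion means the density is not uniform across the view) shows why the jump matters:

```python
# Per-eye and total pixel counts for each headset generation
current_per_eye = 1080 * 1200   # current-gen panels
pimax_per_eye = 3840 * 2160     # Pimax 8K panels

print(current_per_eye * 2)      # both eyes: 2592000 (~2.6 million)
print(pimax_per_eye * 2)        # both eyes: 16588800 (~16.6 million)

# Rough angular density: horizontal pixels spread over the horizontal
# field of view (a back-of-the-envelope figure, not an official spec)
print(round(1080 / 110, 1))     # current gen: ~9.8 px per degree
print(round(3840 / 200, 1))     # Pimax 8K:   ~19.2 px per degree
```

So even with its much wider field of view, the Pimax 8K packs roughly twice as many pixels into each degree of vision, which is why the screen door effect disappears.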
The Pimax 8K is set to ship in January of next year, and when it does, it will provide the most immersive virtual reality experience to date.
Sunday, November 5, 2017
Have you ever wanted to learn how to program, but it seems like there is way too much to learn? There are websites that try to make learning to code easier, such as code.org, but they are quite simple and aimed at kids of middle school age or younger. What if I told you that you could learn how to write a program in under five minutes? This is easily accomplished by using a simpler programming language, and the one I am going to show you is called Python. Python is known for its simplicity, though that simplicity trades away some of the low-level power of more complicated languages like C++ and Java.
For your first program, I'm going to show you how to make a program that asks for the user's name and then says "Hi" to the user using their name.
Open Python and you should come to a screen known as the shell; this is where your program will run. Click File, then New File, to open up a new program.
A blank editor window should pop up. This is where you will be writing the actual code.
Use the hash symbol (#) to write comments. Comments are not read by the computer; they exist only for the people who read the program. Comments are important because they let others who read your program know what you were trying to do with your code.
Create a variable that stores the user's name. In this case the variable is called "name," and the words in the quotes are the prompt the user sees.
Create a print statement that greets the user by name, using the variable created previously.
Create an input statement that pauses the program at the end. Without this, the program ends the moment the greeting is printed. After you have done this, press the Run button and run the program!
Once you have run the program it will ask for your name. Type your name and then press enter.
Then the print statement will pop up on the screen, and it will prompt you to press enter to exit. Congratulations, you have just finished your first program!
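Putting the steps above together, the whole program is only a few lines. Here is one way it might look; the exact prompt wording is my own guess, and the greeting is pulled into a small function (with the interactive lines shown as comments) so this sketch runs without anyone at the keyboard:

```python
# Comments start with a hash (#) and are ignored by Python;
# they exist so other people can understand your code.

def make_greeting(name):
    # Build the "Hi" message from the stored name
    return "Hi " + name

# The interactive steps from the tutorial would look like this
# (shown as comments so the sketch runs on its own):
#
#   name = input("What is your name? ")   # store the user's name
#   print(make_greeting(name))            # greet the user by name
#   input("Press enter to exit")          # pause before exiting

print(make_greeting("Ada"))  # demonstration with a hard-coded name
```

Running the real interactive version in the shell asks for your name, prints the greeting, and waits for enter before exiting, exactly as described above.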
Sunday, October 22, 2017
Hack Shack Supervisor
The Hack Shack and all the Little Bits that have improved over my time with it
Throughout my time as a student at Oyster River High School, the Hack Shack has changed a lot, from its creation in my sophomore year to the bustling and overall more well-put-together setting it is now, during my senior year. At the beginning of its existence it had no reputation, nor did anyone know about it. From its creation to now, it has grown a lot, and so has the class that manages the space.
When I came to the high school as a freshman, there was no Hack Shack, so I won't talk about that year too much, but I still feel it's very important to the overall story. Back when there was no Hack Shack, students interested in technology had very few options other than the two programming classes and the robotics team. Teachers with a tech-related problem either had to send a work order to IT or seek out Mrs. Pearce or Mrs. Stetson, our librarian and computer science teacher respectively; this was a problem the Hack Shack greatly improved.
At the beginning of my sophomore year, Mrs. Carr and Mrs. Pearce opened the Hack Shack as our high school's very own makerspace. Word of the new makerspace spread fast through the school newspaper, Mouth of the River, and other websites and publications such as NHSTE. During my sophomore year, not many people completely understood what the Hack Shack was for, including the staff that ran it. At this point we depended a lot on the librarians to send students who were having tech problems to the Hack Shack. Throughout the year the staff got into much more of a flow and were able to help those who came in more successfully.
When it came time for my junior year, I was no longer part of the staff, but I still came in often to check up on what was going on. Some of the most incredible 3D prints were made during that year.
It was also during this year that the Hack Shack began to be used for class projects more often than it had in the past, and it began to build a real reputation for itself and for why it was there.
Now we have entered the third year of the Hack Shack and my senior year. This year, rather than being a normal staff member, I supervise the staff, helping them with their workshops and with how to go about working in the Hack Shack. My duties also include relaying information from the staff to Mrs. Pearce and Mrs. Stetson and working on a special project for the semester. At the beginning of this year, Mrs. Stetson bought a new computerized cutter called the Cricut, which has been a huge upgrade over the former one, the Silhouette cutter. Because of its simplicity and more easily understandable UI, more people have been able to create designs and cut them than before. The format of the work done by the regular staff has also greatly changed this year, increasing the transparency between the staff, me, Mrs. Pearce, and Mrs. Stetson.
All in all, I think the Oyster River makerspace, the Hack Shack, has made a lot of progress and will continue to improve and grow to become even more embedded in the school culture. I hope it will continue to become someplace that everyone goes to use the technology, learn new things, or just enjoy themselves in a supportive place.
Wednesday, June 7, 2017
By Micah Kelly
Have you ever had a presentation that needed a little something to push it over the edge? Have you ever wanted to put more of your creative side into a tangible, easy to understand format? Well, in this tutorial, I’ll show you how to create simple, polygonal motion graphics to convey information to an audience.
Blender is one of the most flexible open source programs out there, and is readily available for download here: https://www.blender.org/
An example of the final product can be found here:
First, open Blender
Press the A hotkey to select everything, then press the X hotkey to delete everything in the default scene
Press NUM7 to go to the top view, then NUM5 to make sure that you’re in orthographic view, meaning that there’s no perspective.
Press the SHIFT+A command, then create a new camera. The camera will appear at the cursor, so first make sure that the cursor is centered at the scene origin by using the SHIFT+C hotkey.
Snap to the side view by pressing the NUM3 hotkey, then move the camera along the Z axis by pressing the G hotkey to move the camera, then press the Z hotkey whilst moving the camera to snap it to the Z axis. It won’t matter how far the camera is from the origin, since it will be orthographic.
Let's change the camera to orthographic. To do this, go to the camera options tab, then click the orthographic option in the LENS dropdown menu. The camera is now void of perspective.
Objects and Materials
To create a new 2D object, press the SHIFT+A command, then create a new plane.
Go into edit mode, then select 3 of the vertices by holding SHIFT and right-clicking. Press the X hotkey, then delete VERTICES.
Now we have a single vertex. To create a shape, select the vertex, then begin extruding it into the outline of your shape. To extrude, press the E hotkey while the vertex is selected, move your mouse, then left-click to complete the action.
To close the shape, SHIFT select the start and end vertices, then use the F hotkey. A new edge has now been created. I’ve made a potato.
To make the outline into a solid face, use the A hotkey to select all vertices, then press the F hotkey to create a face. After the new face has been created, press CTRL+T to triangulate it into a form that Blender can handle more easily.
Sometimes the mesh's normals are flipped, which makes the lighting look abnormal: the affected faces look darker than usual. To fix this, select the affected faces, press SPACE, then search for FLIP NORMALS. Click the action in the search results, and the problem should be solved.
To add a material, first switch the viewtype to material.
Go to the materials tab, then in the shading menu, click SHADELESS.
Go into edit mode for your desired model, select all the vertices, then, after choosing a color, click ASSIGN. The model now has the material applied.
Creating text in Blender is very simple. First use the SHIFT+A hotkey, then create TEXT.
Going into edit mode allows you to edit the text by typing.
The text can then be textured using the same method as the objects before.
The animation aspect is also very simple.
First press NUM0 to snap to the camera view; we'll animate from here.
Select the desired object in object mode, then move the bar on the timeline to where the movement will start.
Create a keyframe by pressing the I hotkey, then click the LocRotScale option.
Move the bar on the timeline to where the movement will end, then make another keyframe. The object should move between the keyframes in that space of time.
Keyframes can be easily edited on the DOPE SHEET. Keyframes can be scaled around the timeline bar with S, or moved with G.
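Between two keyframes, Blender fills in every intermediate frame by interpolating the keyframed values (its default easing is actually a smooth curve rather than a straight line). The idea can be sketched in plain Python with simple linear interpolation; the frame numbers and positions below are made up for illustration:

```python
def lerp(start, end, t):
    # Linear interpolation: t=0 gives start, t=1 gives end,
    # and values in between blend the two proportionally.
    return start + (end - start) * t

# Suppose an object's X location is keyframed at 0.0 on frame 1
# and at 10.0 on frame 25; every frame in between is computed.
start_frame, end_frame = 1, 25
start_x, end_x = 0.0, 10.0

for frame in (1, 13, 25):
    t = (frame - start_frame) / (end_frame - start_frame)
    print(frame, lerp(start_x, end_x, t))  # 0.0 at 1, 5.0 at 13, 10.0 at 25
```

Scaling keyframes on the dope sheet with S effectively stretches or compresses this frame range, which changes how fast the in-between values are traversed.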
You have now created a simple 2D animation in Blender that can be used for presentations, instructional videos, cartoons, or anything else your heart desires!
Tuesday, June 6, 2017
by Hanwen Liang
In 10 years, Google Translate has gone from supporting just a few languages to 103, connecting strangers, reaching across language barriers and even helping people find love.
Google Translate is a free translation service that provides instant translations between 103 different languages. It can translate words, sentences, and web pages between any combination of its supported languages. It was launched on April 28, 2006.
Now, with Google Translate, you can speak, snap, write, or type the words or sentences you want translated in order to talk to someone who speaks a different language. It can also operate inside other websites and applications, and it can even run offline.
Google Translate can also translate idioms, sayings, and book or movie titles by their more common meanings, rather than word-for-word as it did a few years ago. It can also translate sentences without major grammatical issues.
As artificial intelligence advances, translation software evolves with it.
Modeled after the way neurons connect in the human brain, deep neural networks are the same breed of AI technology that identifies commands spoken into Android phones and recognizes people in photos posted to Facebook, and the promise is that it will reinvent machine translation in much the same way. Google says that with certain languages, its new system—dubbed Google Neural Machine Translation, or GNMT—reduces errors by 60 percent.
Companies like Google are racing towards the same future, working not just to improve machine translation but to build AI systems that can understand and respond to natural human language. As Google's new Allo messaging app shows, these "chat bots" are still flawed. But neural networks are rapidly changing what's possible. "None of this is solved," says Google researcher Mike Schuster. "But there is a constant upward tick." Or, as Google says the Chinese would say: "Yǒu yīgè bùduàn xiàngshàng gōu."
Also: A Music Video Made with Google Translate
Google Translate. "Translate." About – Google Translate. Google, n.d. Web. 17 Feb. 2017.
"Google Translate." Wikipedia. Wikimedia Foundation, 14 Feb. 2017. Web. 17 Feb. 2017.
Google. "Translate." www.blog.google. Google, n.d. Web. 17 Feb. 2017.
Metz, Cade. "An Infusion of AI Makes Google Translate More Powerful Than Ever." Wired. Condé Nast, 27 Sept. 2016. Web. 17 Feb. 2017.
Turovsky, Barak. "Found in Translation: More Accurate, Fluent Sentences in Google Translate." Google. Google, 16 Nov. 2016. Web. 17 Feb. 2017.